The AMD FreeSync Review
by Jarred Walton on March 19, 2015 12:00 PM EST

Introduction to FreeSync and Adaptive Sync
The first time anyone talked about adaptive refresh rates for monitors – specifically applying the technique to gaming – was when NVIDIA demoed G-SYNC back in October 2013. The idea seemed so logical that I had to wonder why no one had tried to do it before. Certainly there are hurdles to overcome, e.g. what to do when the frame rate is too low or too high, getting a panel that can handle adaptive refresh rates, and supporting the feature in the graphics drivers. Still, it was an idea that made a lot of sense.
The impetus behind adaptive refresh is to overcome the visual artifacts and stutter caused by the normal way of updating the screen. Briefly, the display is updated with new content from the graphics card at set intervals, typically 60 times per second. While that’s fine for normal applications, in games there are often cases where a new frame isn’t ready in time, causing a stall or stutter in rendering. Alternatively, the screen can be updated as soon as a new frame is ready, but that often results in tearing – where the top portion of the screen shows the previous frame while the bottom portion shows the next frame (or frames, in some cases).
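To put rough numbers to that, here’s a minimal sketch (in Python, not part of the original article) that models when each frame actually appears on screen under a fixed 60Hz refresh versus an adaptive refresh. The frame times are made up, and the model deliberately ignores details like v-sync back-pressure on the GPU and multiple buffering; it only illustrates how a slow frame slips past a fixed refresh boundary and shows up as a visible stutter.

```python
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes at 60 Hz

def present_fixed(frame_times_ms):
    """Fixed refresh: a finished frame only becomes visible at the next refresh boundary."""
    clock, on_screen = 0.0, []
    for ft in frame_times_ms:
        clock += ft  # time at which the GPU finishes rendering this frame
        on_screen.append(math.ceil(clock / INTERVAL_MS) * INTERVAL_MS)
    return on_screen

def present_adaptive(frame_times_ms):
    """Adaptive refresh: the display updates the moment a frame is ready."""
    clock, on_screen = 0.0, []
    for ft in frame_times_ms:
        clock += ft
        on_screen.append(clock)
    return on_screen

frames = [14.0, 18.0, 16.0, 25.0, 15.0]  # hypothetical frame render times in ms
print([round(t, 1) for t in present_fixed(frames)])     # the 25 ms frame waits for the next 16.7 ms tick
print([round(t, 1) for t in present_adaptive(frames)])  # every frame shows up as soon as it's done
```

In the fixed-refresh case the gap between the third and fourth frames stretches to two full refresh intervals (a visible stutter), while in the adaptive case the fourth frame simply appears 25 ms after the previous one.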
Neither input lag/stutter nor image tearing is desirable, so NVIDIA set about creating a solution: G-SYNC. Perhaps the most difficult aspect for NVIDIA wasn’t creating the core technology but rather getting display partners to create and sell what would ultimately be a niche product – G-SYNC requires an NVIDIA GPU, so that rules out a large chunk of the market. Not surprisingly, the result was that G-SYNC took a bit of time to reach the market as a mature solution, with the first displays that supported the feature requiring modification by the end user.
Over the past year we’ve seen more G-SYNC displays ship that no longer require user modification, which is great, but pricing of the displays so far has been quite high. At present the least expensive G-SYNC displays are 1080p144 models that start at $450; similar displays without G-SYNC cost about $200 less. Higher spec displays like the 1440p144 ASUS ROG Swift cost $759 compared to other WQHD displays (albeit not 120/144Hz capable) that start at less than $400. And finally, 4Kp60 displays without G-SYNC cost $400-$500 whereas the 4Kp60 Acer XB280HK will set you back $750.
When AMD demonstrated their alternative adaptive refresh rate technology and cleverly called it FreeSync, it was a clear jab at the added cost of G-SYNC displays. As with G-SYNC, it has taken some time from the initial announcement to actual shipping hardware, but AMD has worked with VESA to standardize the underlying technology as DisplayPort Adaptive-Sync, an optional part of the DisplayPort 1.2a specification, and AMD isn’t collecting any royalties on it. That’s the “Free” part of FreeSync, and while it doesn’t necessarily guarantee that FreeSync-enabled displays will cost the same as non-FreeSync displays, the initial pricing looks quite promising.
There may be some additional costs associated with making a FreeSync display, though these mostly come from using higher quality components. The major scaler companies – Realtek, Novatek, and MStar – have all built support for FreeSync (DisplayPort Adaptive-Sync) into their latest products, and since most displays require a scaler anyway there’s no significant price increase. But if you compare a FreeSync 1440p144 display to a “normal” 1440p60 display of similar quality, the support for higher refresh rates inherently increases the price. So let’s look at what’s officially announced right now before we continue.
350 Comments
P39Airacobra - Monday, March 23, 2015 - link
Why will it not work with the R9 270? That is BS! To hell with you AMD! I paid good money for my R9 series card! And it was supposed to be current GCN not GCN 1.0! Not only do you have to deal with crap drivers that cause artifacts! Now AMD is pulling off marketing BS!
Morawka - Tuesday, March 24, 2015 - link
Anandtech, have you seen the PCPerspective article on Gsync vs Freesync? PCper was seeing ghosting with freesync. Can you guys corroborate their findings?
shadowjk - Tuesday, March 24, 2015 - link
Am I the only one who would want a 24"-ish 1080p IPS screen with gsync or freesync?
xenol - Tuesday, March 24, 2015 - link
FreeSync and GSync shouldn't have ever happened. The problem I have is that "syncing" is a relic of the past. The only reason you needed to sync with a monitor is because they were CRTs that could only trace the screen line by line. It just kept things simpler (or maybe more practical) if you weren't trying to fudge with that timing on the fly.
Now, you can address each individual pixel. There's no need to "trace" each line. DVI should've eliminated this problem because it was meant for LCD's. But no, in order to retain backwards compatibility, DVI's data stream behaves exactly like VGA's. DisplayPort finally did away with this by packetizing the data, which I hope means that display controllers only change what they need to change, not "refresh" the screen. But given they still are backwards compatible with DVI, I doubt that's the case.
Get rid of the concept of refresh rates and syncing altogether. Stop making digital displays behave like CRTs.
Mrwright - Wednesday, March 25, 2015 - link
Why do I need either Freesync or Gsync when I already get over 100fps in all games at 2560x1440? All I want is a 144Hz 2560x1440 monitor without the Gsync tax, as gsync and freesync are only useful if you drop below 60fps.
ggg000 - Thursday, March 26, 2015 - link
Freesync is a joke:
https://www.youtube.com/watch?feature=player_embed...
https://www.youtube.com/watch?v=VJ-Pc0iQgfk&fe...
https://www.youtube.com/watch?v=1jqimZLUk-c&fe...
https://www.youtube.com/watch?feature=player_embed...
https://www.youtube.com/watch?v=84G9MD4ra8M&fe...
https://www.youtube.com/watch?v=aTJ_6MFOEm4&fe...
https://www.youtube.com/watch?v=HZtUttA5Q_w&fe...
ghosting like hell.
willis936 - Tuesday, August 25, 2015 - link
LCD is a memory array: if you don't use it, you lose it. Each pixel needs to be physically refreshed the same number of times a second. You could save on average bitrate by only sending changed pixels, but that requires more work on the GPU and adds latency. What's more, it doesn't change what your max bitrate needs to be, and don't even begin suggesting multiple frame buffers, as that adds TV-tier latency.
chizow - Monday, March 30, 2015 - link
And more evidence of FreeSync's (and AnandTech's) shortcomings, again from PCPer. I remember a time when AnandTech was willing to put in the work with the kind of creativity needed to come to such conclusions, but I guess this is what happens when the boss retires and takes a gig with Apple.
http://www.pcper.com/reviews/Graphics-Cards/Dissec...
PCPer is certainly the go-to now for any enthusiast that wants answers beyond the superficial spoon-fed vendor stories.
ZmOnEy132 - Saturday, December 17, 2016 - link
FreeSync is not meant to increase fps; the whole point is visuals. It stops visual tearing, which is why frame rates drop to match the monitor. Fps has no effect on what FreeSync is meant to do. It's all about visuals, not performance. I hate when people write reviews and don't know what they're talking about. You're going to see dropped frame rates because when a frame isn't ready yet, the GPU doesn't give it to the display; it holds onto it a tiny bit longer to make sure the monitor and GPU are both ready for that frame.