AMD Shows Off Multiple FreeSync Displays
by Jarred Walton on January 8, 2015 12:43 AM EST
We met with AMD and, among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were all running on different GPUs: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.
More important than the displays and the hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test demo with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55 Hz in 5Hz increments, or set it to vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display meanwhile was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]
Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready, delivering on all of AMD's feature goals, and it should be available in the next few months. Meanwhile, AMD also took a moment to briefly address the issue of minimum refresh rates and pixel decay over time, stating that the minimum refresh rate will vary on a per-monitor basis, depending on how quickly each panel's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms), while others with pixels that decay more quickly will have a 40Hz (25ms) minimum.
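Those millisecond figures are just the reciprocal of the minimum refresh rate; a quick sketch of the conversion (the helper name here is ours, purely illustrative):

```python
def max_frame_hold_ms(min_refresh_hz: float) -> float:
    """Longest time (in ms) a panel can hold a frame before its pixels
    decay too far, given the panel's minimum supported refresh rate."""
    return 1000.0 / min_refresh_hz

# The two minimums mentioned above:
print(round(max_frame_hold_ms(30), 1))  # 33.3 ms
print(max_frame_hold_ms(40))            # 25.0 ms
```

So a panel with faster-decaying pixels (40Hz minimum) forces the GPU to deliver a new frame at least every 25ms, even if the game hasn't finished rendering one.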
On the retail front, what remains to be seen is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a display that works with FreeSync. There’s a need for better panels and other components, which will obviously increase the BoM (Bill of Materials), a cost that will be passed on to consumers.
Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.
FlushedBubblyJock - Tuesday, February 24, 2015
blind and ignorant AMD fan bloviates again...
" Not to mention G-sync locks you into nvidia GPUs only while FreeSync is based off an industry standard which any company is free to develop. "
The only other company is AMD, so AMD is locking you in with freesync, SO YOU AMD FANS NEED TO STOP THIS BULL.
Thanks for not THINKING AT ALL.
ppi - Thursday, January 8, 2015
And what do you think G-Sync is doing when the computer draws frames faster than the max refresh rate? It just waits for the next sync, effectively working as FreeSync with V-Sync on. Unlike with nVidia, here you can choose, so that's a plus point.
The difference is, however, in low frame rates, where it looks like AMD gives a choice between tearing and V-Sync frame drops (it would be ideal to choose V-Sync independently for top/bottom frame rates), while nVidia lets pixels dim down (effectively letting them go below the designed minimum refresh rate for the panel) and thus you get blinking. None of those three options is ideal and I am not certain which one is optimal.
The jury is still out until independent reviewers get a chance to review both panels side by side. Given the breadth of FreeSync offerings shown at CES and the manufacturers backing it, it is certain to get some real traction. And then nVidia could enable it on a future generation of cards (while keeping G-Sync as well, of course).
chizow - Friday, January 9, 2015
No, G-Sync can't draw frames faster than max refresh, because it uses a driver soft cap, which makes it curious why AMD didn't do this by default. But even at the max refresh of 120 or 144Hz, the GPU is still the one calling the shots, telling the G-Sync module to draw a new frame only when the monitor's refresh is ready. If the monitor isn't ready, the G-Sync module with its onboard DRAM acts as a lookaside buffer that allows the monitor to simply hold and continue to display the last frame until the monitor's refresh is ready, at which point it displays the next live frame (not an old frame as with V-Sync) sent from the GPU to the G-Sync module. The end result is just a perceived reduction in FPS rather than an input-laggy/juddery one, as you would see with V-Sync.
There are certainly a lot of questions for AMD, because 40Hz is still a very questionable minimum, especially on a 4K display; I am honestly not sure what kind of AMD set-up you would need (3x290X?) to ensure you always meet that minimum. Nvidia's minimum, meanwhile, only really becomes an issue at ~20Hz, and realistically it only manifests itself in the menus of certain games that cap their menu frame rates because running uncapped tended to overheat hardware (SC2, GTA4, etc.).
But yes, independent reviews I am sure will give us more answers, but realistically, there's no reason to expect anything other than vendor lock-in for a tiny niche of specialty monitors that cost more for this premium feature. Any AMD fan who doesn't think they are going to pay a significant price premium (maybe not as much as G-sync, but definitely not free!) is deluding themselves.
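The hold-and-redisplay behavior described a couple paragraphs up amounts to a simple per-refresh rule; here is a rough sketch in Python (the names are illustrative, not NVIDIA's):

```python
def scanout(panel_ready: bool, new_frame, buffer: dict):
    """One refresh decision: if the panel is ready and the GPU has
    finished a new frame, latch it into the lookaside buffer;
    either way, the panel displays whatever frame is currently held."""
    if panel_ready and new_frame is not None:
        buffer["held"] = new_frame  # latest live frame replaces the held one
    return buffer["held"]           # otherwise the held frame is re-displayed
```

Unlike V-Sync, which can queue up stale frames, the buffer only ever holds the most recent completed frame, which is why the result reads as lower FPS rather than added input lag.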
Stonedofmoo - Thursday, January 8, 2015
More to the point, when are laptop users going to see FreeSync or G-Sync on laptop screens?
If ever there was a need for these technologies, it's the laptop market, where even gaming laptops only feature 60Hz screens and the most powerful mobile GPU is only about 75% as powerful as the most powerful desktop cards, which greatly increases the chances of dipping below 60fps...
chizow - Thursday, January 8, 2015
Do you think laptop users are willing to pay the premium for something that won't be portable between GPU/hardware upgrades? At least on the desktop side of things, I think many will be more willing to invest $600+ in one of these panels if they know it will survive at least 1-2 upgrade cycles.
But maybe this is a question for AMD at some point, since they were making some pretty bold claims this time last year at CES about FreeSync. Since they claimed this was always a feature of laptops using the eDP specs along with the comments they made about being free and needing only a simple firmware update, maybe they can get FreeSync working on these laptops, for "Free"?
Stonedofmoo - Thursday, January 8, 2015
Well yes, have you seen the price of gaming laptops, and also the news that they are selling in far greater numbers than ever before?
The users of gaming laptops are often desktop gamers themselves, and speaking as one of them, seeing laptop gaming reach the standard of desktop gaming is exactly what we want to see, and would be willing to pay for. FreeSync/G-Sync is something I personally see as a must for laptop gaming to be fully embraced.
The early demonstrations of FreeSync were actually done with regular laptops that didn't require additional hardware. From what I understand, the technology is there; presumably no one has yet embraced the market and provided an option. I'm hoping royalty-free FreeSync spurs the laptop manufacturers on.
Yojimbo - Thursday, January 8, 2015
Does not require additional hardware? Are you sure? It requires no additional hardware beyond the DisplayPort Adaptive-Sync standard, but to support this standard the monitors seem to need additional hardware; otherwise, why would AMD be showing off these new FreeSync monitors?
JarredWalton - Thursday, January 8, 2015
Exactly. There's no extra licensing fee for FreeSync, but the hardware required to run it will certainly be more expensive than the hardware required for a standard static refresh rate display. Besides the scaler, you'll need a panel that can actually handle dynamic refresh rates, and in most cases the panel will also need to be able to handle higher than normal refresh rates (e.g. 144Hz instead of only 60Hz). We need to see final pricing on shipping FreeSync displays and then compare that with G-SYNC equivalents to say how much NVIDIA is actually charging.
And for all the hate on G-SYNC, remember this: FreeSync wouldn't even exist if NVIDIA hadn't made G-SYNC. They first demonstrated G-SYNC in 2013, and hardware has been shipping for most of last year (though it wasn't until the second half of the year that it was readily available). Considering the R&D cost, hardware, and need to get display vendors to buy in to G-SYNC it's pretty amazing how fast it was released.
Will Robinson - Sunday, January 11, 2015
I keep re-reading this thread, Jarred, but the only hate I see is for FreeSync by hardcore NV fans.
Or is that what you meant to type?
chizow - Sunday, January 11, 2015
Where is there hate for FreeSync? If it does what G-Sync does, that's great! AMD fans may finally get to enjoy the tech they've been downplaying for nearly a year!
I have a disdain for FUD and misinformation, that's it, and as we have seen, AMD has a strong penchant for making unsubstantiated claims not only about their unreleased/undeveloped solutions, but about their competitor's solutions as well.
Do you appreciate being lied to? Just wondering, and trying to understand how some of AMD's most devout and loyal fans don't seem to feel misled or lied to at all. :)