We met with AMD and, among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were all running on different GPUs: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test demo with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55 Hz in 5Hz increments, or set it to vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display, meanwhile, was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]
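The demo's controls are simple enough to sketch in a few lines. This is purely illustrative, not AMD's actual demo code: the function names and the 1 Hz sweep step are assumptions; only the 40–55 Hz range and 5 Hz fixed increments come from the demo itself.

```python
import itertools

# Fixed targets the demo exposed: 40 to 55 Hz in 5 Hz increments
FIXED_TARGETS = list(range(40, 56, 5))  # 40, 45, 50, 55

def sweep_targets(min_hz=40, max_hz=55, step=1):
    """Endlessly ramp the target render rate up and back down,
    mimicking the demo's 'sweep' mode between min_hz and max_hz."""
    up = list(range(min_hz, max_hz + 1, step))
    down = list(range(max_hz - step, min_hz, -step))
    return itertools.cycle(up + down)
```

With FreeSync enabled, the display's refresh rate simply tracks whichever target the demo is currently rendering at, which is what made the fluctuating OSD readout on the Samsung display possible.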

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months. Meanwhile AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate will vary on a per-monitor basis, depending on how quickly that panel's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms), while others, with pixels that decay more quickly, will have a 40Hz (25ms) minimum.
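The refresh-rate figures AMD quoted map directly to frame times (frame time in ms = 1000 / refresh rate in Hz). A minimal sketch of that conversion, plus clamping into a panel's supported variable-refresh window; the function names and the example 144 Hz maximum are illustrative assumptions, while the 30 Hz and 40 Hz minimums are the ones AMD described:

```python
def frame_time_ms(refresh_hz):
    """Frame interval in milliseconds for a given refresh rate in Hz."""
    return 1000.0 / refresh_hz

def clamp_refresh(fps, min_hz, max_hz):
    """Clamp a game's frame rate into the panel's variable-refresh window.
    Outside that window the display must repeat frames or fall back to a
    fixed refresh rate, since the pixels can't hold their state longer."""
    return max(min_hz, min(max_hz, fps))

# The per-monitor minimums AMD described:
slow_decay_panel = frame_time_ms(30)  # ~33.3 ms between refreshes
fast_decay_panel = frame_time_ms(40)  # 25 ms between refreshes
```

The longer a panel's pixels can hold their charge without visibly decaying, the longer the display can wait for the next frame, which is why slower-decaying panels can support the lower 30Hz minimum.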

On the retail front, what remains to be seen now is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved with creating a display that works with FreeSync. There’s a need for better panels and other components, which will obviously increase the BoM (Bill of Materials), a cost that will be passed on to consumers.

Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.

  • chizow - Thursday, January 8, 2015 - link

    Yeah, it's unfortunate. This is probably based on the early misinformation AMD provided about FreeSync, but they've since changed their tune, since FreeSync clearly requires a premium custom scaler in order to work.
  • chizow - Thursday, January 8, 2015 - link

    Interesting, I won't disagree here because I have heard similar sentiments in other gaming notebook discussions, and there is clearly a huge premium attached to slower performing high-end notebook parts. I will caution however, that many notebooks initially implemented 3D Vision capability and 120+Hz ability and it did carry a hefty premium. 3D and 3D Vision interest has since subsided unfortunately, so I am not sure if these panels are still including it or not as a feature. Just something to consider.
  • Creig - Thursday, January 8, 2015 - link

    AMD showed FreeSync working on laptops at last year's CES as their initial FreeSync demonstration. Adaptive-Sync would be a great addition to any laptop, as it includes the ability to improve graphics quality at lower frame rates. As laptops generally have less powerful graphics subsystems, they would benefit from this technology. In addition, Adaptive-Sync has power saving abilities as well. All of which would be to the benefit of laptop users.
  • andrewaggb - Thursday, January 8, 2015 - link

    Agreed. I think it makes a lot of sense for laptops and for desktops with 4K screens or cheap graphics cards, as they will likely have lower framerates and benefit the most.
  • dcoca - Thursday, January 8, 2015 - link

    Hey dude, could you provide references to this said "free" upgrade you're talking about... like a link to an AMD media article, or a URL from an official statement. My take and comment was that if the existing laptops already had the needed DisplayPort 1.2a standard, then it would be up to the manufacturer to implement some type of update if they wish, "wish" being the operative word. Now with that said, funny thing about the universe: it's relative to what the subject wants to interpret... so keep on saying what your heart wants...
  • chizow - Sunday, January 11, 2015 - link

    I've already linked it in other comments, and it was obviously these comments from Koduri that gave rise to the misinformation many AMD fans clung to for months until AMD reversed course and backed off these statements.
  • codylee - Thursday, January 8, 2015 - link

    It's so nice to see a company put the time into testing products to make sure they are ready for release!
  • TheJian - Thursday, January 8, 2015 - link

    "Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months."

    Umm, so this was shown running tons of games and they all worked fine then? If not, you shouldn't make claims like this. The whole point of this tech is to MAKE GAMES BETTER, not test windmill demos etc. I think we need to say the jury is still out until GAME TESTED and APPROVED. THEN you can say the type of statement such as the one above. Just another reason NV doesn't send you info early ;) No cake a while back like others got, no 965m info today...LOL.

    What would have been NEW, is if they showed it WORKING in games ;) Also we'll see how long it takes to get them here, and as noted (for the first time on AT?) pricing won't be FREE ;) Anyone who thought it would be FREE was smoking some really good stuff :) For the first time I see you actually hinting here it may cost the same as gsync...Well duh. R&D isn't free (for scaler/monitor makers, testing it all etc), and it WILL be passed on to consumers (AMD or NV solution).

    I don't believe NV will support this in any case. They'll just give away Gsync crap cheaper if forced (they only need a break even here, and it sells their gpus) to entrench it if at all possible. There is also a good chance NOT-SO-FREE sync might not be as good as NV's solution (or they'd be showing games running at CES right?), in which case they can keep charging a premium no matter what the price for NOT-SO-freesync ends up being. Many gamers won't accept 2nd best in this case nor "good enough". IMHO it may be "good enough" for some (why not showing 2 dozen games at CES?), but that will allow NV to stay Gsync anyway claiming the premium solution is only on NV hardware. I can deal with that considering my monitors make it 7yrs+ anyway and that just means I'll buy 2 NV cards or so during the monitors life (I upgrade gpus about every 3yrs).

    Agree with Chizow. The market will keep gsync alive if it is BETTER, period. Considering the market shares of both (1/3 AMD, 2/3 NV and NV gaining, monitor makers know these numbers too) they have no reason to favor a smaller player with possibly a WORSE solution that still hasn't been shown GAMING. I'm worried at this point since it's been ages since they showed NOT-SO-freesync, and have yet to show a bunch of GAME DEMOs running. How bad is this tech if we're supposedly Q1 for monitors and NOBODY will show them running GAMES?

    Nobody else finds this odd? Anandtech just acts like it works with no games tested? Nvidia put it in TESTER HANDS to run GAMES (review sites etc). Smell anything fishy people? Having said all that, I'll wait until the next black friday to decide the victor assuming my monitors can live that long (in year 8 or so now...LOL).
  • tuxRoller - Friday, January 9, 2015 - link

    Actually, Intel is what matters, since they have more share than anyone else.
    While adaptive sync is useful for gaming, that's far from its best use. Having a 100% tear-free desktop, perfectly synced videos, and lower power usage are all at least as useful, and will certainly be useful to more people.
  • JarredWalton - Friday, January 9, 2015 - link

    For the record, AMD showed at least two games running with FreeSync. You'll note that there's no "G-SYNC compatible" list of games from NVIDIA for a reason: if the technology works, nothing more is required of the games to enable it! Spouting FUD and posting long anti-AMD diatribes does nothing but create noise. FreeSync was shown running games, and that proves that it can work. I don't know that it's necessarily 100% ready today, but the remaining work is going to be mostly in fine tuning the drivers over the next couple of months.

    If you want to really complain about something, it's that FreeSync requires at least GCN 1.1 to enable the full functionality, so R9 280/280X (7950/7970) and earlier GPUs won't support it AFAICT.
