FreeSync Gaming on the LG 34UM67

The LG 34UM67 is a great example of what LG does right as well as where it falls short. FreeSync is a technology largely geared towards gaming, but LG strikes new ground in ways that can be both good and bad. The UltraWide 21:9 aspect ratio can be a blessing or a curse, depending on the game – even in 2015, there are sadly numerous games where the aspect ratio causes problems. When it works, it can provide a cinematic experience that draws you into the game; when it doesn’t, you get stretched models and a skewed aspect ratio. Sometimes registry or configuration file hacks can fix the problem, but 21:9 is still new enough that it doesn’t see direct support in most games.


My, Lara, you’ve let yourself go….


Tomb Raider with registry hack to fix the aspect ratio (using 2560/1080*10000 = 23703/0x5c97).
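For reference, the value used in that hack is just the target aspect ratio scaled by 10,000 and written out in hex; a quick sketch of the arithmetic (Python, purely illustrative):

```python
# The aspect ratio value from the Tomb Raider hack is simply
# (width / height) * 10000, truncated to an integer and expressed in hex.
width, height = 2560, 1080
value = int(width / height * 10000)
print(value, hex(value))  # 23703 0x5c97
```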

Similarly, the use of an IPS panel can be good and bad. The good news is that you get the wide viewing angles associated with IPS – and really, for a 34” display you’re going to want them! – but at the same time there’s a question of pixel response times, with most IPS panels rated at 5ms+ compared to TN panels rated at 1-2ms. LG specifies a response time of 14ms for the 34UM67, though they don’t mention whether that’s GtG or Tr/Tf. There’s also a setting in the OSD to improve response times, which we experimented with while capturing the following images at a 1/400s shutter speed. In the gallery below, we also compare the LG 34UM67 with the ASUS ROG Swift to show how the two panels handle the same content (from AMD’s FreeSync Demo).


LG 34UM67


ASUS ROG Swift PG278Q

My personal opinion is that LG's 14ms response time value may be incorrect, at least depending on the setting. The ASUS ROG Swift clearly has a faster response time in the above images and gallery, and if we compare best-case ghosting results, the “Normal” setting on the ASUS is very good while even the “High” setting on the LG still shows about two-thirds of the blades ghosting – I had some other images where the ghosting indicates the transition between frames occurs by the time around half of the display has been updated (~8ms). But the windmill in AMD's FreeSync demo is actually something of a best-case scenario if you happen to enable overdrive features. Let's look at what may be less ideal: F1 2014.


LG 34UM67 with Image Response on High


ASUS ROG Swift PG278Q with "Normal" Overdrive

Here the tables turn, with Normal Overdrive on the ASUS display causing some rather obvious artifacts, and if you enable Extreme Overdrive it can be very distracting. The LG display by comparison doesn't show any artifacting from increasing the Response Time setting, and at High it shows much less ghosting than in the windmill demo. I still like the higher refresh rates of the ASUS display, but I also very much prefer the IPS panel in the LG. The long and short of the response time question is that it's going to depend at least in part on the content you're viewing. Personally, I was never bothered by ghosting on the 34UM67; your mileage may vary.

Perhaps the biggest flaw with the LG 34UM67, however, comes down to the implementation of FreeSync. While FreeSync is in theory capable of supporting refresh rates as low as 9Hz and as high as 240Hz, in practice the display manufacturers need to choose a range where their display will function optimally. All liquid crystal cells lose their charge over time, so if you refresh the display too infrequently you can get an undesirable flicker/fade effect between updates. The maximum refresh rate is less of a concern, but if the pixel response time is too slow then refreshing faster won’t do any good. In the case of the 34UM67 and 29UM67, LG has selected a variable refresh rate range of 48 to 75 Hz. That can be both too high (on the minimum) and too low (on the maximum).

What that means is that as long as you’re running at 48 FPS to 75 FPS in a game, everything looks nice and smooth. Try to go above that value and you’ll either get some moderately noticeable tearing (VSYNC Off) or else you’ll max out at 75 FPS (VSYNC On), which is also fine. The real issue is when you drop below 48 FPS. You’re basically falling back to standard LCD behavior at that point, so either you have very noticeable tearing with a 75Hz refresh rate (AMD tells us that they drive a display at its max refresh rate when the frame rate drops below the cutoff) or you get stutter/judder from subdividing a sub-48 FPS frame rate into a 75Hz refresh rate. This is definitely an issue you can encounter, and the limited 48-75 Hz FreeSync range is a real concern.
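To put some numbers on that behavior, here is a minimal sketch (Python, purely illustrative) of the 48-75 Hz window expressed as frame times, along with what the paragraph above describes happening at a few frame rates:

```python
# LG 34UM67 FreeSync window: 48-75 Hz, i.e. frame times between
# roughly 13.3 ms (1000/75) and 20.8 ms (1000/48).
VRR_MIN_HZ, VRR_MAX_HZ = 48, 75

def freesync_behavior(fps, vsync_on=True):
    """Rough summary of the behavior described above at a given frame rate."""
    if fps > VRR_MAX_HZ:
        return "capped at 75 FPS" if vsync_on else "moderately noticeable tearing"
    if fps < VRR_MIN_HZ:
        # Below the window the display reverts to a fixed 75 Hz refresh:
        # stutter/judder with VSYNC on, tearing with VSYNC off.
        return "stutter/judder" if vsync_on else "tearing at 75 Hz"
    return "smooth variable refresh"

for fps in (30, 45, 60, 90):
    print(f"{fps} FPS: VSYNC on -> {freesync_behavior(fps)}, "
          f"VSYNC off -> {freesync_behavior(fps, vsync_on=False)}")
```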


Tearing is visible in the center of the windmill.

Some will point at this and lay the blame on AMD’s FreeSync and/or DisplayPort Adaptive-Sync, but really that’s just a standard that allows the GPU and display to refresh at variable rates. The real problem here is the minimum refresh rate chosen by the manufacturer. AMD can still potentially improve the situation with driver tweaks (e.g. sending a frame exactly twice when the GPU falls below the minimum supported refresh rate), but while that should work fine on something like the Acer or BenQ FreeSync displays that support 40-144Hz, the two LG displays (34UM67 and 29UM67) have both the highest minimum and the lowest maximum refresh rate, so it won’t work quite as well. Of course all that is a moot point with the current AMD drivers, which leave you with a choice between tearing or judder at <48 FPS.
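As a rough illustration of why that frame-doubling idea runs into trouble with a 48-75 Hz window (this is my own sketch, not AMD's actual driver logic):

```python
# Hypothetical frame-multiplication: if the game's frame rate falls below
# the panel's minimum, present each frame N times so the effective refresh
# rate lands back inside the supported window.
def multiplied_refresh(fps, vrr_min, vrr_max):
    if fps >= vrr_min:
        return fps, 1                   # already in range, nothing to do
    n = 2
    while fps * n < vrr_min:
        n += 1
    if fps * n <= vrr_max:
        return fps * n, n               # e.g. 30 FPS -> 2x -> 60 Hz
    return None, n                      # no multiple fits the window

# LG 34UM67 (48-75 Hz): 30 FPS doubles cleanly to 60 Hz, but 40 FPS doubled
# is 80 Hz, which overshoots the 75 Hz maximum, so roughly 37.5-48 FPS has
# no multiple that fits the window.
print(multiplied_refresh(30, 48, 75))    # (60, 2)
print(multiplied_refresh(40, 48, 75))    # (None, 2)

# Acer/BenQ (40-144 Hz): any sub-40 FPS rate has a multiple that fits,
# e.g. 39 FPS -> 78 Hz.
print(multiplied_refresh(39, 40, 144))   # (78, 2)
```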

Ultimately, the gaming experience on the LG 34UM67 ends up being both better and worse than what I’ve seen with G-SYNC. It’s better in the sense that IPS is better – I’ve had a real dislike of TN panels for a decade now, for all the usual reasons. I’m not bothered by the response times either, and armed with an AMD Radeon R9 290X there are really not too many occasions where falling below 48 FPS is a problem. We typically look at 2560x1440 Ultra quality settings when comparing high-end GPUs, and the R9 290X is usually able to hit 48 FPS or higher in most recent games. Where it falls short, dropping to Very High or High settings (or disabling 4xMSAA or similar) is usually all it takes. Now couple that with 25% fewer pixels to render (2560x1080 vs. 2560x1440) and you will typically see frame rates improve by 20% or more compared to WQHD. So if you have an R9 290X (which can be had for as little as $310 these days), I don’t see falling below 48 FPS as a real problem… but going above 75 FPS will certainly happen.
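For reference, the pixel-count arithmetic behind that comparison (illustrative only; the 20%+ figure is an empirical observation, since frame rates rarely scale perfectly with pixel count):

```python
# How much less rendering work per frame at 2560x1080 vs. 2560x1440?
uw_fhd = 2560 * 1080   # 2,764,800 pixels
wqhd   = 2560 * 1440   # 3,686,400 pixels
print(f"{1 - uw_fhd / wqhd:.0%} fewer pixels")   # 25% fewer pixels
# Perfect scaling would mean a ~33% higher frame rate (1/0.75 - 1);
# in practice the review observes gains of 20% or more.
```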

On lesser hardware the story isn’t quite so rosy, unfortunately. The $240 Radeon R9 285 will mostly require High settings at 2560x1080 in demanding games, and if you have anything slower than that you will frequently not hit the 48-75 FPS sweet spot. Since the primary reason to buy a FreeSync capable display is presumably to avoid tearing and judder (as much as possible), what we’d really need to see is panels that support variable refresh rates from 30-100+ Hz at a minimum. The Acer and BenQ FreeSync displays are closer (40-144 Hz), but the 30-40 FPS range is still going to be a better experience on G-SYNC right now. If AMD can tweak their drivers to understand the minimum refresh rates of FreeSync monitors they might be able to work around some of the issues (e.g. by sending two frames at 78 Hz instead of one frame at 39 Hz), but until/unless that happens there are cases where G-SYNC simply works better.

Of course, G-SYNC displays also carry a price premium, but some of the price difference appears to go towards providing better panels or at least a "better" scaler. Again, this isn’t a flaw with FreeSync so much as an issue with the current generation of hardware scalers and displays. Long-term I expect the situation will improve, but waiting for driver updates is never a fun game to play. Perhaps more importantly however, the FreeSync displays are at worst a better experience on AMD GPUs than the normal fixed refresh rate monitors that have been around for decades. AMD can’t support G-SYNC, so the real choice is going to be whether you want to buy a FreeSync display now or use a “normal” display. The price premium doesn’t appear to be any more than $50, and it might be even lower once the newness fades a bit. Everything else being equal, for AMD GPUs I’d rather have FreeSync than not, which seems like the goal AMD set out to achieve.

Comments

  • wigry - Wednesday, April 1, 2015 - link

    In the article, it was said that the 34" ultra wide screen is enormous - bigger than most people are used to. Well, it might be at first glance, but you get used to it within a day or two. However, let's see what a 34" monitor really is.

    The 34" monitor with 21:9 aspect ratio is nothing more than standard 27" display, 16:9 aspect and 2560x1440 resolution with extra 440 pixels added to both sides to extend the width to get 21:9 aspect ratio.

    Therefore if you are used to, for example, a 30" 16:9 display, then going to a 34" 21:9 is going backwards to a smaller display.

    I personally upgraded from 23" 16:9 display to 34" 21:9 curved ultra wide (Dell U3415W) and am very satisfied.

    Anyhow the 34" display is not enormous, just a bit bigger - if you are used to a 27" then it is the same but a bit wider, if you are used to a 30" then it is already a smaller display.
  • dragonsqrrl - Wednesday, April 1, 2015 - link

    "The 34" monitor with 21:9 aspect ratio is nothing more than standard 27" display, 16:9 aspect and 2560x1440 resolution with extra 440 pixels added to both sides to extend the width to get 21:9 aspect ratio."

    Except this monitor doesn't have a 3440x1440 panel, it's 2560x1080...

    "Therefore if you are used to for example 30" 16:9 display then going to 34" 21:9 is going backwards to smaller display."

    What do you mean by smaller? He stated in the conclusion that it's significantly wider than his old 30" monitor.
  • JarredWalton - Wednesday, April 1, 2015 - link

    Exactly. The LG 34UM67 measures 32.5" wide; my old 30" WQXGA measures 27.5" wide. On most desks, a five inch difference is quite noticeable, and going from a 27" display that was 25.5" wide makes it even more noticeable. Is it bad to be this big, though? Well, not if you have the desk space. I still want the 3440x1440 resolution of the 34UM95, though.
  • wigry - Thursday, April 2, 2015 - link

    Well, all I can say is that for many, the vertical space is more important than horizontal space. Many refuse to go from 1200 to 1080 vertical pixels regardless of the width. So if converting to an ultra wide screen, watch out for the vertical dimensions, both physical size and pixel count, and make sure that you are willing to make the necessary compromises.

    Also regarding the reference to a 27" monitor, I again took my own Dell as an example (for some reason I assumed that all LG panels are also 1440 px high). However, as the height is 1080, it is comparable to a 1920x1080 display that is stretched to 2560 pixels (320 pixels added to both sides).
  • Ubercake - Wednesday, April 1, 2015 - link

    I love LG IPS panels. My television is an LG IPS type. They are among the best I've seen. Color accuracy is not something I consider important while gaming so I don't get hung up on this aspect. Also, viewing angles are important if you don't sit directly in front of the monitor when you game, but who in the heck doesn't sit dead center while they game? Viewing angle is another of the less important aspects of a gaming monitor.

    This monitor offers far better contrast than any G-sync monitor so far and contrast is really important when your enemy is camped out in the shadows in a multi-player FPS and should absolutely be considered when looking for a gaming monitor. I also like the resolution/aspect ratio of this monitor for gaming.

    Three things that would keep me from buying this monitor:

    1) Can't go below 48 fps or above 75 fps without introducing tearing. Games like Crysis 3 or those games using TressFX like Tomb Raider most definitely bring framerates below the 48Hz/48 fps horizon with details and AA cranked for a 290x or GTX 980. Check multiple benchmarks around the web and you'll see what I mean. Why bother? You have a range of 27fps (48Hz-75Hz) in which your games have to run in order to get any free sync advantage.

    2) AMD stated there wouldn't be a price premium, yet there is. All the hype prior had every AMD rep saying there is no added cost to implement this technology, yet there really is because there is a change to the production process. Apparently, many manufacturers have not bought into the adaptive sync "standard" yet.

    3) The color gamut on the ROG Swift is slightly better than this IPS monitor. I stated color accuracy is not that important to me, but if I'm buying an IPS monitor, it better provide better color accuracy than a TN.

    Also, input lag is a measurable aspect. Not sure why this was essentially left out of the review.
  • Crunchy005 - Wednesday, April 1, 2015 - link

    Ya, I agree that the 27Hz range is dumb; it needs a wider range, and FreeSync can support a much wider range. FreeSync actually has a far wider range than G-Sync, so when a monitor comes out that can take advantage of it, it will probably be awesome. I'm sure the added cost premium is the manufacturer trying to make a few bucks off of something "new" - not really AMD's fault, and nothing they or anyone can do about it except LG. Also it might cost more because you get a different scaler that might be higher quality than another, who knows.
  • Ubercake - Wednesday, April 1, 2015 - link

    Potential is nothing unless realized.

    This is a poor implementation with that limited frequency range.

    I find the best part about the dynamic refresh monitors is that, for instance, in the case of a GTX 980 and a ROG Swift monitor, you can use one flagship video card with the G-sync monitor and that's all you need for great gaming performance.

    No more multi-card setup is needed to crank the frame rates out of this world on a high-refresh monitor in order to minimize tearing.

    As long as you have a card that keeps frame rates near 30 and above at a given resolution and detail level, you get great performance with the ROG Swift.

    With this LG monitor, you're going to have to keep the frame rates consistently above 48 fps to get equivalent performance. This may seem easy with most titles and a 290 or 290X, but like I said earlier, try something like Crysis 3 or Tomb Raider and you'll find yourself below 48 fps pretty often.
  • JarredWalton - Wednesday, April 1, 2015 - link

    Crysis 3 and Tomb Raider run above 48 FPS quite easily on a 290X... just not with every possible setting maxed out (e.g. not using SSAA). But at Ultimate settings, 2560x1080, Tomb Raider ran 72.5 FPS average and a minimum of 58 FPS.

    Crysis 3 meanwhile is quite a bit more demanding; at max (Very High, 4xMSAA) settings it ran 33.2 FPS average, with a minimum of 20.1 FPS. Dropping AA to FXAA only helps a bit: 40.9 FPS average and 25.4 FPS minimum. Drop the machine spec from Very High to High, however, and it bumps to 60.6 FPS average and 45 FPS minimum. If you want to drop both Machine Spec and Graphics to High, 48+ FPS isn't a problem and the loss in visual fidelity is quite small.
  • gatygun - Tuesday, June 30, 2015 - link

    And at the end there is no 21:9 G-Sync monitor, so that 980 will be useless. Also the Swift costs about 400 more than the 29UM67 (which should be looked at, not the 34" model, as the PPI is absolute shit), and the 980 costs 200 more than a 290X. That means it's pretty much paying double the price in total.

    Is it worth it? Sure, but the price is just way too expensive, and you won't have a 21:9 screen but a 1440p screen, which will result in needing more performance than a 1080p ultra wide screen.

    Getting 48 fps should be looked at like getting a stable 60 fps. Even a 670 can run ~40 fps in Crysis 3 at a 3440x1440 ultra wide screen resolution; a single 290 won't have issues with maintaining 48 fps if you drop the settings a few notches. Most ultra settings are just there to kill performance anyway for people to keep on buying high performance cards.

    The 29UM67 in my view is a solid product and a good cheap 21:9 alternative which performs solidly. The 14 ms they talk about isn't grey to grey; it's 5ms, which for IPS is pretty much the best you can get. This screen is about as fast in input lag as any 5ms TN gaming monitor.

    It also helps that it features 75hz.
  • gatygun - Tuesday, June 30, 2015 - link

    In addition, it's pretty much the best gaming 21:9 monitor on the market, for a cheap price on top of it.
