In direct collaboration with NVIDIA, ASUS announced the world's first 500 Hz G-Sync-enabled gaming display at Computex 2022. The latest ROG Swift is a 24.1-inch display featuring a 500 Hz panel at 1080p resolution, and it comes loaded with features such as NVIDIA's Reflex Analyzer and a new vibrance mode.

Designed for fast-paced eSports titles such as Counter-Strike: Global Offensive, Valorant, and Overwatch, the latest monitor from ASUS's premium ROG Swift range features an impressive 500 Hz E-TN display. Not to be confused with standard TN panels, ASUS's new "eSport TN" technology is claimed to offer 60% better response times.

In the video embedded above, ASUS and NVIDIA give a short demonstration of the benefits of moving to 500 Hz from 240 Hz and 144 Hz, comparing animation smoothness, ghosting, and system latency.

The ASUS ROG Swift 500Hz uses a 24.1-inch panel with a 1080p resolution. It also benefits from NVIDIA G-Sync and NVIDIA's Reflex Analyzer, which is designed to detect system latency: it registers mouse clicks from an NVIDIA Reflex-certified gaming mouse and measures the time until the resulting pixels change on screen.
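For context on what a 500 Hz refresh rate means in raw numbers, here is a quick sketch. This is illustrative arithmetic only; the Reflex Analyzer measures actual click-to-photon latency in hardware, which involves far more than the refresh interval:

```python
# Refresh interval for common gaming refresh rates.
# Shows the display-side latency floor that a higher refresh rate lowers.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (144, 240, 500):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per refresh")
# 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms, 500 Hz -> 2.00 ms
```

Going from 144 Hz to 500 Hz cuts the per-refresh interval from roughly 6.9 ms to 2 ms, a saving of about 5 ms on the display side of the latency chain.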

ASUS also includes a new enhanced vibrance mode, which is specifically tuned for eSports and is built into the monitor's firmware. ASUS claims this allows light to travel through the LCD crystals for better color vibrancy.

As we've seen from previous ROG Swift releases over the years, don't expect this one to be cheap. At the time of writing, ASUS hasn't revealed the expected MSRP of the ROG Swift 500 Hz gaming monitor, nor has it said when the monitor might hit retail shelves.

Source: ASUS

22 Comments

  • niva - Tuesday, May 31, 2022 - link

    The brain can interpolate successfully at very low frame rates, as evidenced by decades of the movie and television industries functioning at 24 - 30 fps. That's Hz, not even kHz. Your notion that vision operates at the MHz and GHz level is very misleading. The eye's ability to "see" intense flashes at the nanosecond level doesn't indicate how overall vision functions in terms of fps. If the eye could detect a very small light intensity change at that interval you might have a point, but that's not what that experiment showed. Same experiment: if you had alternating nanosecond flashes, to our vision they would look like continuous blinding brightness. Matter of fact, if you had those same nanosecond intense flashes only showing up a dozen times each second, evenly spaced apart, the result would be the same.

    Data for pro gamers and their ability to react at the refresh rates shown here (500 Hz vs 144 Hz) are really questionable. We're entering territory where the latency of what you're seeing on screen is no longer driven by the screen; instead you now have guaranteed longer latency from networks and peripheral processing. Maybe there are use cases where this really is important, though, and perhaps one person in a few billion can make use of it.
    Reply
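The back-of-envelope arithmetic behind this point is easy to check. The network and input-processing figures below are assumed typical values, not measurements:

```python
# Rough end-to-end latency budget. All values are illustrative assumptions.
display_144 = 1000 / 144   # refresh interval at 144 Hz, ms
display_500 = 1000 / 500   # refresh interval at 500 Hz, ms
network_rtt = 30.0         # assumed typical online-game round trip, ms
input_proc = 5.0           # assumed USB polling + engine processing, ms

saved = display_144 - display_500
total_144 = display_144 + network_rtt + input_proc
print(f"Going 144 -> 500 Hz saves {saved:.1f} ms "
      f"of a ~{total_144:.0f} ms chain ({100 * saved / total_144:.0f}%)")
```

With those assumptions, the jump from 144 Hz to 500 Hz trims roughly 5 ms off a chain dominated by network and processing delays, which is the commenter's point.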
  • mode_13h - Tuesday, June 7, 2022 - link

    Good points. Overall, I think we agree.

    > The brain can interpolate successfully at very low frame rates,
    > evidenced by decades of movie and television industry functioning at 24 - 30 fps.

    There are confounding factors, such as how CRT phosphors very quickly lose intensity and how movie projectors would use a strobe light. Also, cameras (especially older ones) have some amount of in-built motion blur.

    Conventional LCDs don't strobe their backlights, which leads to the "latched pixel" effect. The consequence is a smearing or blurring of objects as your eyes track them across the screen.

    However, visual acuity rapidly diminishes as movement speeds increase, which should mean there's a point beyond which fast-moving objects no longer need to be as sharp. Beyond that point, you should be able to simply use simulated motion blur to achieve a similar user perception as even higher framerates.

    The idea is that a missile streaking across the screen for a fraction of a second should appear as a blurry streak, rather than appearing unblurred in just a few places.
    Reply
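The simulated motion blur described in this comment can be sketched in a few lines. The 16-pixel one-dimensional "screen" and the sub-frame positions below are made up purely for illustration:

```python
# Toy 1-D illustration of simulated motion blur: instead of drawing a
# fast-moving object at a single position per displayed frame, accumulate
# several sub-frame positions into one frame, producing a streak.

WIDTH = 16  # pixels in our toy 1-D screen

def render_sharp(pos: int) -> list[float]:
    """Object drawn at one position: full intensity at a single pixel."""
    frame = [0.0] * WIDTH
    frame[pos] = 1.0
    return frame

def render_blurred(positions: list[int]) -> list[float]:
    """Average sub-frame positions; intensity spreads into a streak."""
    frame = [0.0] * WIDTH
    for p in positions:
        frame[p] += 1.0 / len(positions)
    return frame

# A "missile" crossing 4 pixels within one displayed frame:
sharp = render_sharp(4)
blurred = render_blurred([4, 5, 6, 7])
print(blurred)  # intensity 0.25 smeared across pixels 4..7
```

The blurred frame carries the same total light energy as the sharp one, but spread along the motion path, which is what the comment argues the eye expects for fast-moving objects.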
  • simonpschmitt - Wednesday, May 25, 2022 - link

    I really like your comment and it improved my day. One can appreciate something new and accept its existence even though it has no personal benefit for oneself. I personally value image quality over fps, but that doesn't have to apply to everyone. And this trend of higher framerates will surely trickle over to other display segments; we already see wide-color-gamut displays running higher than 60 fps.
    Reply
  • linuxgeex - Wednesday, May 25, 2022 - link

    Best is a NO_HZ panel. We're a ways off from having enough bandwidth for that, but if the GPU were integrated with the display, then the display raster could be the framebuffer, and the display could do updates in the 100 kHz range even for a 4K display. We have thousands of stream processors bottlenecked by an expensive and slow display cable, so this is a natural evolution which will arrive some day.
    Reply
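A quick sanity check on the bandwidth side of this claim. Pushing full 4K frames at 100 kHz over a cable would need orders of magnitude more bandwidth than current display links offer (DisplayPort 2.1 tops out around 80 Gbit/s), which is why the comment proposes putting the framebuffer on the display side:

```python
# Bandwidth required to scan out full 4K frames at a 100 kHz update rate.
width, height, bits_per_px = 3840, 2160, 24  # 4K, 8 bits per channel
update_hz = 100_000

gbits = width * height * bits_per_px * update_hz / 1e9
print(f"{gbits:,.0f} Gbit/s required")  # vs ~80 Gbit/s for DisplayPort 2.1
```

At roughly 20,000 Gbit/s, no external cable comes close, so partial updates from an on-display framebuffer would be the only plausible route.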
  • mode_13h - Wednesday, May 25, 2022 - link

    > if the GPU was integrated with the display then the display raster could be framebuffer

    I don't see how that's supposed to work. If you render directly to the on-screen framebuffer, you get a flickery, jumbled mess.

    You really don't want to watch frames as they're drawn. You only want to see the fully rendered frame.

    > We have thousands of stream processors bottlenecked by an expensive and slow display cable

    Have you looked at game benchmarks lately? 4K with high detail? In most cases, GPUs aren't close to being limited by the display refresh rate.
    Reply
  • Makaveli - Tuesday, May 24, 2022 - link

    24" 1080p 500 Hz TN monitor? Hmm, ya, hard pass. Do gamers really not care about image quality at all?
    Reply
  • meacupla - Tuesday, May 24, 2022 - link

    Asus TN gaming monitors have historically had good image quality, for a TN panel.
    I own a PG278Q, and I don't really have any issues with it.
    Reply
  • 7BAJA7 - Tuesday, May 24, 2022 - link

    Had the same screen and it was good, but it's outdated now. Changed to a 32" ASUS ROG VA screen and oh boy, not only the difference in size, but the colors and blacks are so much better; no ghosting either.
    Reply
  • Ryan Smith - Tuesday, May 24, 2022 - link

    "Gamers really don't care about image quality at all?"

    Competitive gamers don't. This is the same crowd that turns down image quality settings for lower frame render times. It's all about trying to minimize the end-to-end latency.
    Reply
  • PeachNCream - Wednesday, May 25, 2022 - link

    This is for "professionals" that play "e-sports" (LOL), not for us mere plebs that work at jobs that actually add value to our civilization. You've got to be a sweaty basement dweller with delusions of winning a trophy that says you clicked a mouse button better in Fortnite than some other slob that is disinterested in the idea of education or a career.
    Reply
