46 Comments

  • damianrobertjones - Wednesday, June 20, 2018 - link

    Ah. Is VRR going to become the new 'buzzword' that's used to sell us the same stuff we already own, for more cash? All I'd like is an actual HIGH QUALITY picture, such as OLED, instead of having to choose from 30 sets from the same OEM.
  • A5 - Wednesday, June 20, 2018 - link

    Good HDR/WCG on more 2019 sets + the bundle of side features from HDMI 2.1 should finally be enough of an upgrade to really justify getting off a 1080p set for a lot more people.

    It will be interesting to see how TV makers try to market stuff like QMS and eARC.
  • nevcairiel - Wednesday, June 20, 2018 - link

    VRR doesn't interest me, and probably not the majority of TV buyers either, since it's a gaming feature. But the increased bandwidth of HDMI 2.1 is what's really winning me over.

    OLED doesn't interest me since it has too many downsides for me, but I hope we see good Mini-LED backlit screens as well; actual MicroLED screens will probably still take a few more years, and even more years to become affordable.
  • surt - Wednesday, June 20, 2018 - link

    How sure are you that the majority don't play games on TV?
  • nathanddrews - Thursday, June 21, 2018 - link

    In fact, we know using simple math based on the number of video game consoles sold in the US (and globally) and surveys that the majority of households play console video games on televisions. As the gaming generation(s) continues to earn more money and buy new gear, gaming-centric features will only become more important to buyers.

    Like someone else already said, VRR is a godsend for all content. Why bother converting frame rates when you can just match them with the utmost precision? VRR is the best thing HDMI has done since combining audio in the same cable.
  • Camaxide - Thursday, September 27, 2018 - link

    And that's without counting all of us who play PC games on the TV - which I've done for years, as have many others. For both console and PC, lower input lag is amazing and important, as current tech is simply unusable for competitive gaming due to high latency and, most of the time, low refresh rates.
  • caqde - Wednesday, June 20, 2018 - link

    It could in the future be used to align the image data with the screen even for TV shows and movies. Games benefit more because they run at uneven framerates, but it could still improve the movie/TV watching experience by sending picture data to the screen at the correct intervals and ensuring only complete images are displayed, reducing any possible tearing from frames misaligned with the screen's refresh rate. (24 Hz, 23.98 Hz, 25 Hz, 29.97 Hz, 30 Hz, 50 Hz, 59.94 Hz, 60 Hz)
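    The cadence arithmetic behind this point can be sketched in a few lines. This is purely illustrative (nothing from the HDMI spec); the panel rates are assumed typical fixed-refresh panels:

```python
# Illustrative sketch: which common video frame rates map onto a fixed-rate
# panel with an even cadence (each source frame shown a whole number of times)?
video_rates = [24, 23.976, 25, 29.97, 30, 50, 59.94, 60]
panel_rates = [60, 120]  # assumed typical fixed panel refresh rates

def repeats_evenly(video_hz, panel_hz):
    """True if panel_hz is a whole multiple of video_hz."""
    ratio = panel_hz / video_hz
    return abs(ratio - round(ratio)) < 1e-9

for v in video_rates:
    for p in panel_rates:
        verdict = "even cadence" if repeats_evenly(v, p) else "uneven (pulldown judder or resampling)"
        print(f"{v} fps on a {p} Hz panel: {verdict}")
```

    Only 30 and 60 fps land evenly on a 60 Hz panel; 24 fps needs a 120 Hz panel, and the 25/50 Hz PAL rates and the 1000/1001 NTSC-style rates fit neither 60 nor 120 Hz - exactly the mismatches VRR sidesteps.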
  • BurntMyBacon - Thursday, June 21, 2018 - link

    This!
  • mode_13h - Friday, June 22, 2018 - link

    My TV from 2013 already had native support for these framerates. The point of VRR is *variable* framerates.
  • Diji1 - Friday, June 22, 2018 - link

    It doesn't have native support for those, your TV is either 60 or 120Hz.
  • mode_13h - Sunday, June 24, 2018 - link

    My TV is a plasma with a 480 Hz refresh rate. The input formats it accepts include PAL (25 and 50 Hz). My Blu-ray player includes a true 24 FPS output option, which also works as advertised.

    And Wikipedia contradicts you:

    "HDMI 1.2 and all later versions allow any arbitrary resolution and frame rate (within the bandwidth limit)."

    https://en.wikipedia.org/wiki/HDMI#Refresh_frequen...
  • Camaxide - Thursday, September 27, 2018 - link

    Yes and no - your plasma TV cannot take a 480 Hz signal; it simply up-"samples" a, say, 50 Hz signal, adding extra interpolated images to smooth out the motion. TVs that can handle all the different frequencies listed will still benefit from HDMI 2.1, as it switches between the various modes much faster than before, whereas today a mode swap usually takes more time.
  • wolrah - Tuesday, June 26, 2018 - link

    Part of that variability is allowing for seamless switching between framerates. Most media playback devices do not switch framerates when switching between content because it causes a resync, which with a complicated home theater setup can mean multiple seconds of "no signal" before everything comes back.

    If you can go from watching a 60 FPS US TV show to a 24 FPS movie to a 50 FPS UK TV show without your system resyncing, then at least one of those is being displayed incorrectly. Since most "120Hz" TVs don't actually support 120 Hz inputs, odds are pretty good that both the movie and the TV show not matching your local standard are wrong.

    VRR allows for seamless switching, allowing all content to be played at its native rate without annoying interruptions.
  • mode_13h - Tuesday, June 26, 2018 - link

    That's another thing HDMI really needs - faster renegotiation!

    BTW, my disc player is an OPPO, which I leave set to "Source Direct" mode. So, it plays the exact framerate, resolution, and color space of the original source material. My TV's deinterlacing and motion interpolation work much better that way. Probably better scaling, too.

    Switching resolutions doesn't result in a black screen for long - not nearly as long as turning on/off my A/V receiver, which has a pass-thru function when it's off. I guess some of that time is probably taken by the receiver's boot-up sequence, but I suspect its capabilities only need to be queried once and are simply read from cache on subsequent mode changes.
  • euler007 - Wednesday, June 20, 2018 - link

    HDMI 2.0's bandwidth became an issue for people wanting to do 4K 60 fps with HDR at 4:4:4 chroma. What are you doing with your TV that you need the extra bandwidth if it isn't 4K gaming? Incredibly high quality porn?
  • Diji1 - Friday, June 22, 2018 - link

    >VRR doesn't interest me, and probably neither the majority of TV buyers

    You aside - who apparently wants lower quality for some reason - I'm pretty sure the majority of TV buyers want to watch content at higher quality. With variable refresh rate there is no need for motion processing on content whose framerate doesn't divide evenly into 60 (or 120).
  • mode_13h - Thursday, June 21, 2018 - link

    Only A/V geeks will care about eARC. But, if you have run into the limitations of existing ARC (Audio Return Channel), you'll definitely want eARC.

    Basically, existing ARC is limited to compressed 5.1-channel audio and just the formats supported by ATSC (the HDTV terrestrial broadcast standard). As long as you're just using it to route the audio signal from broadcast TV to your A/V receiver, it's no problem. But in this era of high-quality internet streaming, you can easily exceed its capabilities, forcing the TV to down-convert and re-compress the audio.
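    A rough back-of-the-envelope calculation shows why lossless multichannel audio doesn't fit over old ARC. The figures here are my own assumptions, not from the article: legacy ARC is roughly S/PDIF-class (a few Mbit/s), while eARC is commonly quoted at up to ~37 Mbit/s:

```python
# Sketch: bandwidth needed for uncompressed 7.1-channel PCM audio.
channels = 8           # 7.1 speaker layout
sample_rate = 192_000  # samples per second, per channel
bit_depth = 24         # bits per sample

pcm_mbps = channels * sample_rate * bit_depth / 1e6
print(f"8ch 192 kHz / 24-bit PCM: ~{pcm_mbps:.1f} Mbit/s")  # far beyond legacy ARC
```

    That lands right around the ~37 Mbit/s ceiling usually cited for eARC, and far beyond the few Mbit/s of an optical-S/PDIF-class legacy ARC link - hence the forced down-conversion.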
  • CheapSushi - Thursday, June 21, 2018 - link

    This is QLED though, which is often better. The ultimate is MicroLED.
  • mode_13h - Friday, June 22, 2018 - link

    Or OLED, for people who can afford to replace their TV every time they start to notice burn-in.
  • SirPerro - Wednesday, June 20, 2018 - link

    I really like the changes that are coming, but I pity the early adopters of this decade.
  • Lindegren - Wednesday, June 20, 2018 - link

    What I would find useful is a USB-over-HDMI part. It would be nice to have keyboard, soundcard and mouse connected to the TV, and the PC in another room - one cable only.

    What I find irrelevant is a 16-bit colourspace. Who needs 65536 colour gradients per channel? Honestly, 1024 is more than a normal person would see anyway.
  • Kevin G - Wednesday, June 20, 2018 - link

    It isn't necessarily about the raw number but how those gradients per channel are divided. An 8 bit smooth gradient per RGB channel is simple to handle in terms of processing and handing it off to a display but doesn't necessarily visually look that good compared to what can be done with 8 bits in other color spaces.

    The other nice thing about higher bit depths is that conversion between color spaces can be more precise. Speaking of which, video content is often mastered in a YUV color space vs. the RGB that displays use.
  • mode_13h - Thursday, June 21, 2018 - link

    8 bits per channel is always going to come up short, no matter the color space.
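    For concreteness, the step counts being argued about here are just powers of two (simple arithmetic, nothing HDMI-specific):

```python
# Gradient levels per channel at common bit depths.
steps = {bits: 2 ** bits for bits in (8, 10, 12, 16)}
for bits, levels in steps.items():
    print(f"{bits}-bit: {levels:>5} levels per channel")
```

    Whether 1024 steps suffice depends on the transfer function and peak brightness - a 10-bit PQ-encoded HDR signal spends its codes very differently from an 8-bit gamma ramp, which is the earlier point about how the gradients are divided.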
  • SydneyBlue120d - Wednesday, June 20, 2018 - link

    I wonder if we'll ever see a TV with DisplayPort 1.4 input...
  • wr3zzz - Wednesday, June 20, 2018 - link

    What is the point of setting a standard and then saying manufacturers don't need to adhere to it 100%? If certain requirements are not ready in all markets but in demand in some, then call one HDMI 2.1a and the other HDMI 2.1b.
  • nevcairiel - Wednesday, June 20, 2018 - link

    If you try to distinguish by version number, you get into a real mess really fast. Sure, with HDMI 2.1 it may only be "a" and "b", but with HDMI 2.2 you add another one, now you're at abcd, and it just gets worse from there.

    Optional features are fine, as long as manufacturers properly document which features are available.
  • wr3zzz - Thursday, June 21, 2018 - link

    If a function can be optional then it is technically not "standard". Anything that is optional in 2.1 should not be part of 2.1 but left for consideration for the 2.2 "standard". Manufacturers can and do brag about features all the time but should not be allowed to use the 2.1 moniker if they are not 100% certified.
  • mode_13h - Friday, June 22, 2018 - link

    That's missing the point. The standard is there to ensure interoperability between all devices which *do* implement it. If you don't have the standard dictating how to implement a given feature, there's pretty much 0% chance of two different manufacturers' devices being compatible.

    As an example, I point to lip-sync auto-calibration, before HDMI finally tackled it. Early HDMI support for lip-sync was apparently useless without auto-calibration, which the standard didn't address. Manufacturers arrived at proprietary solutions, which helped virtually no one. It wasn't until HDMI took another swing at lip-sync that the problem was finally solved for people.
  • mode_13h - Friday, June 22, 2018 - link

    You're also missing the fact that some of these new features are quite high-end. So, if you force any 2.1 implementations to tackle *all* of it, then what will happen is the industry will virtually ignore 2.1 and remain stuck at 2.0.

    You'd think the high-end products could move forward, but even the high end counts on the economies of scale driven by lower-end products, since some of the silicon is often shared between them. A regime where products can gradually adopt the new features creates a smooth transition path and affords greater economies to high-end products that share silicon with the lower end. This paves the way for more features to trickle down to the low end.
  • vanilla_gorilla - Wednesday, June 20, 2018 - link

    WTF is VRR? I don't see it defined anywhere.
  • surt - Wednesday, June 20, 2018 - link

    Variable Refresh Rate.
    aka support for gaming devices to deliver pictures as completed rather than having to align with a fixed pace refresh rate.
  • mode_13h - Thursday, June 21, 2018 - link

    "Last year the HDMI Forum introduced a more industry-standard approach to variable refresh rate as a part of the HDMI 2.1 package, and recently makers of consumer electronics started to add VRR support to their products."

    So it was there, although it could've been clearer.
  • sunbear - Wednesday, June 20, 2018 - link

    HDMI is completely ill-conceived. All of the video streams on the source end of the cable are already compressed, but stupid HDMI insists on not taking advantage of that fact.

    Let’s say the bitrate from a UHD disc player source is 50 Mbps. HDMI requires that the player decompress that bit stream BEFORE sending it down the HDMI cable. Stupid! If they sent the stream as-is and decompressed it in the TV itself, then no one would need a cable capable of 48 Gbps; cheap category cable would do the job fine, even over 50-meter runs (no way any non-optical HDMI cable will be capable of that).

    And before people say “it’s too much to expect the TVs to be able to decompress/decode lots of different types of compressed streams”, remember that most TVs nowadays run Netflix, Amazon video, etc that do exactly that.
  • mode_13h - Thursday, June 21, 2018 - link

    No.

    HDMI needs to support uncompressed content, say from a games console or PC graphics card. So, you can't just cut the supported bandwidth to 50 Mbps or whatever. The cable & display device must support either uncompressed or (as the standard now has added) lossless compression.
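    A quick worked example of the raw numbers (active pixels only, ignoring blanking intervals and link-layer encoding overhead, so these are lower bounds):

```python
# Sketch: raw bandwidth of uncompressed video, counting active pixels only.
def raw_gbps(width, height, fps, bits_per_channel, channels=3):
    """Gbit/s of raw pixel data for a given mode."""
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"4K60  8-bit RGB:  {raw_gbps(3840, 2160, 60, 8):.1f} Gbit/s")
print(f"4K120 10-bit RGB: {raw_gbps(3840, 2160, 120, 10):.1f} Gbit/s")
```

    Roughly 12 and 30 Gbit/s respectively before any overhead - around three orders of magnitude above a ~50 Mbit/s disc bitstream, which is why a console or GPU feeding uncompressed frames needs the fat 48 Gbit/s link.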

    The other scenario you're missing is devices which either apply some overlay, such as A/V receivers, or which do some post processing (i.e. video processors and outboard scalers).

    Next time, I hope you'll think a little harder and maybe educate yourself a bit more, before presuming to be such a vastly superior intellect.

    BTW, streaming compressed video over such distances is already a solved problem. Many TVs support DLNA. So, just use a media server + Ethernet or wifi. No need for HDMI to add support for that use case. And one can even run Ethernet over HDMI, for cables and devices that support it.
  • johnthacker - Thursday, June 21, 2018 - link

    Does Ethernet over HDMI really get used? I haven't seen it in the field. Now, HDMI over Ethernet, that is indeed a popular way to solve the problem of needing long cable runs.
  • mode_13h - Friday, June 22, 2018 - link

    My 2013-era TV can stream via DLNA from devices connected to it that aren't networked. So, I'd say it's there and it works, but you must have both a cable and devices which support it.

    Cable support is required due to leveraging previously-unused pins. I think the same pins might also be used for ARC.

    One of the most annoying things about HDMI is that cables need standard markings so that you can tell which cable supports what speeds and features.
  • Stefan75 - Monday, July 2, 2018 - link

    Good luck with HDMI over Ethernet... HDMI 2.0 = 18 Gbit/s
  • 29a - Monday, June 25, 2018 - link

    Your reply was great until you got to:

    "Next time, I hope you'll think a little harder and maybe educate yourself a bit more, before presuming to be such a vastly superior intellect."

    Then you just turned into a dick, try being a little more friendly next time.
  • mode_13h - Tuesday, June 26, 2018 - link

    Well, let's see. I'm replying to a comment that starts:

    "HDMI is completely ill-conceived."

    ...not an innocent question, like:

    "Hey, why doesn't HDMI just keep the signal compressed?"

    Okay, so right off the bat, @sunbear is presuming that HDMI was slapped together by a bunch of idiots too dumb to do something so obvious. Such a harsh assumption deserves no less harsh a rebuke.

    I have very little patience for people who assume they know better than the developers of the tech we all use. Not that it's perfect, but to litigate your case on the basis of such facile analysis demeans us all.
  • A5 - Thursday, June 21, 2018 - link

    Doing decompression at the source instead of the sink makes it possible for a TV to last 10+ years.

    Imagine you bought a 1080p TV in 2008 that could only decode MPEG2 - it would be useless for everything besides OTA broadcasts and playing DVDs.
  • johnthacker - Thursday, June 21, 2018 - link

    Yeah, and the problem only multiplies when ARC/eARC comes into play. Even people who buy a new TV fairly often want to hold onto their sound bars or receivers. It *would* be possible to demand frequent software and firmware updates to deal with this on TVs or sound equipment, but it makes more sense in most situations to do it on the source side, since that's the new equipment newly supporting some format.
  • Strunf - Thursday, June 21, 2018 - link

    So FreeSync is not as free as AMD pretends it to be?... OK, it's FreeSync over HDMI, but considering most consoles use HDMI, this was a clever way for AMD to market FreeSync as free while making TV makers pay for it.
  • Despoiler - Thursday, June 21, 2018 - link

    Where does the article say anything about anyone paying for FreeSync? VRR was added to the spec, not FreeSync. The article says the method for HDMI 2.1 VRR is different from FreeSync's.
  • ckatech - Thursday, June 21, 2018 - link

    https://www.hdmi.org/manufacturer/hdmi_2_1/ is hdmi.org's information on this specification
