
  • Kjella - Monday, October 5, 2020 - link

    Looks like Nvidia just killed them all... the totally-not-Titan RTX 3090, the totally-not-Quadro A6000, the totally-not-Tesla compute cards... it's just Nvidia now. I'm okay with that; I don't feel the sub-brands brought that much to the table anyway.
  • deil - Monday, October 5, 2020 - link

    Maybe they just want to restart the naming. Everyone wants to do it at some point.
    It makes a lot of sense to have clear and easy-to-understand product tiers...
  • TheinsanegamerN - Monday, October 5, 2020 - link

    The naming did serve a purpose. Driver support for Quadros is different from Teslas, which is different from GeForce GPUs.

    Lumping them all together as "NVIDIA GPUs" is going to create more confusion.
  • michael2k - Monday, October 5, 2020 - link

    If you didn't know that driver support for Quadro is different from Tesla, which is different from GeForce GPUs, then there was already confusion.

    I suspect this also means NVIDIA will be unifying their driver support, because keeping three different release schedules for functionally the same product means three times the test/support resources. If there are key HW differences but they all use the same drivers, then you still focus testing on the different subsystems, but the releases get easier.

    In any case, the confusion isn't any different now than it was before; it always boiled down to 'Which NVIDIA GPU do I want?'
  • webdoctors - Monday, October 5, 2020 - link

    No, it was more than that. I imagine buying a Quadro gets you longer driver support? Maybe 2X? Maybe at-home tech support/HW replacement? It's gotta be more than just faster HW; enterprise stuff usually comes with a lot of other logistical upgrades too.
  • LauRoman - Monday, October 5, 2020 - link

    How does that change with a new name, though?
  • aamfk - Monday, October 5, 2020 - link

    You're right. I love Quadros because they support Windows Server as an operating system option. There isn't ANY other way to get a decent video card working on Windows Server; AMD/ATI has made it impossible from what I've seen (I haven't tested the FirePro).
  • Operandi - Monday, October 5, 2020 - link

    GeForce makes sense as a consumer-facing gaming brand. Quadro, on the other hand, never made much sense. What exactly is a "Quadro", and are there 4 of them? I don't get it.
  • Murloc - Monday, October 5, 2020 - link

    "Quadro" means painting, square-shaped, or scene in Italian. I don't think the word has any meaning in English.
  • Kjella - Monday, October 5, 2020 - link

    Not exactly "quadro", but you find something similar in words like quadruplets and the quadratic formula; four of a kind in poker is called quads; four-cheese pasta/pizza/sauce often goes by the Italian name (quattro formaggi); some car brands use quattro to indicate all-wheel (i.e. 4-wheel) drive; etc. So even to an English speaker it sounds like "four something". It's just completely unclear what that means for a graphics card.
  • LauRoman - Monday, October 5, 2020 - link

    Still Italian/Latin.
  • MrVibrato - Monday, October 5, 2020 - link

    Yeah. Trying to market professional products to a commercial customer base by individual product brands is probably not that effective. Procurement/purchasing managers rarely care about product brands as much as they care about reputation of and past business with the vendors/manufacturers they are dealing with.

    For wooing commercial customers, a solutions-based marketing approach makes more sense. You can see this by just visiting Nvidia's website...

    That said, I was and still am curious about the origin of Nvidia's Quadro branding. Why "Quadro"? 4-way SLI? 4 monitors per graphics card? 4 times as expensive?
  • Zagor Te Nay - Monday, October 5, 2020 - link

    GeForce makes sense as a consumer gaming card line only because we have come to associate the term with gaming cards.

    It's spelled differently, but it sounds like G-force... something that flying enthusiasts (oblivious to gaming) would associate with fighter and aerobatic pilots and airplanes.

    In the same way, Quadro is deeply associated with pro graphics, even if the term doesn't naturally belong to that vocabulary. I don't see an obvious reason to change that... curious to learn what motivated Nvidia.
  • Operandi - Tuesday, October 6, 2020 - link

    GeForce makes sense because it implies speed and high performance; it's great branding. WTF does Quadro mean in terms of professional graphics??? Oh yeah, right: nothing.
  • im.thatoneguy - Tuesday, October 6, 2020 - link

    Quadro in the pro graphics area usually means "If you like to spend way more money than you probably need to, by all means feel free to buy it for my workstation but I would rather take the cash."
  • jabber - Thursday, October 8, 2020 - link

    Or "Thanks I'll take that low end $400 Quadro for $40 on Ebay that you had to swap out of that HP/Dell workstation!

    Thanks very much!
  • euler007 - Friday, October 16, 2020 - link

    For me it meant Autodesk-supported drivers, tons of RAM, and four DP outputs.
  • edzieba - Monday, October 5, 2020 - link

    "Notably this generation, however, this means the A6000 is in a bit of an odd spot since DisplayPort 1.4 is slower than the HDMI 2.1 standard also supported by the GA102 GPU. I would expect that it’s possible for the card to drive an HDMI 2.1 display with a passive adapter, but this is going to be reliant on how NVIDIA has configured the card and if HDMI 2.1 signaling will tolerate such an adapter."

    One would expect the not-Quadros to work the same way the Ampere GeForce cards and past GeForce and Quadro cards do: the DP ports are DP++ ports, and are electrically both DP and HDMI (requiring only a passive adapter to change the port shape and pinout). There's no good reason to put anything other than physical DP ports on the rear of a card, other than regular consumers getting into a panic even if you include an adapter in the box.
  • repoman27 - Monday, October 5, 2020 - link

    The latest version of the Dual-mode DisplayPort (DP++) standard, which is what enables passive DP to HDMI adapters, only supports up to HDMI 1.4b. HDMI 2.1 will require an adapter with an active Level Shifter/Protocol Converter (LSPCon), as did HDMI 2.0b.

    In the NVIDIA Ampere architecture whitepaper for the GA102, the maximum supported resolutions for both the DisplayPort and HDMI interfaces are the same (8K @ 60Hz + HDR or 4K @ 240Hz + HDR) and require DSC anyway. So once there's an LSPCon that can output an HDMI 2.1 Fixed Rate Link signal with DSC 1.2a and HDCP 2.3, there should be no problem with fully supporting HDMI 2.1 via an active adapter.
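
    [A rough back-of-the-envelope, in Python, of why DSC is unavoidable at those resolutions. This is only a sketch: it assumes 10 bpc RGB (30 bpp) for the HDR modes, ignores blanking intervals, and uses the commonly quoted post-encoding payload rates for 4-lane HBR3 DisplayPort and 4-lane 12G HDMI 2.1 FRL; none of these figures come from NVIDIA's whitepaper itself.]

        # Uncompressed video bandwidth vs. link payload capacity (illustrative only).
        def video_gbps(h, v, hz, bpp=30):
            """Uncompressed pixel data rate in Gbit/s; blanking intervals ignored."""
            return h * v * hz * bpp / 1e9

        dp14_payload   = 4 * 8.1 * 8 / 10     # 4x HBR3 lanes, 8b/10b encoding    -> 25.92 Gbit/s
        hdmi21_payload = 4 * 12.0 * 16 / 18   # 4x 12G FRL lanes, 16b/18b encoding -> ~42.67 Gbit/s

        for name, (h, v, hz) in {"8K @ 60Hz": (7680, 4320, 60),
                                 "4K @ 240Hz": (3840, 2160, 240)}.items():
            need = video_gbps(h, v, hz)
            print(f"{name}: ~{need:.1f} Gbit/s uncompressed | "
                  f"fits DP 1.4a: {need <= dp14_payload} | fits HDMI 2.1: {need <= hdmi21_payload}")

    Both modes come out around 60 Gbit/s uncompressed, well past either link's payload capacity, which is why DSC is in the picture for both interfaces.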
  • Kevin G - Monday, October 5, 2020 - link

    Even active adapters will have a few issues. Thus far the HDMI 2.1 support of the Ampere line has been focused on a 40 Gbit/s link rate, and most 4320p displays seem to be circling around that data rate. The bad news is that DP 1.3/1.4 leverage 8b/10b encoding, which means a good chunk of their bandwidth is lost to overhead versus HDMI 2.1 FRL's more efficient 16b/18b encoding. The effect is that at 40 Gbit/s, HDMI 2.1 has roughly 37% more usable bandwidth than DP 1.3/1.4.

    It seems the full 48 Gbit/s link rate will be reserved for the true 8192 x 4320 DCI resolution used in cinema and a few other niche applications. At the full 48 Gbit/s link rate, HDMI 2.1 has roughly 65% more usable bandwidth.

    Really curious if we'll just see DSC passed through for 4320p60, or if an active adapter will shift from 3:1 to 2:1 compression going from DP 1.4 to HDMI 2.1.
  • repoman27 - Monday, October 5, 2020 - link

    Active adapters aren’t really passing anything through. They provide a DP 1.4a sink, decode then re-encode the stream, and output as HDMI. Anything that requires more than a 4-lane HBR3 link (25.92 Gbit/s) gets DSC, which effectively supports up to 3x that bitrate, or 77.76 Gbit/s.

    So if the adapter is HDMI 2.1 FRL capable, it can output bitrates up to 42.67 Gbit/s before needing to resort to DSC. However, GA102 doesn't support resolutions greater than what DP 1.4a + DSC can handle, so the fact that HDMI 2.1 can provide 64.6% more bandwidth is sort of a moot point in this case, and it will stay moot until we see a display that needs more than 77.76 Gbit/s over a single HDMI link.

    HDMI 2.1 FRL is 12 Gbit/s x 4 lanes with 16b/18b encoding. Unlike TMDS mode, in Fixed Rate Link (FRL) mode the link always runs at exactly that rate, with bit-stuffing to make up the balance, just like DisplayPort. Are there even any HDMI 2.1 FRL-capable displays on the market yet?
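
    [For anyone who wants to check the arithmetic behind the figures above, a quick Python sketch; the lane rates and encoding ratios are the published DP HBR3 and HDMI 2.1 FRL values, and 3:1 is just the maximum DSC compression ratio used for this comparison.]

        # Where the 25.92, 42.67, 64.6% and 77.76 Gbit/s figures come from.
        dp_hbr3_net = 4 * 8.1 * 8 / 10      # 4 lanes x 8.1 Gbit/s, 8b/10b encoding   -> 25.92 Gbit/s payload
        frl_net     = 4 * 12.0 * 16 / 18    # 4 lanes x 12 Gbit/s, 16b/18b encoding -> ~42.67 Gbit/s payload

        print(f"DP 1.4a (HBR3 x4) payload:     {dp_hbr3_net:.2f} Gbit/s")
        print(f"HDMI 2.1 (FRL 12G x4) payload: {frl_net:.2f} Gbit/s "
              f"(+{(frl_net / dp_hbr3_net - 1) * 100:.1f}% over DP)")
        # DSC at up to 3:1 lets a DP 1.4a link carry the equivalent of:
        print(f"DP 1.4a + DSC 3:1:             {dp_hbr3_net * 3:.2f} Gbit/s of uncompressed video")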
  • Otritus - Monday, October 5, 2020 - link

    The RTX 3090 has a TDP of 350W, so the A6000 and A40 are consuming 50 watts less power, not 20.
  • Spunjji - Monday, October 5, 2020 - link

    Yup - they've mixed it up with the 3080 TDP.
  • roghjo - Monday, October 5, 2020 - link

    The article doesn't list it here, but the clock speed of the Ampere A6000 is 1,860MHz, with a CLim of 2,100MHz and an MLim of 4,000MHz. This information comes from an A6000 that somebody benchmarked on UserBenchmark: https://gpu.userbenchmark.com/SpeedTest/1300600/NV...
  • Twister292 - Tuesday, October 6, 2020 - link

    I don't get what nVidia are doing with these... at least with the previous lineup, we knew which driver optimisations were enabled. GeForce = half-rate Tensor throughput, no CAD/workstation optimisations enabled; Titan = full-rate Tensor throughput and some workstation optimisations enabled; Quadro = all workstation optimisations enabled, plus ECC memory.

    The current "Titan-class" GPU (3090) does not have any workstation optimisations enabled and performs worse than the Titan RTX in CATIA etc. nVidia's only answer to that is that it's optimised for "content creators"... basically it seems they want workstation users to move up to the A-series (Quadro).
  • a5cent - Tuesday, October 6, 2020 - link

    SR-IOV support???
  • colonelclaw - Tuesday, October 6, 2020 - link

    I wonder if Nvidia will be enabling professional 3D application support on GeForce cards, hence the dropping of Quadro? Before you laugh off this suggestion, consider that hell froze over last year when Nvidia went halfway there by enabling support for professional content-creation apps; it's not beyond the realm of possibility.
    Before you say 'everyone will just buy a 3080/3070', I would point out that 10GB of memory is definitely on the low side for pro 3D work.
  • im.thatoneguy - Tuesday, October 6, 2020 - link

    Yeah, the Quadro optimizations are only relevant to a small fraction of pro users. If you aren't doing live video, you have no need for timecode genlock. If you aren't designing cars, you don't need esoteric NURBS features.

    For each person at Ford designing a car, there are 10,000 professionals poly-modeling game and VFX content. Hell also froze over when Nvidia released Remote Desktop OpenGL support earlier in the pandemic (although Parsec and PCoIP have largely removed that use case). There is also evidence that Microsoft is going to integrate GPU virtualization into Windows 10; they already do it on the Xbox. That would eliminate one other narrow use case for Quadro.

    Quadro is such a niche within a niche that almost everybody who buys one is completely wasting their money. A Titan/3090-class product plus the Creator Ready drivers are more than enough for 95% of all workstations. The last real differentiator is 48GB of memory per card. But even then, for every VFX artist needing 128GB of GPU memory, there are 1,000 Cinema 4D or KeyShot users who use maybe 1GB of memory for a single model or motion graphics.
  • Vitor - Tuesday, October 6, 2020 - link

    Imagine those beasts on 3nm and with HBM2e. In a couple of years, I hope.
