Word comes from NVIDIA this afternoon that they are rolling out a beta update to their GRID game streaming service. Starting today, the service is adding 1080p60 streaming to its existing 720p60 streaming option, with the option initially going out to members of the SHIELD HUB beta group.

Today’s announcement from NVIDIA comes as the company is ramping up for the launch of the SHIELD Android TV and its accompanying commercial GRID service. The new SHIELD console is scheduled to ship this month, while the commercialization of the GRID service is expected to take place in June, with the current free GRID service for existing SHIELD portable/tablet users listed as running through June 30th. Given NVIDIA’s ambitions to begin charging for the service, it was only a matter of time until the company began offering a 1080p option, especially as the SHIELD Android TV will be hooked up to much larger screens where the limits of 720p would be more easily noticed.

In any case, from a technical perspective NVIDIA has long had the tools necessary to support 1080p streaming – NVIDIA’s video cards already support 1080p60 streaming to SHIELD devices via GameStream – so the big news here is that NVIDIA has finally flipped the switch on their servers and clients. That said, given that 1080p has 2.25x as many pixels as 720p, I’m curious whether part of this process has involved NVIDIA adding some faster GRID K520 cards (GK104) to their server clusters, as the lower-end GRID K340 cards (GK107) don’t offer quite the throughput or VRAM one traditionally needs for 1080p at 60fps.
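As a quick sanity check on the resolution math, the 2.25x figure falls straight out of the pixel counts (a trivial sketch, nothing NVIDIA-specific):

```python
# Pixel counts per frame for 720p and 1080p at the same frame rate.
# Encoder throughput has to scale roughly with this ratio.
p720 = 1280 * 720      # 921,600 pixels per frame
p1080 = 1920 * 1080    # 2,073,600 pixels per frame

ratio = p1080 / p720
print(ratio)  # 2.25
```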

But the truly difficult part of this rollout is on the bandwidth side. With SHIELD 720p streaming already requiring 5-10Mbps of bandwidth and NVIDIA opting for quality over efficiency on the 1080p service, the client bandwidth requirements for the 1080p service are enormous. 1080p GRID will require a 30Mbps connection, with NVIDIA recommending users have a 50Mbps connection to keep other network devices from compromising the game stream. To put this in perspective, no major video streaming service hits 30Mbps, and in fact Blu-ray itself tops out at 48Mbps for audio + video combined. NVIDIA in turn needs to run at a fairly high bitrate to make up for the fact that they have to do all of this encoding in real-time with low latency (as opposed to highly optimized offline encoding), hence the significant bandwidth requirement. Meanwhile 50Mbps+ service in North America is still fairly rare – these requirements all but limit it to cable and fiber customers – so at least for now only a limited number of people will have the means to take advantage of the higher resolution.
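To put those bitrates in rough perspective, a back-of-the-envelope bits-per-pixel comparison is illustrative. Note the 40Mbps Blu-ray figure below is an assumed typical video-only bitrate for the comparison, not a published NVIDIA number, and none of this reflects NVIDIA's actual encoder settings:

```python
# Back-of-the-envelope bits-per-pixel calculation: bitrate divided by
# the number of pixels delivered per second at a given resolution/fps.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

grid_1080 = bits_per_pixel(30, 1920, 1080, 60)  # ~0.24 bits/pixel
grid_720 = bits_per_pixel(10, 1280, 720, 60)    # ~0.18 bits/pixel
bluray = bits_per_pixel(40, 1920, 1080, 24)     # ~0.80 bits/pixel (assumed 40Mbps video)
```

Despite the headline 30Mbps figure, the real-time GRID encoder actually has a far smaller per-pixel budget than a typical offline-encoded Blu-ray, which is why latency, not raw quality, is the binding constraint.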

NVIDIA GRID System Requirements
                       720p60                              1080p60
Minimum Bandwidth      10Mbps                              30Mbps
Recommended Bandwidth  N/A                                 50Mbps
Device                 Any SHIELD, Native or Console Mode  Any SHIELD, Console Mode only (no 1080p60 to the tablet's screen)

As for the games that support 1080p streaming, most, but not all, GRID games support it at this time. NVIDIA’s announcement says that 35 games support 1080p, out of a library of more than 50 games. Meanwhile I’m curious just what kind of graphics settings NVIDIA is using for some of these games. With NVIDIA’s top GRID card being the equivalent of an underclocked GTX 680, older games shouldn’t be an issue, but more cutting-edge games almost certainly require tradeoffs to maintain framerates near 60fps. So I don’t imagine NVIDIA is able to run every last game with all of its settings turned up to maximum.

Finally, NVIDIA’s press release also notes that the company has brought additional datacenters online, again presumably in anticipation of the commercial service launch. A Southwest US datacenter is now available, and a datacenter in Central Europe is said to be available later this month. This brings NVIDIA’s total datacenter count up to six: USA Northwest, USA Southwest, USA East Coast, Northern Europe, Central Europe, and Asia Pacific.

Source: NVIDIA

Comments

  • yannigr2 - Thursday, May 14, 2015 - link

    They make an exception for Intel because if they don't, PhysX and CUDA are dead. It's simple logic, nothing strange here. It's completely obvious. If AMD was controlling 80% of the CPU business, they could have introduced compatibility problems with Intel hardware, hoping to push Intel to bankruptcy and get the x86 license. It's just how Nvidia thinks and works. Nothing new or strange really.

    The Intel IGP proves that the PhysX lock is a deliberate move from Nvidia. But even without the Intel IGP example, the fact that a simple patch could enable PhysX with AMD primary GPUs and work just fine is, I think, proof enough. Why are we still talking about this? I was using Nvidia's 258 UNlocked driver a few years ago with an HD4890 and a 9600GT for PhysX and had no problems. That setup was really nice and I never understood why Nvidia didn't choose to push its lower-end cards as PhysX cards. Especially today with all those IGPs, it could make sense. The only explanation is arrogance. They cannot accept a setup where an Nvidia card is secondary to an AMD card. You know, a proprietary standard is not bad when it is accessible to everybody, even if you have to pay for an Nvidia card. But when you lock it the way Nvidia does, it is BAD. Really BAD.

    And while I am a fan of AMD, when I am posting something I base it on facts and logic. It's one thing to be a fan of a company and another to be a brainless fanboy, and I hate the second. Especially when I have to talk logic with brainless fanboys. I was a fan of Nvidia when they were not trying to lock the market under their own arrogance. I still own 3 (low end) Nvidia cards. I don't mind owning Nvidia cards, but I cannot support their business practices.
  • chizow - Thursday, May 14, 2015 - link

    So you were wrong to claim Nvidia does this unilaterally correct? By the same token, it is plausible Nvidia simply does not want to support multiple IHVs for their own proprietary solution, correct? Given you don't even use their products as your primary driver, what makes you feel like you can dictate terms of their usage?

    To anyone other than a brainless fanboy this all actually makes sense, but to you, obviously hacky workarounds and half-assed, half-broken products are acceptable, given you are an AMD fanboy, which necessarily means you are a fan of half-assed solutions.

    As I've said many times, I'm a fan of the best and Nvidia continually comes up with solutions that find new ways to improve and maximize my enjoyment of the products I buy from them. PhysX, G-Sync, and now GameStream are all examples of this. To any non-fanboy that wants these features, the solution is simple. Buy into the ecosystem or STFD.
  • yannigr2 - Thursday, May 14, 2015 - link

    You never fail to show how much of a fanboy you are. A small part of Nvidia's marketing department on the internet.
    Being wrong? How convenient. Dictate terms? Nvidia dictates terms in my own PC. I just protest about that. I didn't know that I don't get an opinion when using an Nvidia card as secondary. I guess arrogance is something that has passed from the company to its loyal fanboys.

    I don't have to comment on the last two paragraphs. As I said, you are just a small part of Nvidia's marketing department on the internet. And those two paragraphs show exactly that, and YOUR FANATICISM.
  • chizow - Thursday, May 14, 2015 - link

    Cool so you were wrong, just wanted to clarify that. And you chose poorly, so enjoy your decisions! You certainly deserve all AMD has to offer for as long as they offer it! :)
  • yannigr2 - Friday, May 15, 2015 - link

    Yes, I guess the only thing left for you to do is troll. But wait. Ah yes. Never forget in every phrase to advertise Nvidia. Typical.
    Thank you for showing us who you are.
  • chizow - Friday, May 15, 2015 - link

    Yes, I've once again cleared up egregious misinformation from a proven AMD fanboy; once again I've done the internet a solid.

    No one needs to read your misleading statements about Nvidia unilaterally disabling this feature when a competitor's part is detected; they are simply choosing what they want to support with their own proprietary solution, as they have every right to do.
  • yannigr2 - Saturday, May 16, 2015 - link

    You keep repeating yourself.

    We already know that your love for Nvidia is huge, that you will justify everything they do, and that you will attack at anyone not showing enough love for them.

    We already know that.
  • chizow - Saturday, May 16, 2015 - link

    Yes, it's important to repeat the fact that you, like any good AMD fanboy, continue to spread BS misinformation as if it were fact in an attempt to disparage Nvidia features that the products you love do not have.

    And talk about love for products being huge LOLOL from one of the last remaining AMD *CPU* fanboys on the planet, that is rich. That is deep-seated love that dies hard. :D
  • Laststop311 - Wednesday, May 13, 2015 - link

    If only this would work with all brands and not just SHIELD. Then it would be awesome. 30Mbit is a pretty darn high bitrate, but movies are only 24 fps, so 30Mbit isn't as insane as it sounds since it's pushing more than twice as many frames – the same per-frame budget as 12Mbit at 24 fps.
  • jamesbond2015 - Wednesday, May 13, 2015 - link

    Maybe the plan is to start rolling it out in some developed industrial countries (EU, South Korea, Japan) where 50Mbit is standard, and not in the USA?
