
Google's Stadia will Utilise Custom AMD GPUs and Radeon Developer Tools

Will this push more developers towards Vulkan and AMD's Radeon development tools?


Google's Stadia gaming service will utilise custom AMD/Radeon graphics hardware, with every instance using a GPU that is capable of delivering more performance than an Xbox One X and a PS4 Pro combined, promising gamers stellar performance in the cloud. 

AMD has since confirmed to us that Google will utilise AMD's Linux drivers and their Radeon GPU Profiler to ensure that their graphics hardware is optimally utilised and runs at peak efficiency. The low-level Vulkan API will also act as a large part of Google's efforts, an API which finds its origins in AMD's Mantle API, which was later donated to the Khronos Group to act as the baseline for Vulkan. 

We know for a fact that a custom Radeon datacenter graphics card will be used to create Google's Stadia streaming platform, which will run primarily using Vulkan and Linux. At this time it is unknown who manufactures the processors that are to be used within Google's gaming servers, though the use of the term "hyperthreaded" implies that an Intel product will be used. 

On the graphics side, Stadia will use a Radeon graphics card with 56 compute units that offers 10.7 TFLOPS of graphics horsepower, specifications that are similar to AMD's Radeon RX Vega 56 GPU, especially after the 484GB/s memory bandwidth specification is accounted for. That said, the performance that Google mentions would require a Vega GPU with 56 compute units to run at a clock speed of roughly 1493MHz, which is higher than the RX Vega 56's 1156MHz base clock and 1471MHz boost clock. 
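For readers who want to check the arithmetic, the clock speed above falls out of the standard GCN throughput formula: each compute unit contains 64 stream processors, and each stream processor performs 2 FP32 operations per clock (a fused multiply-add). A quick sketch, assuming those GCN figures:

```python
# Peak FP32 throughput for a GCN-style GPU: CUs x 64 SPs x 2 FLOPs/clock.
def gcn_tflops(compute_units: int, clock_mhz: float) -> float:
    """Peak FP32 TFLOPS for a given CU count and clock speed."""
    return compute_units * 64 * 2 * clock_mhz * 1e6 / 1e12

def clock_for_tflops(compute_units: int, tflops: float) -> float:
    """Clock speed (MHz) needed to hit a TFLOPS target with a given CU count."""
    return tflops * 1e12 / (compute_units * 64 * 2) / 1e6

# Google's stated Stadia figures: 56 CUs delivering 10.7 TFLOPS.
print(f"Implied clock: {clock_for_tflops(56, 10.7):.0f} MHz")        # ~1493 MHz

# For comparison, an RX Vega 56 running at its 1471MHz boost clock.
print(f"RX Vega 56 at boost: {gcn_tflops(56, 1471):.2f} TFLOPS")      # ~10.54 TFLOPS
```

This is why the article notes the Stadia GPU would need to clock slightly above a stock RX Vega 56's rated boost to hit Google's quoted figure.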

At this time it is unknown what graphics card Google plans to base their Stadia platform on, though 7nm Radeon GPUs seem likely. 14nm GPUs that are in effect overclocked RX Vega 56 cards would seem overly power hungry for such a large enterprise, making 7nm a great choice for Google and AMD alike. AMD's Vega 20 silicon, which is used to make the Radeon VII, uses four HBM2 memory stacks, offering memory bandwidth levels that are well in excess of what Google's Stadia specifications would indicate. 

Google's Stadia platform product lead, Dov Zimring, has stated that Google has been working with AMD "for years" on this project, making it possible that the company is using a fully custom graphics card to create its cloud gaming systems. That said, Google is likely to upgrade its gaming hardware over time, especially given its plans to offer 8K game streaming in the future. Below is a statement from Google's Dov Zimring. 
 

     We’ve worked closely with AMD for years on this project, leading to the development of a custom GPU with leading-edge features and performance for Google Stadia.

     Google and AMD share a commitment to open-source with expertise in Vulkan, open-source Vulkan GPU drivers, and open-source graphics optimization tools. We’re humbled by the spirit of innovation and collaboration that exists throughout the gaming industry and look forward to pioneering the future of graphics technology with game developers, in open-source.

As part of their Stadia announcement, Google confirmed that its Stadia gaming platform could support multiple Radeon graphics cards, enabling increased game performance and better in-game visuals. In theory, Google's Stadia ecosystem could push more game developers into creating games with Vulkan support, multi-GPU support and Linux support, though this might be a little overoptimistic.  

Stadia is a clear design-win for AMD, who was chosen over Nvidia to power Google's online gaming platform. Stadia will go live sometime in 2019, and at this time it is unknown whether or not Stadia will be based on AMD's existing Vega architecture or on something newer. 

You can join the discussion on Google's plans to use AMD's graphics hardware and developer tools for Stadia development on the OC3D Forums


Most Recent Comments

19-03-2019, 18:55:52

NeverBackDown
It's cool and all, but it'll most likely act as a games-as-a-service type deal, which is a no-go for me really. As soon as I stop paying I lose everything.

20-03-2019, 05:25:48

ET3D
The memory description makes it seem like it's an APU backed by 16GB of HBM2 RAM for both CPU and GPU. I think this makes it more likely to be a single AMD chip than an AMD GPU + Intel CPU. (Although of course we do have the hybrid Intel CPU with AMD graphics, so that's also possible.)

There was a rumour in the past about a console APU with Zen 2 and Navi graphics, and 56 CUs were mentioned. Possibly that rumoured APU, which was thought to go into the PS5 or next Xbox, is actually one created for Google.

The 9.5MB cache figure is strange, though. It's consistent, for example, with a Zen+ CCX with one core disabled. But that doesn't make much sense as a base architecture.

20-03-2019, 05:44:21

tgrech
Personally I think this platform and tech seem far too mature and developed so far for it to be 7nm based in its current incarnation, though it is possible. It could just as easily be essentially an RX Vega 56 with a 20MHz increase on the boost clock (possibly a 56X, similar to the recent 64X); that gives you your TFLOPS figure against the 10.5 TFLOPS often quoted for Vega 56 at normal boost. I very much doubt all that RAM is HBM2, or they likely would have said it. Given how they split up mentioning VRAM as HBM2 and system memory as 16GB, I expect that's an 8+8GB config, especially because HBM2 offers little to no benefit for CPUs at the moment while still significantly increasing cost and reducing yields. The cache sizes also indicate it's a quad-core CPU, as all of Intel's and AMD's higher core count parts ship with more cache, so fitting HBM2 to that would be a little pointless.

Let's not forget, AMD are still releasing and manufacturing new 14nm Vega 1 SKUs; Vega 48 and Vega 64X recently appeared in Apple MacBooks/iMacs, so maybe they have bucketloads of dies left over from the crypto boom.

20-03-2019, 06:31:28

ET3D
It's quite likely that any testing to this point was done with currently available hardware. That's normal for any development phase. However, I find it hard to believe that Google will use a Vega 56 in its data centre simply because it's very power hungry. It's possible that Vega 7nm is used, and that would certainly be consistent with lower power at Vega 56 speeds, but it still makes more sense to me that AMD will produce a console-style chip for Google. The up front cost is higher (but something that console makers pay for anyway), but in the long term it's likely to save money.

Apart from power, there's also space, which is also important for data centres. An APU with HBM2 fits everything on the chip, and doesn't require external RAM chips.

20-03-2019, 06:53:37

tgrech
I think they'll definitely switch to 7nm as soon as it's viable, but there's no way they're cramming a Vega 20 equivalent chip onto an APU atm. Even on 7nm, Vega 20 is still in the ~200W range, and even if this is downclocked and has more cores disabled than the VII, it's a very, very long way off from fitting in the thermal design limits of an APU package. The physical density limits of a modern server are due to the heat dissipation equipment rather than the physical dies themselves; the only reason to put two very hot chips closer together is if you think the reduced latency of their connection will improve performance. Besides that, an APU will always be a big step back in density and performance atm because of the much more capable cooling required.

Obviously, this can't actually be Vega 20 because the memory configuration doesn't match up at all. If it's 7nm Vega, it'd have to be a more or less completely custom chip (different bus width, different HBM2 clocks, and if 16GB then it'd need 8-hi stacks to boot, as there can only be two HBM chips with that bandwidth), which they haven't indicated is the case. Personally it seems much more likely to me that the chip which already has mostly identical specs is the chip they're using for this product they're widely demoing, rather than a hypothetical chip which would require some really weird configurations to match up with the stated specs while offering little benefit over existing products for such a small-scale initial roll-out.