
Nvidia Turing uses RTX and DLAA tech to deliver a 6x performance boost over Pascal

Impressive work, but we won't see these gains in games


At their SIGGRAPH 2018 keynote, Nvidia revealed their Turing graphics architecture, which the company calls their biggest technological leap since the introduction of CUDA in 2006.

Earlier this year, Nvidia showcased a Star Wars short called "Reflections", which required four GV100 graphics cards to run in real-time. The demo used Unreal Engine 4 alongside actual Star Wars assets from Industrial Light and Magic to showcase the benefits of real-time ray tracing with Microsoft's DXR (DirectX Ray-Tracing) API. Today, Nvidia can run this demo using a single Turing-powered Quadro RTX graphics card. 

Nvidia claims that their Turing architecture can offer up to a 6x performance increase over Pascal, dropping frametimes from 308ms to 45ms. This performance boost comes from three areas: the increased shader performance of Nvidia's latest graphics architecture, dedicated RT cores that accelerate ray tracing, and DLAA (Deep Learning Anti-Aliasing) technology.
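As a quick sanity check, the quoted frametimes imply roughly a 6.8x speedup, broadly in line with the headline figure; a minimal Python sketch of the arithmetic:

```python
# Speedup implied by the quoted frametimes: 308 ms on Pascal vs 45 ms on Turing.
pascal_frametime_ms = 308.0
turing_frametime_ms = 45.0

speedup = pascal_frametime_ms / turing_frametime_ms
print(f"Overall speedup: {speedup:.1f}x")  # ~6.8x, consistent with the "up to 6x" claim
```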

The first source of additional performance is simple. Nvidia's most powerful Pascal-based professional graphics card, the Quadro P6000, offers 12 TFLOPS of FP32 compute performance, while Nvidia's new Quadro RTX 8000 delivers up to 16 TFLOPS, giving Nvidia's latest Quadro RTX a 33% boost in raw shader throughput over the Pascal-based Quadro P6000.
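The 33% figure follows directly from the quoted TFLOPS numbers:

```python
# FP32 uplift implied by the quoted figures: Quadro P6000 vs Quadro RTX 8000.
p6000_tflops = 12.0
rtx8000_tflops = 16.0

uplift_percent = (rtx8000_tflops / p6000_tflops - 1.0) * 100.0
print(f"Shader throughput uplift: {uplift_percent:.0f}%")  # ~33%
```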

The second gain comes from Ray Tracing Acceleration, which is not available on Pascal graphics cards, allowing Nvidia's Quadro RTX graphics card to complete the demo's ray tracing workloads several times faster than any Pascal-based setup. Assuming that Nvidia's bar chart is to scale, the Ray Tracing section of this workload is completed almost 11x faster than before, accounting for the bulk of Turing's performance gains.
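To illustrate how a large speedup on one stage of a frame feeds into the overall figure, the Amdahl's-law style combination below is a rough sketch. The 80/20 split and the 2.5x speedup for the non-ray-traced work are hypothetical placeholders, not figures published by Nvidia:

```python
# Illustrative only: how a big speedup on the ray-traced portion of a frame
# combines with a smaller speedup on everything else. The fraction and the
# non-RT speedup used below are hypothetical placeholders, not Nvidia's figures.
def overall_speedup(rt_fraction, rt_speedup, other_speedup):
    """Amdahl's-law style combination of per-stage speedups."""
    new_time = rt_fraction / rt_speedup + (1.0 - rt_fraction) / other_speedup
    return 1.0 / new_time

# If ~80% of the Pascal frame were ray tracing sped up 11x, and the remaining
# 20% were sped up 2.5x by faster shaders and DLAA, the whole frame would be:
print(f"{overall_speedup(rt_fraction=0.8, rt_speedup=11.0, other_speedup=2.5):.1f}x")  # ~6.5x
```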

  


The final part of Nvidia's performance equation is "Nvidia DLAA", or Deep Learning Anti-Aliasing, at once one of the most impressive and one of the fishiest aspects of Nvidia's demo.

Why do I say fishy? Well, Nvidia's DLAA tech allows them to "render at a slightly lower resolution" (Jensen Huang) while using an AI-driven algorithm to gain back any lost detail and generate a higher-detail final image. Nvidia's Tensor processing cores enable this feature.
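As a rough illustration of the render-lower-then-reconstruct idea, the sketch below renders a toy image at 75% of the target resolution and upscales it back to display size. The plain bilinear resize is only a stand-in for the learned, Tensor-Core-accelerated reconstruction; it is not Nvidia's actual algorithm:

```python
import numpy as np

def render_scene(width, height):
    """Hypothetical stand-in for a renderer: returns a simple RGB gradient image."""
    y, x = np.mgrid[0:height, 0:width]
    return np.stack([x / width, y / height, np.zeros_like(x, dtype=float)], axis=-1)

def bilinear_resize(img, new_w, new_h):
    """Plain bilinear upscale, standing in for the AI reconstruction step."""
    h, w, _ = img.shape
    ys, xs = np.linspace(0, h - 1, new_h), np.linspace(0, w - 1, new_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None, None], (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy

# Render at ~75% of the target resolution, then reconstruct the display-sized frame.
target_w, target_h = 1920, 1080
low_res_frame = render_scene(int(target_w * 0.75), int(target_h * 0.75))
final_frame = bilinear_resize(low_res_frame, target_w, target_h)
print(final_frame.shape)  # (1080, 1920, 3)
```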

It could be argued that Nvidia is cheating here, though assuming that the final image is of the same quality as, or better than, what the Pascal system produces, Nvidia has a major win on their hands. The introduction of DLAA is why the Rasterisation and Shading sections of Nvidia's bar chart are shorter.

With DLAA, Nvidia has showcased the potential of AI to accelerate the production of high-quality images and video, though at this time it remains unknown how applicable this technology is to gaming. In theory, this could be Nvidia's answer to the checkerboard rendering techniques that consoles like the PS4 Pro use to offer "faux-K" resolutions in modern console titles. The AI-driven nature of DLAA also has the potential to create less visual noise than today's console checkerboarding technology.
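For contrast, here is a toy sketch of the checkerboard idea: only half of the pixels are rendered each frame, and the missing half is filled from rendered neighbours. This is a deliberately naive illustration, not any console's actual reconstruction pipeline, which also leans on temporal data and motion vectors:

```python
import numpy as np

def checkerboard_mask(height, width, frame_index):
    """True where a pixel is actually rendered this frame; the pattern alternates per frame."""
    y, x = np.mgrid[0:height, 0:width]
    return (x + y + frame_index) % 2 == 0

def naive_reconstruct(rendered, mask):
    """Fill unrendered pixels with the mean of their rendered 4-neighbours."""
    padded = np.pad(rendered, 1, mode="edge")
    pmask = np.pad(mask.astype(float), 1, mode="edge")
    neighbour_sum = (padded[:-2, 1:-1] * pmask[:-2, 1:-1] + padded[2:, 1:-1] * pmask[2:, 1:-1] +
                     padded[1:-1, :-2] * pmask[1:-1, :-2] + padded[1:-1, 2:] * pmask[1:-1, 2:])
    neighbour_count = (pmask[:-2, 1:-1] + pmask[2:, 1:-1] + pmask[1:-1, :-2] + pmask[1:-1, 2:])
    return np.where(mask, rendered, neighbour_sum / np.maximum(neighbour_count, 1.0))

height, width = 8, 8
full_frame = np.random.rand(height, width)              # stand-in for a fully rendered frame
mask = checkerboard_mask(height, width, frame_index=0)
rendered = np.where(mask, full_frame, 0.0)              # only half the pixels were rendered
reconstructed = naive_reconstruct(rendered, mask)
print(f"Mean reconstruction error: {np.abs(reconstructed - full_frame).mean():.4f}")
```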

 

    AI Accelerated by Powerful Tensor Cores

The Turing architecture also features Tensor Cores, processors that accelerate deep learning training and inferencing, providing up to 500 trillion tensor operations a second.

This level of performance powers AI-enhanced features for creating applications with powerful new capabilities. These include DLAA — deep learning anti-aliasing, which is a breakthrough in high-quality motion image generation — denoising, resolution scaling and video re-timing.

These features are part of the NVIDIA NGX™ software development kit, a new deep learning-powered technology stack that enables developers to easily integrate accelerated, enhanced graphics, photo imaging and video processing into applications with pre-trained networks.
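For context on the "500 trillion tensor operations a second" figure, the building block a Tensor Core accelerates is, as Nvidia has described it, a small fused matrix multiply-accumulate on 4x4 tiles, with half-precision inputs and a higher-precision accumulator. A minimal NumPy sketch of a single such operation (the hardware simply performs enormous numbers of these per clock):

```python
import numpy as np

# One tensor-core style operation: D = A @ B + C on a small tile, with
# half-precision inputs and an FP32 accumulator. This sketch only shows the
# arithmetic, not how the hardware actually schedules or fuses it.
A = np.random.rand(4, 4).astype(np.float16)   # half-precision input tile
B = np.random.rand(4, 4).astype(np.float16)   # half-precision input tile
C = np.random.rand(4, 4).astype(np.float32)   # accumulator kept at FP32

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D.shape)  # (4, 4)
```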

 


DLAA showcases the potential benefits of AI in future rendering and gaming applications, creating smart algorithms that could deliver most of the graphical detail we desire with significantly reduced amounts of compute resources. 

You can join the discussion on Nvidia's RT cores and DLAA technology on the OC3D Forums.


Most Recent Comments

14-08-2018, 14:04:34

NeverBackDown
Lowering quality automatically and then using math to make it seem identical in IQ? Good ol' Nvidia. Never change.

14-08-2018, 14:33:01

Gothmoth
From my rendering point of view, using V-Ray, I love RTX.

But this 6x faster claim over Pascal, without any hard numbers, reeks of the typical Nvidia BS.

I don't like to sound like a broken record, as I've said it a few times: imo, ray tracing effects will be pushed into some new games by Nvidia but will only become relevant for games in two years.

Meanwhile a few AAA titles will show some RT effects and slow down GTX 1080 and AMD cards, so some people will pay the price for expensive RTX cards.

As long as there is not a big userbase with GPUs that have ray tracing hardware, game companies will not rush to implement ray tracing into their games.

15-08-2018, 09:07:34

Peace
Quote:
Originally Posted by NeverBackDown
Lowering quality automatically and then using math to make it seem identical in IQ? Good ol' Nvidia. Never change.
If it seems or even IS identical, then I don't see the problem, at all. If it runs better while consuming less power, but looks exactly the same, then this is a huge win, imho.

15-08-2018, 12:36:15

NeverBackDown
Quote:
Originally Posted by Peace
If it seems or even IS identical, then I don't see the problem, at all. If it runs better while consuming less power, but looks exactly the same, then this is a huge win, imho.
It better be nothing less than identical.

15-08-2018, 15:59:08

TheF34RChannel
Quote:
Originally Posted by NeverBackDown
It better be nothing less than identical.
It must be, else there's no real sales pitch. Few more days, eh?
