Is NVIDIA’s New RTX Real-Time Ray Tracing Technology Really a Game Changer?

Industry analyst Jon Peddie weighs in on the significance of NVIDIA’s recently announced Turing GPU architecture and AI-enhanced photorealistic rendering.

In the weeks since NVIDIA’s August 13 announcement of its new RTX real-time ray tracing platform, built on the Turing GPU architecture, and CEO Jensen Huang’s 90-minute SIGGRAPH keynote reveal two days later, much has been written about the new technology’s impact on various areas of photorealistic image production.

As is often the case with new CG technology, the sizzle and promise of each new product is tempered over time by practical realities: how quickly and fully other large technology companies embrace it in their own development plans, how well it integrates with existing creative content production pipelines, platforms and toolsets, and whether various end-user communities adopt it in appreciable numbers. Slick videos and boastful PowerPoint decks aside, new products succeed only when truly significant performance enhancements entice early adopters, and when acceptable pricing and upgrade paths eventually woo the mass market.

While it’s far too early to begin judging how RTX will fare in the markets NVIDIA hopes to dominate, this new GPU technology clearly has the potential to fundamentally change almost every area of visual development, where advances in the speed and quality of real-time rendering can significantly improve production efficiency and iterative decision-making, not to mention the “creative” process itself. Eliminating the need to wait minutes or even hours for renders would seem reason enough to make a change.

According to leading computer graphics industry analyst and market researcher, Dr. Jon Peddie, president of Jon Peddie Research, the new technology’s impact is significant. “Clearly, the real-time ray tracing announcements [at SIGGRAPH] are of great importance, spearheaded by NVIDIA and followed up by Microsoft with DXR [DirectX Ray Tracing], Chaos Group [Project Lavina], Pixar USD [Universal Scene Description] and others,” he notes. “This will continue to be a super-hot topic for the rest of the year.”

Peddie’s firm has for years been at the forefront of analyzing and assessing the technological and economic impact of computer graphics hardware and software products, producing quarterly and annual reports in areas such as GPUs, workstations, add-in boards, monitors and gaming hardware. In addition, JPR publishes the subscription-based TechWatch Report, a weekly publication edited by JPR VP Kathleen Maher that provides timely coverage of relevant trends and pertinent new information about the graphics industry.

Regarding RTX, Peddie goes on to explain, “NVIDIA’s proprietary RTX API exposes the special hardware they put into the new Turing chip, which will connect using the OptiX software. It’s pretty damn exciting and interesting. It’s a beautiful merger of processor technology with AI and the new GPU-based ray tracing software. Not only that, but they managed to pull it off 4-5 years earlier than I or anyone else thought they could. Including them!”
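For readers wondering what the RT Cores in Turing actually accelerate, the two core operations are bounding-volume-hierarchy traversal and ray-triangle intersection testing. The C++ sketch below is purely illustrative of the latter, using the classic Möller-Trumbore test; it is not NVIDIA’s implementation, and real applications reach the hardware through APIs such as OptiX or DXR rather than writing anything like this.

```cpp
// Illustrative only: a minimal CPU-side ray-triangle intersection test
// (Möller-Trumbore). Turing's RT Cores accelerate this class of operation,
// along with BVH traversal, in fixed-function hardware; APIs such as OptiX
// or DXR dispatch it for you, so application code never looks like this.
#include <array>
#include <cmath>
#include <cstdio>
#include <optional>

using Vec3 = std::array<double, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]};
}
static double dot(const Vec3& a, const Vec3& b) { return a[0]*b[0]+a[1]*b[1]+a[2]*b[2]; }

// Returns the distance t along the ray at which it hits the triangle, if any.
std::optional<double> intersect(const Vec3& orig, const Vec3& dir,
                                const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    const Vec3 p = cross(dir, e2);
    const double det = dot(e1, p);
    if (std::fabs(det) < 1e-12) return std::nullopt;   // ray parallel to triangle
    const double inv = 1.0 / det;
    const Vec3 tvec = sub(orig, v0);
    const double u = dot(tvec, p) * inv;               // first barycentric coordinate
    if (u < 0.0 || u > 1.0) return std::nullopt;
    const Vec3 q = cross(tvec, e1);
    const double v = dot(dir, q) * inv;                // second barycentric coordinate
    if (v < 0.0 || u + v > 1.0) return std::nullopt;
    const double t = dot(e2, q) * inv;                 // hit distance along the ray
    if (t < 0.0) return std::nullopt;                  // intersection is behind the origin
    return t;
}

int main() {
    // One ray fired straight down the z-axis at a unit triangle in the z=5 plane.
    auto hit = intersect({0.2, 0.2, 0.0}, {0.0, 0.0, 1.0},
                         {0.0, 0.0, 5.0}, {1.0, 0.0, 5.0}, {0.0, 1.0, 5.0});
    if (hit) std::printf("hit at t = %.2f\n", *hit);   // prints: hit at t = 5.00
}
```

A hardware-accelerated renderer performs billions of such tests per second against scenes with millions of triangles, which is why the acceleration structure and the intersection test moved into silicon.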

As for which segments of media and entertainment will be fastest to make use of RTX technology, Peddie says, “Movie studios have been using ray tracing for over 12 years, with a frame rate of maybe 2 frames per day. This will potentially accelerate and increase that productivity, subject to Blinn’s Law [as processors speed up, images get more complex, therefore render times stay the same]. The second segment to benefit is gaming. Games have about the same development and production times as movies, so we won’t see immediate results…maybe in two years. Also, at SIGGRAPH, a small Israeli company, Adshir, was showing ray tracing at 10 GigaRays in an AR app on a mobile device. And of course, when it comes to science and engineering, think of beautiful buildings and automobiles.”
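A back-of-envelope calculation puts those throughput figures in rough perspective. The snippet below assumes NVIDIA’s quoted figure of roughly 10 GigaRays per second for its top Turing parts and an arbitrary, illustrative budget of eight rays per pixel at 1080p and 60 fps; the per-pixel ray count is an assumption made for the sake of the example, not a figure from Peddie or NVIDIA.

```cpp
// Back-of-envelope only: how far a quoted "10 GigaRays per second" budget
// goes at real-time frame rates. The rays-per-pixel value is an assumed,
// illustrative number, not a figure from NVIDIA or the article.
#include <cstdio>

int main() {
    const double rays_per_second_budget = 10e9;  // NVIDIA's quoted Turing figure
    const double pixels = 1920.0 * 1080.0;       // one 1080p frame
    const double rays_per_pixel = 8.0;           // assumed: primary rays plus a few shadow/bounce rays
    const double fps = 60.0;

    const double rays_per_frame = pixels * rays_per_pixel;
    const double rays_needed = rays_per_frame * fps;
    std::printf("rays per frame:  %.1f million\n", rays_per_frame / 1e6);
    std::printf("rays per second: %.2f billion (budget: %.0f billion)\n",
                rays_needed / 1e9, rays_per_second_budget / 1e9);
    // ~16.6 million rays per frame and ~1 GigaRay/s at 60 fps: comfortably inside
    // the quoted budget, which is why hybrid renderers spend the remainder on
    // reflections, shadows and AI denoising rather than brute-force path tracing.
}
```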

As far as how the new NVIDIA technology impacts its competitors, Peddie adds, “AMD will react. They already have an AI accelerator in the works which they announced at Computex. It’s expected out later this year. AMD has their own ray tracing software too, so you can expect to see them making real-time ray tracing announcements in Q4. They will likely use Microsoft’s DXR because in general, AMD prefers to go open source. Intel, which is expected to introduce a discrete GPU in 2020, may pull that in and make a surprise announcement in 2019. But even if they do, it won’t have AI acceleration.” On Intel specifically, Peddie concludes, “They are in an awkward position of being a bit behind the curve with their product entry, which I would guess was part of NVIDIA’s motivation in this surprising announcement. Intel will probably be able to offer real-time ray tracing by 2021-2022 at the latest.”

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.