
‘Avengers: Endgame’ and ‘The Irishman’: AI-Powered Visual Effects Come of Age

The VFX studios behind both films, each an Oscar nominee for Best Visual Effects, successfully used artificial intelligence in GPU-accelerated effects: ILM for character de-aging and Digital Domain for digital humans.

At Sunday’s 92nd Annual Academy Awards, two films nominated for Best Visual Effects boast new state-of-the-art technological advances central to their films’ success: GPU-accelerated AI. For Martin Scorsese’s The Irishman, it was used by Industrial Light & Magic to de-age actors Robert De Niro, Al Pacino and Joe Pesci. For the Russo brothers’ Avengers: Endgame, it was used by Digital Domain to create the fully digital villain, Thanos. Both studios pushed into new AI-enhanced storytelling territory through the use of NVIDIA Quadro RTX GPUs to accelerate production.

AI Time Machine

From the battlefields of World War II to a nursing home in the 2000s, and in every decade in between, Netflix’s The Irishman tells the tale of hitman Frank Sheeran through scenes chronicling various periods in his life. But with all three lead actors - Robert De Niro, Al Pacino and Joe Pesci - in their 70s, no makeup department could realistically transform them back to their 20s and 30s. And Scorsese was against using typical motion capture markers or other intrusive equipment that gets in the way of raw performances during filming.

To meet this daunting challenge, ILM developed a new three-camera rig to capture the actors’ performances on set — using the director's camera flanked by two infrared cameras to record 3D geometry and textures. The team also developed software called ILM Facefinder that used AI to sift through thousands of images of the actors’ past performances. The tool located frames that matched the camera angle, framing, lighting and expression of the scene being rendered, giving ILM artists a relevant reference to compare against every frame in the shot. These visual references were used to refine digital doubles created for each actor, so they could be transformed into the target age for each specific scene in the film.
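The core idea of a reference-finding tool like Facefinder can be thought of as a nearest-neighbor search: describe each archived frame with a feature vector and return the one closest to the frame being rendered. The sketch below is a heavily simplified, hypothetical illustration; the feature values, film titles and the `find_reference` helper are invented, and the real tool's features and matching model are far richer.

```python
import math

# Hypothetical sketch of a Facefinder-style lookup: each archived frame is
# summarized by a made-up feature vector (e.g., camera angle, lighting,
# expression), and the closest archive frame to the target is returned.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def find_reference(target, archive):
    """Return the archive entry whose features best match the target frame."""
    return min(archive, key=lambda entry: distance(entry["features"], target))

archive = [
    {"film": "Goodfellas (1990)", "features": [0.2, 0.8, 0.1]},
    {"film": "Casino (1995)",     "features": [0.6, 0.4, 0.9]},
    {"film": "Heat (1995)",       "features": [0.5, 0.5, 0.5]},
]

best = find_reference([0.55, 0.45, 0.6], archive)
print(best["film"])  # prints the closest match: Heat (1995)
```

In production, the features would come from learned face-analysis models and the archive would span thousands of frames per actor, but the retrieval step itself is conceptually this simple.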

“AI and machine learning are becoming a part of everything we do in VFX,” said Pablo Helman, VFX supervisor on The Irishman. “Paired with the NVIDIA Quadro RTX GPUs powering our production pipeline, these technologies have us excited for what the next decade will bring.”

Building Better VFX Villains

The highest-grossing film of all time, Marvel’s Avengers: Endgame was a massive undertaking that eventually surpassed 2,500 visual effects shots. VFX teams at Digital Domain used machine learning to animate actor Josh Brolin’s performance onto the digital version of the film franchise’s villain, the mighty Thanos.

A machine learning system called Masquerade was developed to take low resolution scans of the actor’s performance and facial movements, and then accurately transfer his expressions onto the high-resolution mesh of Thanos’ face. The technology saved time for VFX artists who would otherwise have had to painstakingly animate the subtle facial movements manually to generate a realistic, emoting digital human.
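At its core, this kind of system learns a mapping from a sparse, low-resolution capture to dense, high-resolution mesh motion, using example pairs where artists have produced the high-resolution result. The toy below is a minimal sketch of that idea only, using plain least-squares on synthetic data; Masquerade's actual model, data sizes and training pipeline are not public here, so everything in the example is an assumption for illustration.

```python
import numpy as np

# Toy illustration of learning a low-res -> high-res facial mapping from
# example pairs, then applying it to a new capture. Sizes and the linear
# model are invented; a real system would use a far richer learned model.

rng = np.random.default_rng(0)

n_low, n_high = 10, 300              # tracked points vs. mesh vertices (toy)
true_map = rng.normal(size=(n_low, n_high))

# Training pairs: low-res captures and their corresponding high-res meshes.
low_train = rng.normal(size=(50, n_low))
high_train = low_train @ true_map

# Fit the mapping by least squares on the example pairs.
learned_map, *_ = np.linalg.lstsq(low_train, high_train, rcond=None)

# Transfer a new on-set capture onto the high-resolution mesh.
new_capture = rng.normal(size=(1, n_low))
predicted_mesh = new_capture @ learned_map

print(np.allclose(predicted_mesh, new_capture @ true_map, atol=1e-6))
```

The payoff mirrored in the article is the same: once the mapping is learned, each new captured frame drives the dense mesh automatically instead of being animated by hand.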

According to Darren Hendler, head of digital humans at Digital Domain, “Key to this process were immediate realistic rendered previews of the characters’ emotional performances, which was made possible using NVIDIA GPU technology. We now use NVIDIA RTX technology to drive all of our real-time ray-traced digital human projects.”

RTX It in Post: Studios, Apps Adopt AI-Accelerated VFX

ILM and Digital Domain are just two of a growing set of visual effects studios and apps adopting AI tools accelerated by NVIDIA RTX GPUs.

In HBO’s The Righteous Gemstones series, lead actor John Goodman looks 30 years younger than he is. This de-aging effect was achieved with Shapeshifter, a custom software that uses AI to analyze face motion — how the skin stretches and moves over muscle and bone.

VFX studio Gradient Effects used Shapeshifter to transform the actor’s face in a process that, using NVIDIA GPUs, took weeks instead of months.

Companies such as Adobe, Autodesk and Blackmagic Design have developed RTX-accelerated apps to tackle other visual effects challenges with AI, including live-action scene depth reclamation; color adjustment, relighting and retouching; speed warp motion estimation for retiming; and upscaling.

Netflix Greenlights AI-Powered Predictions

Offscreen, streaming services such as Netflix use AI-powered recommendation engines to provide customers with personalized content based on their viewing history, or a similarity index that serves up content watched by people with similar viewing habits.
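The "similarity index" idea can be sketched in a few lines: measure how much two users' viewing histories overlap, then suggest titles watched by the most similar user. This is a minimal, hypothetical example using Jaccard similarity on sets; the user names and titles are invented, and real recommendation engines use much larger models trained on GPUs.

```python
# Minimal sketch of similarity-based recommendation: find the user whose
# viewing history overlaps most with yours (Jaccard similarity), then
# suggest what they watched that you haven't. All data here is invented.

def jaccard(a, b):
    """Overlap of two sets as a fraction of their union."""
    return len(a & b) / len(a | b)

def recommend(user, histories):
    watched = histories[user]
    others = [(jaccard(watched, h), u) for u, h in histories.items() if u != user]
    _, nearest = max(others)                 # most similar other user
    return sorted(histories[nearest] - watched)

histories = {
    "ana":  {"The Irishman", "Heat", "Casino"},
    "ben":  {"The Irishman", "Casino", "Goodfellas"},
    "carl": {"Endgame", "Infinity War"},
}

print(recommend("ana", histories))  # → ['Goodfellas']
```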

Netflix also customizes movie thumbnails to appeal to individual users, and uses AI to help optimize streaming quality at lower bandwidths. The company uses NVIDIA GPUs to accelerate its work with complex data models, enabling rapid iteration.

Rolling Out the Red Carpet at GTC 2020

Top studios including Lucasfilm’s ILMxLAB, Magnopus and Digital Domain will be speaking at NVIDIA’s GPU Technology Conference in San Jose, March 23-26. Check out the lineup of media and entertainment talks and register to attend. Early pricing ends Feb. 13.

Source: NVIDIA


Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.