Heading into Oscar Sunday on February 9th, catch up on AWN’s coverage of the films and creative teams nominated for this year’s Best Visual Effects.
If there’s a common element among the five Oscar nominees for Best Visual Effects, it would be the filmmakers’ desire to push the boundaries of practical and digital effects, whether by using virtual reality to compose the shots of The Lion King; de-aging Robert De Niro, Joe Pesci and Al Pacino over various time periods in The Irishman; producing fully CG characters such as Thanos and Smart Hulk in Avengers: Endgame; creating the illusion of one continuous shot in 1917; or modifying unused footage of the late Carrie Fisher so she could appear in Star Wars: Episode IX - The Rise of Skywalker. If the VES Awards are any indication, then it will be a battle between the regal Simba and hired gun Frank Sheeran for the privilege of appearing onstage at the 92nd Academy Awards. But then again, Thanos might snap his fingers to alter the outcome! In the meantime, here are summaries of the five worthy contenders.
1917
Visual effects production led by Guillaume Rocheron, Greg Butler and Dominic Tuohy.
In an effort to immerse audiences in a World War I story inspired by his war veteran grandfather, filmmaker Sam Mendes decided to structure the narrative as a single-shot cinematic experience. Even with all the meticulous planning, digital augmentation was needed to stitch different takes together. “Some of the sections that we stitched together were like five to seven minutes long and others were a couple of seconds long,” production VFX supervisor Guillaume Rocheron explains. “One thing I can say is, in the end we touched 91% of the frames of the film. If you look at the crossing of No Man's Land, there are some significant set extensions and a few stitches that are in plain sight.” Various camera rigs needed to be merged together. “Sometimes the camera had to be on a crane or on a vehicle or handheld on the Trinity Rig or on a Steadicam,” he continues. “The camera department did some amazing practical transitions where they would lift the camera from being on the Trinity Rig, hook it to a wire and maybe pick it up on the other side. Other times it was just impossible; that's where we'd come in and do a transition there. For [director] Sam [Mendes], it was important that whenever we were doing CG camera work or CG lighting that I would always discuss it with Roger Deakins as he would look at it from the standpoint of the cinematographer. 1917 is not a movie where you have cinematography and visual effects. Ultimately, you just don't want to see the difference.”
In one scene, British Lance Corporal William Schofield (George MacKay) jumps into a river to avoid being captured by German soldiers. “We shot it in a canoe training center,” Rocheron notes, “and worked with special effects to get a lot of practical splashes and water around the main actor. But then we had to create the whole environment around it.”
Avengers: Endgame
Visual effects production led by Dan DeLeeuw, Russell Earl, Matt Aitken and Dan Sudick.
In Avengers: Endgame, performance capture was critical in producing both the antagonist Thanos and Smart Hulk, the form in which scientist Bruce Banner (Mark Ruffalo) successfully tames his inner green monster. Under the direction of Anthony and Joe Russo and the supervision of Marvel Studios VFX supervisor Dan DeLeeuw (Captain America: The Winter Soldier), production of the Avengers’ greatest foe was the responsibility of Weta Digital and Digital Domain, while the big green guy with glasses was handled by ILM. “We always find ways to improve,” notes Weta Digital VFX supervisor Matt Aitken. “There were a couple of shortcomings with the facial rig of Thanos [from Avengers: Infinity War], particularly around the corner of the mouth, where we couldn’t completely achieve the range of expression that we wanted. We took the opportunity to get our facial modelers to work out how to increase the range of expression there. We also had this new tool called Deep Shape and that’s a way of adding more fine details to the facial performance.”
A solver called Anyma, developed by Disney Research in Zurich, allowed for better fidelity in translating the facial performance of Mark Ruffalo onto Smart Hulk. “We did a Medusa session with Ruffalo to get a whole set of facial shapes to drive our facial library for Smart Hulk,” ILM VFX supervisor Russell Earl explains. “It was a per-frame match for the solvers that got retargeted onto Smart Hulk. The beauty of the improvement to the system was that it enabled animators to creatively go in and say, ‘His smile is too broad here. Let’s tweak it.’ You could dial out the solve and go full shapes or you could use a mixture of both.”
The Irishman
Visual effects production led by Pablo Helman, Leandro Estebecorena, Nelson Sepulveda-Fauser and Stephane Grabli.
Based on Charles Brandt’s 2004 book “I Heard You Paint Houses: Frank ‘The Irishman’ Sheeran & Closing the Case on Jimmy Hoffa,” The Irishman chronicles, over several decades, Frank Sheeran’s career as an alleged hitman for the Bufalino crime family. Lead actor Robert De Niro, 76, had to appear as Sheeran at ages 24, 30, 36, 41, 50 and 65, with makeup taking over until the character dies at age 83; Joe Pesci, 76, played mobster Russell Bufalino, whose age in the film ranges from 50 to 83; and Al Pacino, 79, played Teamsters union leader Jimmy Hoffa, whose ages range from 44 to 62.
Complicating matters was the decision by filmmaker Martin Scorsese not to put facial markers or motion capture headgear on his actors, which would have inhibited their performances. This led to the development of a three-camera rig that had a RED DRAGON with a Helium sensor in the center and two infrared-modified ARRI ALEXA Minis acting as witness cameras placed on the left- and right-hand sides. “The circumstances of getting soft light in a regular motion picture setting are non-existent because everything is either side or back lit, so you’re going to get really hard shadows,” production VFX supervisor Pablo Helman shares. “If the software is looking for soft light, then the only way to do it is to flood the actor with infrared light because that illuminates the shadow without contributing to the lighting of DP Rodrigo Prieto.”
It took two years to compile a library of the entire cinematic careers of De Niro, Pesci and Pacino. “We had thousands of frames and created an artificial intelligence program that would take one of our renders and go through our library to pick out frames from all of their movies that were similar to the ones we were rendering,” Helman adds. “It took into consideration lighting, position of the mouth and angles so we could get a sanity check for our work, so that likeness would be where it needed to be. That’s the first time we used artificial intelligence on a show to start giving us feedback on the work that we’re doing.”
The Lion King
Visual effects production led by Robert Legato, Adam Valdez, Andrew R. Jones and Elliot Newman.
In The Lion King, director Jon Favreau’s fully digital world was created utilizing a virtual production methodology where everything, including the camera work, was grounded in reality. “I never tried to create a shot that literally the only way you can get it is in a computer,” says VFX supervisor Rob Legato, who previously collaborated with Favreau on the Oscar-winning The Jungle Book. “The audience is always seeing something that, even in their imagination, is similar enough to things they've seen before.” That made it important not to heavily art direct shots. “I believe that when you're trying to make something look real, you live with what God gives you, essentially,” Legato notes. “I don't like things when they get overly art directed, because then they become too perfect.” Overseeing the cinematography was Caleb Deschanel. “The tools that Rob Legato and Magnopus developed, which were the camera, gearhead, fluid head, dollies, cranes and the ability to use a Steadicam, were just like the ones I’m used to working with except that it was in this virtual space,” he comments. “There were some tricky things that happened where the Steadicam operator could only see where he was on the monitor of his Steadicam. Jon said to me when he was talking to me about doing the film, ‘Listen, I want you to do the film because you filmed reality for the last 45 years, and you know what a real film is supposed to look like.’ If we followed an animal and missed the action a little bit, that would sometimes be left in because we wanted the audience to have a sense that there was a person behind the camera making decisions.”
Star Wars: The Rise of Skywalker
Visual effects production led by Roger Guyett, Neal Scanlan, Patrick Tubach and Dominic Tuohy.
When Carrie Fisher died after the making of Star Wars: Episode VIII - The Last Jedi, the script for Star Wars: Episode IX - The Rise of Skywalker had to be reworked, as her character, Princess Leia Organa, had a significant role in the production. “[Director] J.J. [Abrams] was very confident that we could create a digital realistic-looking Leia,” production VFX supervisor Roger Guyett explains. “What he did have an issue with is that inevitably the performance wouldn't have been authored by Carrie Fisher. We went through the outtakes from episodes seven and even eight, and made this huge database of all the lines that she had spoken that hadn't been used. Chris Terrio and J.J. would write the script using some of those lines.” To be usable, the chosen outtakes had to have the proper content and context. “You have to be able to use them in a way that you can actually stage a scene,” Guyett continues. “You have to feel like she is there in that moment. And so, we were very keen on using shots where she was moving, or the camera was moving, or we were panning to her, or doing something a little bit more complicated. We had to use motion control a lot of the time and had a laptop, where we'd have a version of the scene, so the actors could understand what they were working with. The thing that really made it even more complicated, but I think the end result is so much more satisfying, was giving her a new wardrobe, jewelry and hair. It was a sleight of hand in a sense that we used her face from previous footage but everything else was created digitally.”