2017 Academy VFX Bake-Off: Finding the Five Motion Picture Nominees

As if anyone familiar with the visual effects industry needs reminding, feature film VFX continues to dazzle audiences with an ever-increasing range of visually driven stories inhabited by a dizzying array of digital characters, creatures and worlds. From fiery photorealistic oil rig disasters and fractal geometry-based, inter-dimensional, mind-bending psychic excursions, to a digitally recreated Carrie Fisher as young Princess Leia, this year’s crop of ten best visual effects productions, presented at the Motion Picture Academy at this past Saturday night’s VFX Bake-Off, represented yet another reminder of the breadth, depth and range of creative artistry and talent within the industry.

This year, the annual event ushered in a number of “firsts.” The Bake-Off was streamed live on the members section of Oscars.org, and a number of questions posed during the Q&A sessions came from the online audience. Materials presented during the evening, along with supplemental content, will be offered to all Academy voters, furthering the branch’s efforts to educate members about the visual effects work under consideration. Additionally, the Bake-Off itself saw a slight format change: a short onstage Q&A with each film’s representative team, moderated by branch governors Bill Taylor and Craig Barron, followed the five-minute introductions and 10-minute highlight reels presented by each film’s overall VFX supervisor.

Here’s a roundup of the 10 films and presentations in the order presented to the audience:

The Jungle Book (presented in 3D) – Walt Disney Studios

VFX supervisor and second unit director Rob Legato led off the evening by introducing Jon Favreau’s The Jungle Book. Legato stressed the desire to make things look as authentic, real and alive as possible. The production’s computer “camera” never went anywhere a real camera couldn’t go, giving the audience the illusion they were watching a live-action film. He joked that the filmmakers duplicated the authenticity, the smell, the sounds and the lighting of the jungle by shooting in a warehouse and a parking lot pool in downtown LA. Later, he joked that when he asked MPC VFX supervisor Adam Valdez how the original animation tests could look so fantastic so quickly, Valdez responded, “Because we used ray tracing,” to which Legato replied, “Wow, Ray is fucking awesome. We should use him on everything!” He did also mention that the ray tracing was tremendously slow, requiring 40-50 hours for each render.

The film was completely prevised, so the on-set puppeteer working with Neel Sethi, who played Mowgli, was always positioned properly, though a lot of dialogue and interaction with Sethi was ad-libbed. They built a 40-foot turntable with topographical variance so Sethi could be shot walking up and down; the turntable drove a virtual camera and a projector simulating sunlight and shadow, so he appeared to walk in and out of trees, shade and sunlight over a distance of hundreds of feet. This walk-lighting pairing was also synced with the lighting of Baloo, so the human and digital characters appeared to walk together through the same forest.
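As a thought experiment, the turntable-to-lighting link can be pictured as a simple mapping: rotation becomes distance walked along a virtual jungle path, and a sun-and-shade profile sampled at that distance drives the projector’s brightness. The sketch below is purely illustrative – all names and values are assumptions, not the production’s actual rig.

```python
# Hypothetical sketch only -- not The Jungle Book's actual setup.
# Convert a 40-foot turntable's rotation into virtual distance walked,
# then sample a toy sun/shade profile to drive a projector's brightness.
import math

TURNTABLE_RADIUS_FT = 20.0  # a 40-foot-diameter turntable

def distance_walked(rotation_deg: float) -> float:
    """Arc length walked for a given turntable rotation (feet)."""
    return math.radians(rotation_deg) * TURNTABLE_RADIUS_FT

def light_level(distance_ft: float, tree_spacing_ft: float = 25.0) -> float:
    """Toy profile: bright in the gaps between evenly spaced virtual trees,
    dark directly beneath them (a real rig would sample lighting baked
    from the CG forest)."""
    phase = (distance_ft % tree_spacing_ft) / tree_spacing_ft
    return 0.2 + 0.8 * (0.5 - 0.5 * math.cos(2.0 * math.pi * phase))

if __name__ == "__main__":
    for deg in range(0, 721, 120):
        d = distance_walked(deg)
        print(f"rotation {deg:3d} deg -> {d:6.1f} ft walked, light {light_level(d):.2f}")
```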

He praised numerous members of the production team, including animation supervisor Andy Jones and Weta Digital’s Dan Lemmon as well as Valdez and MPC environments supervisor Audrey Ferrara, explaining how he felt like Phil Jackson coaching a team of Michael Jordans, where his job was simply to hand the ball to Jordan and let him win. He reminded the audience to watch for the film’s brilliant and nuanced animation as well as the subtle, unseen VFX work in areas like the water sims, the fur, the hair and the bugs, and how they came together in such a way you don’t even notice them – your disbelief is suspended as you’re taken in by director Jon Favreau’s story.

Captain America: Civil War – Walt Disney Studios

Production VFX supervisor Dan DeLeeuw began by mentioning how fun it was to get to bash the film’s actors together like action figures. The film introduced Black Panther – who, though shot live in a practical suit, was ultimately replaced with an all-CG character – as well as Giant-Man and a new Spider-Man to the Marvel Cinematic Universe. Though much of the film was done digitally, the filmmakers tried to make it appear as if everything was shot practically – no magic cameras, and all previs done with correct lenses.

One of the biggest challenges was making the Atlanta shoot look like various global settings. The visual effects plate unit, nicknamed the “Swen Unit” and led by VFX supervisor Swen Gilberg, used helicopters, drones and camera cars in places like Puerto Rico, doubling for Lagos, as well as in and around Berlin’s Paul-Löbe-Haus and Parliament buildings, where they received permission to shoot but, oddly, not to use the buildings in the film.

Dan Sudick and his team mixed practical and digital effects for the big truck crash scenes. DeLeeuw complimented Sudick on his ability to cannon real trucks through the air and hit narrow target windows without destroying the surrounding sets. They also built a three-axis motion control rig holding a full-sized helicopter, used to film Captain America preventing the Winter Soldier’s aerial escape.

Lola VFX helped erase 30 years from Robert Downey Jr.’s Tony Stark for his holographic flashback scene. DeLeeuw voted for the Weird Science-era Downey Jr., but the studio held out for the Less Than Zero version. Lola shot a live-action double, who was then combined with Stark’s digitally recreated form and face.

Industrial Light & Magic scanned and recreated Leipzig Airport completely digitally. ILM also created a new, emoting Spider-Man: they used Tom Holland’s motion-captured performance to give his character irising lenses and a mask that moved when he spoke. Additionally, though they did film on location at the Leipzig airport, much of the action was shot on greenscreen, with digital recreations used later for continuity purposes.

Passengers – Sony Pictures Releasing

VFX supervisor Erik Nordby highlighted the main challenges faced handling the film’s 1,400 shots. They referenced and recreated every image of space – real, artificially colored or highly stylized – they could find to help director Morten Tyldum develop and choose the film’s visual style and “space-scape.”

In one of the most technically demanding scenes, Jennifer Lawrence’s Aurora is caught swimming when the ship’s gravity goes off. Specific research and care went into making the ensuing hovering water volume simulation as real as possible. After three months of sim wrangling, Nordby was pleased with the visuals. Tyldum, however, felt the simulation took attention away from the emotional pull of the scene, which was Aurora’s drowning. Simplifying the form of the volume while reducing the distortion and refraction of the water’s shading helped the water move in a slower, calmer manner that supported rather than distracted from the performance.
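One way to picture the shading change Nordby describes: pulling the water’s index of refraction toward 1.0 reduces how sharply light bends at the surface (Snell’s law), which calms the distortion seen through the floating volume. The sketch below is a generic optics illustration under that assumption, not Passengers’ actual shader code.

```python
# Generic Snell's-law illustration -- not the production's shader.
import numpy as np

def refract(incident, normal, eta):
    """Refract a unit incident vector through a surface with unit normal,
    where eta = n1 / n2 (ratio of indices of refraction)."""
    cos_i = -np.dot(incident, normal)
    sin_t2 = eta**2 * (1.0 - cos_i**2)
    if sin_t2 > 1.0:
        return None  # total internal reflection
    cos_t = np.sqrt(1.0 - sin_t2)
    return eta * incident + (eta * cos_i - cos_t) * normal

incident = np.array([0.6, -0.8, 0.0])   # unit ray heading down into the water
normal = np.array([0.0, 1.0, 0.0])      # water surface normal

# Dialing the IOR from real water (1.33) toward 1.0 straightens the
# refracted ray, i.e. less distortion seen through the water volume.
for ior in (1.33, 1.15, 1.05):
    t = refract(incident, normal, 1.0 / ior)
    bend = np.degrees(np.arccos(np.clip(np.dot(t, -normal), -1.0, 1.0)))
    print(f"IOR {ior:.2f}: transmitted ray at {bend:4.1f} deg from vertical")
```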

Though the goal of the scene was to use Lawrence’s real performance, complete previs, techvis and a full battery of digital scans had been created just in case. She ended up doing ten 30-second takes, held underwater by straps, to get the needed shots. It was noted humorously that Dan Sudick used his Captain America: Civil War production ties to get this production access to the huge concrete Atlanta airport set – they dug up the concrete tarmac to build several pool sets.

Care was taken in the design of the ship, taking into account the physics of potential interstellar travel and artificial gravity. Additionally, a special motion control rig was built to whiz Michael Sheen’s artificial bartender back and forth at terrifying speeds.

Fantastic Beasts and Where to Find Them – Warner Bros. Pictures

VFX supervisor Christian Manz touted the freedom and care used to design and animate the unique forms and personalities of the film’s inventive array of magical creatures. They whittled down from hundreds of designs to achieve the film’s final roster, focusing on the creatures’ clear desires and motivations as well as their realistic integration with the film’s 1926 New York settings. Director David Yates wanted the audience completely immersed in the film’s magical world settings – one challenging 10-minute continuous chase shot combined 15 unique CG beasts in 12 different environments, integrating the work of six different studios. In one of the scenes, an MPC “Billiwig” is eaten by a Rodeo “Doxy,” which in turn is eaten by an Image Engine “Fwooper,” all set within a Method environment.

Manz also spoke of the malevolent “Obscurus,” a mix of CG animation and simulations needed to capture the physical performance of actor Ezra Miller. Ron Perlman’s Gnarlack used keyframed animation based on the actor’s full facial performance capture.

Fifteen different areas of New York were recreated on a greenscreen-shrouded lot in the U.K., including downtown, Central Park, Times Square, Tribeca and the docks. Set extensions and digital replacements were then used for buildings, crowds, vehicles, trams and overhead railways.

Additional digital elements included owls, origami rats, house elves and Red, the lift attendant, as well as a mouthwatering but sadly CG strudel. The film’s finale involved the remarkable-looking magical repair of a huge amount of New York’s destruction at the hands of the Obscurus.

Deepwater Horizon – Summit Entertainment

VFX supervisor Craig Hammack pointed out to the audience that, as a film depicting the chaos and danger of real-life events that people had already seen on TV news, the VFX work had to look completely real – any obvious CG or odd-looking effects would have meant disaster and completely ruined the audience experience. Key vendors working from a $20 million VFX budget were ILM’s San Francisco, Vancouver and London offices, Iloura in Australia and Hybride in Montreal, together creating 800 VFX shots, of which 150 were fully CG.

The film was made without any help from the oil industry (no surprise there), which forbade them from shooting within 300 feet of any working rig; instead, the team mixed digital environments with full background replacements and extensions to physical sets built on three parking lots in New Orleans. One set, highlighting the work of special effects supervisor Burt Dalton, included a steel rig more than 60 feet tall and weighing millions of pounds, outfitted with an array of machinery, pumps shooting tens of thousands of gallons of mud and millions of gallons of water, as well as numerous propane bars, poppers and mortars to tackle the film’s voluminous pyrotechnics.

In the end, the dirty, toxic and uncontrollable nature of the film’s fire elements meant much of the practically shot fire had to be replaced or significantly enhanced. Hammack noted that though digital fire has been done for years, normally we see it sporadically and in a supporting role. In this film, it played a starring role for more than 30 minutes of screen time – it had to be beautiful, terrifying, uncontrollable and cinematic. But above all…believable.

Additionally, the audience was taken inside an offshore oil rig and inside a drill pipe casing, where they witnessed mudflow and pressure dynamics, a 250-foot mud blowout, a crazed mud-covered pelican flapping about a control room, and a 360-degree helicopter approach shot that was completely CG save for the helicopter itself. Iloura was commended for their work adding layers of embers, ash, smoke and haze to many shots, giving them the tremendous sense of heat the plate photography lacked.

Doctor Strange – Walt Disney Studios

VFX supervisor Stephane Ceretti highlighted the 1,450 VFX shots done on the newest addition to the Marvel cinematic multiverse, where magic and other dimensions were central to the story. Method Studios took on Doctor Strange’s 2½-minute “Magical Mystery Tour,” a psychedelic cinematic sequence where he is thrown about the multiverse by the “Ancient One.” Prevised by Faraz Hameed at The Third Floor, the sequence seamlessly integrated Benedict Cumberbatch’s practical performance, a motion control stunt rig and a digital double detailed enough to support closeups of the actor’s pupil. Method Studios also set up much of the film’s beautiful magical effects, such as mandalas, rune shields and portals made of sparkling fire, as well as the stunning recreation of Kathmandu.

Framestore handled the astral projection effects, which included high-resolution digital doubles of all the main cast used throughout the sequences. Ceretti explained how they created the look of the Cloak of Levitation, which was shared with other vendors. Integration with the live-action cloak was quite tricky, and Framestore’s animators did a great job handling the cloak’s artistic and fun performance. They also created a fully digital Kaecilius, played by Mads Mikkelsen, who gets trapped in a magical restraint system known as the Crimson Bands of Cyttorak, as well as a Mandelbrot fractal space deformation used in the film that was described as choreographed chaos.

ILM provided the New York chase scene, which included extensive spatial manipulation and deformation, as well as the Hong Kong time reversal sequence, a twist on the idea that a city must be destroyed at the end of every blockbuster film. All the destruction was reversed, everyone was un-killed and the world was saved. ILM spent weeks creating detailed, highly textured versions of both cities. Special motion-based tilting platforms were used in the difficult live-action shooting phase.

Luma handled three major sequences, including two where Doctor Strange confronted Dormammu in his domain, involving a recreated deadly black light poster environment complete with lethal organics and vast networks of planetoids. Dormammu was realized through a combination of fluid simulation and three-dimensional volume distortion all triggered by Cumberbatch’s facial performance.

Arrival – Paramount Pictures

VFX supervisor Louis Morin began with the idea that unlike a traditional aliens-land-on-earth visual effects extravaganza, Arrival was different – visual effects blended seamlessly with cinematography and production design to support the story invisibly. That included elements like the digital makeup and hair of Louise’s dying daughter, Louise boarding a helicopter where everything in the environment is CG, and the helicopter flying over a road blockade of hundreds of CG cars alongside thousands of CG humans. At the Montana campsite, a full digital army confronted the alien ship, a 1,500-foot egg-shaped meteorite perched like a rock standing on end. Across the film, digital tanks, trucks, helicopters, soldiers and people mixed seamlessly with live-action elements.

Regarding the ship’s almost simplistic design, Morin asked “That’s it?” when he first saw the early visuals. Director Denis Villeneuve wanted the ship to look like a piece of rock, textured in detail as needed for the given shots. No wings or lights or engines.

Shots showing humans boarding the spaceship required matching digital clouds (making sunny shots cloudy) and, of course, for the craft itself, a small amount of bluescreen and an intense amount of rotoscoping.

The aliens were designed as a mix of an octopus, a giraffe and a whale. Extensive R&D was done to create their walk, their body language, the way they wrote and swam, as well as how the mist behaved around their bodies. And because this was not a film about aliens, the animation and effects needed to be very subtle – early animation tests involved too much motion, which was lost when viewed within their misty environment.

Morin shared how the aliens’ communication was key to the story. They spit ink that formed round shapes called “logograms” on the glass barrier inside the ship that separated them from the humans. Hundreds of simulations were tested until the filmmakers arrived at the right ink and water effect.

Action at the other spaceship sites – such as the Shanghai helicopter attack, the Russian Black Sea naval fleet and the armored tanks in the Sudan desert – was all CG shots mixed with stock footage. Mist, like the condensation that sometimes forms on airplane wings, wrapped around the spaceships as they seemed to evaporate and disappear, a unique story point they arrived at after much discussion. Morin added that the film’s iconic clouds spilling over the mountains shot was all real, a happy accident nature provided herself.

The BFG – Walt Disney Studios

VFX supervisor Joe Letteri began by sharing how Steven Spielberg made The BFG as a character-driven film, a story about a giant, played by Mark Rylance, and a little girl, Sophie, played by Ruby Barnhill. The goal was to shoot the film as if Rylance really was a 24-foot giant interacting with Barnhill, getting as much of his performance as they could in camera. Performance capture was used to create the giants – the filmmakers knew up front no practical elements would be possible. Though the BFG was completely digital, the look and animation integrated Rylance’s performance as much as possible. All the giant country shots with the BFG and Sophie were CG except where Barnhill was filmed against bluescreen. They built out the giant’s cottage as a serviceable set so Spielberg, Barnhill and Rylance, whose performance was being captured, could work out their scenes as master shots. Those elements were then brought into the virtual camera, where the director could plan out the shots and cameras he wanted for the sequences.

Letteri shared how they “hid the giant in plain sight” when he traveled to London, employing a series of gags that were choreographed with Rylance. These additions to the character’s performance were then integrated back into the story by screenwriter Melissa Mathison in what ended up being her last project before she passed away on November 4, 2015.

Weta Digital’s Guy Williams added that three scales were used within the film, and considerations of scale were always an issue. In scenes with little movement, Rylance was often motion-captured up on a riser, with his capture fed into the camera feed during shooting so everyone saw a 24-foot giant talking to a four-foot girl. For scenes where the BFG moved across the room, they employed a ball on a pole, an iPad on a pole or a 50-inch TV on a cherry picker.

Rogue One: A Star Wars Story – Walt Disney Studios

VFX supervisor John Knoll began by acknowledging that the Star Wars franchise was hugely responsible for him getting into the visual effects business. He noted that Rogue One had to stay true to the feel and audience memories of the original films while showing them a new and expanded Star Wars universe – while some legacy material was matched nicely, the overall philosophy was that it was far more important to match the memory of how something was than how that something “actually” was. Striking an optimum balance between practical and virtual meant close collaboration with Neal Scanlan’s creature shop and Neil Corbould’s special effects team.

More than 1,700 VFX shots were created, depicting a variety of new environments, vehicles, characters, blasters, explosions and destruction. The work was spread across all four ILM studios as well as 11 other vendors. Almost every scene had some type of environment extension, some with extensive environmental destruction. For example, the city of Scarif was a mix of location shooting in the Maldives, Bovingdon Airfield in the UK and numerous synthetic environments. Jedha was similarly a mix of Jordan, the Pinewood backlot and a lot of CG terrain and destruction.

Knoll explained how key character K-2SO, a former Imperial security droid, was played by Alan Tudyk using onsite motion capture, fitted with articulated arm extensions and a pair of motorized stilts fashioned by Scanlan’s team. An all-CG character, K-2SO’s performance was augmented by Hal Hickel’s animation team, who added eyes, fingers and performance intent where Tudyk made facial expressions the character didn’t have.

Rogue One also includes two digital recreations of real-life actors portrayed in the older films: two minutes and 32 shots with Peter Cushing’s Grand Moff Tarkin, and one important shot at film’s end of Carrie Fisher’s Princess Leia. Knoll mentioned how demanding this type of digital double work is, especially when depicting familiar faces. Though driven by facial performance capture, the most challenging aspect of each character’s creation involved matching the actor double’s facial movements to the facial movements of the digital character. Significant work went into what Knoll called “motion likeness,” where particular phoneme shapes were adjusted in an effort to keep the character on model. He also mentioned that legacy characters Red Leader and Gold Leader were extracted from unused Episode IV dailies.
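Knoll didn’t detail the tooling, but the “on model” idea can be pictured with a simple sketch: if captured facial motion is expressed as blendshape weights, one crude safeguard is to pull each frame’s weights back into ranges observed in approved reference poses. Everything below – shape names and ranges – is an illustrative assumption, not ILM’s actual motion-likeness pipeline.

```python
# Illustrative "keep on model" sketch -- shape names and ranges are invented.
APPROVED_RANGE = {
    "jaw_open":   (0.0, 0.7),   # limits mined from approved reference poses
    "lip_pucker": (0.0, 0.5),
    "smile_L":    (0.0, 0.9),
}

def keep_on_model(weights: dict) -> dict:
    """Clamp captured blendshape weights into the character's approved range
    so extreme phoneme shapes can't pull the face off model."""
    clamped = {}
    for name, w in weights.items():
        lo, hi = APPROVED_RANGE[name]
        clamped[name] = min(max(w, lo), hi)
    return clamped

captured = {"jaw_open": 0.85, "lip_pucker": 0.2, "smile_L": 1.1}
print(keep_on_model(captured))
# {'jaw_open': 0.7, 'lip_pucker': 0.2, 'smile_L': 0.9}
```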

Knoll noted his team took a different approach to the usual challenge of placing meaningful lighting on digital characters when there is no set present. On three digital cockpit scenes, they built basic representative sets using similar shapes and colors to what the final digital sets would be. These gave the DP something to light, the actors something to relate to and the camera operator something to compose. The performers were then rotoscoped and placed into the CG sets.

The VFX team also built and scanned shiny plastic model versions of more than 300 pieces of equipment from the original movies, which formed a digital library modelmakers could use when creating ships for the film.

Kubo and the Two Strings – Focus Features

VFX supervisor Steve Emerson dove right in, explaining how LAIKA’s film – though it integrates a large number of digital elements and uses a CG animation pipeline to drive the rapid-prototyped facial replacement parts effort – is an intense collaboration among a large number of practical disciplines: puppet makers, set builders, camera teams, motion control operators, animators and visual effects artists, all using distinct visuals to carry on the timeless art of stop-motion animation. In essence, their studio pushes life into inanimate objects one frame at a time for the sake of telling a great story.

Kubo challenged audiences to believe that a group of puppets were happy, sad, hurt, emotionally devastated and ultimately transformed. Over 94 shooting weeks, LAIKA brought to life 108 puppets, frame by frame, at the hands of 37 stop-motion animators averaging 16 frames per day, or roughly three seconds per week. They also used state-of-the-art 3D printing technology to create facial replacement pieces allowing Kubo 48 million facial expressions, some differing by no more than the width of a human hair.
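Those throughput figures are easy to sanity-check with back-of-the-envelope arithmetic, assuming a 24 fps project and a five-day shooting week:

```python
# Sanity-checking LAIKA's quoted stop-motion throughput
# (assumes 24 fps and a five-day shooting week).
FPS = 24
frames_per_day = 16                    # per animator, as quoted
frames_per_week = frames_per_day * 5   # 80 frames
seconds_per_week = frames_per_week / FPS
print(f"{seconds_per_week:.1f} seconds of animation per week")  # ~3.3 s
```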

Giant puppets built for the film included the 11-foot underwater eye, an 880-piece fully 3D-printed moon beast and, in a tribute to Ray Harryhausen, a 16-foot fully moveable skeleton, the largest stop-motion puppet ever built, weighing in at 400 pounds.

Emerson described how the film is shot much like a live-action film – the sets are just smaller and the actors, shot on greenscreen, are puppets. Everything shot digitally is based on practical animation tests and physical materials. The studio knew, for example, that petroleum jelly or 3D replacement technology would not suffice for the film’s extensive oceans and water sequences. However, the VFX pipeline’s photorealistic water needed to feel like it belonged in Kubo’s universe – the CG was merged with a series of physical test materials and motion studies until, through an enormous number of iterations that took eight months, the team arrived at a “look” that fit the film.

By the numbers, Emerson’s in-house team of 60 VFX artists created 1,360 shots, integrating stop-motion performances in multiple scales with miniatures, puppet scale sets, digital extras, set extensions and other digital VFX elements.

Director of Rapid Prototyping Brian McLean finished the long but entertaining evening by explaining how LAIKA’s groundbreaking, state-of-the-art rapid prototyping system is anything but rapid. In a process first developed ten years ago for their first film, Coraline, CG animation of the puppets’ performances feeds the rapid prototyping systems – a set of 3D printers that produce literally thousands of individual parts that are then placed and replaced on the physical puppets.
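The bookkeeping McLean describes can be imagined as a lookup from each animated frame’s facial pose to a printed part ID, so the right face is pulled and snapped onto the puppet for each exposure. This sketch is purely hypothetical – the part labels, pose keys and data layout are invented, not LAIKA’s actual system.

```python
# Hypothetical replacement-animation bookkeeping -- not LAIKA's real system.
from dataclasses import dataclass

@dataclass
class FacePart:
    part_id: str     # label on the physical 3D-printed piece
    brow: str        # which brow expression it carries
    mouth: str       # which mouth shape it carries

# A tiny "library" of printed parts keyed by (brow, mouth) combination;
# combining independently printed brow and mouth pieces is what makes
# the possible-expression count multiply so quickly.
LIBRARY = {
    ("neutral", "A"): FacePart("K-0001", "neutral", "A"),
    ("neutral", "O"): FacePart("K-0002", "neutral", "O"),
    ("worried", "A"): FacePart("K-0003", "worried", "A"),
}

def parts_for_shot(frames):
    """Resolve each animated frame's (brow, mouth) pose to a printed part ID."""
    return [LIBRARY[pose].part_id for pose in frames]

shot = [("neutral", "A"), ("neutral", "O"), ("worried", "A")]
print(parts_for_shot(shot))  # ['K-0001', 'K-0002', 'K-0003']
```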

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.