Creative teams from 10 Oscar hopefuls present their contributions at the Academy’s annual Visual Effects Bake-off held on Saturday, January 6 at the Samuel Goldwyn Theater in Beverly Hills, CA.
On January 6, at the Academy of Motion Picture Arts and Sciences’ Samuel Goldwyn Theater in Beverly Hills, CA, leading figures in cinematic visual effects presented their contributions to the 10 films in contention for the five 2018 Oscar nomination slots in the Visual Effects category; nominations across all categories will be announced on January 23.
Set up as a series of individual mini-seminars, the program presented each film with a prepared opening segment by that project’s key visual effects supervisor, followed by a 10-minute reel of effects clips from the film and a question-and-answer period, with questions posed by the program’s hosts and select audience members. Each of the ten mini-seminars was hosted by Academy Visual Effects Branch governors John Knoll, Craig Barron and Richard Edlund, all veteran effects masters with decades of laudable work to their credit.
Inside the Goldwyn Theater, before the event began, electricity filled the room as visual effects artists and producers shared the methodologies, processes and approaches behind the best motion-picture imagery of 2017, with a wide variety of film genres represented.
First on the itinerary was Christopher Nolan’s Dunkirk, a World War II tale of 400,000 British troops trapped on the French shore awaiting naval rescue while German planes attacked from above. Representing the myriad artists who worked on Dunkirk at the Goldwyn were visual effects supervisor Andrew Jackson, special effects coordinator Scott Fisher, and special effects supervisor Paul Corbould; Double Negative visual effects supervisor Andrew Lockley was absent due to work commitments.
Undeniably, Nolan favored traditional techniques to execute Dunkirk, such as his insistence that the film be shot with IMAX film cameras at 48 frames per second and finished photochemically with an optical finish. Additionally, as Jackson explained, many scale models were used to achieve the beachside effects throughout the story. “We built 38 quarter-scale radio-controlled planes,” he said. “We knew we were going to crash a whole lot of them deliberately into the sea, and film that. The biggest ones, the bombers, had an 18-foot wingspan. We had to tie in with the live-action, full-sized ships, and the cast and the extras in the water.”
The production team built a 120-foot section of a full-size ship on a gimbal for closeup shots of the soldiers jumping into the water and sliding down the hull. For shots of ships sinking, the production built four full-size interior sets in the Stage 16 tank at Warner Bros. Studios in Burbank. Computer-generated imagery was used to enhance these shots and others. “For the closeup shots of pilots,” Jackson revealed, “we built a cockpit on a gimbal, and we filmed it on a cliff next to the ocean in South LA, so we had all of the water, the horizon, and the sky in camera, and we just did CG extensions of the tail for those shots.”
For director Guillermo del Toro’s The Shape of Water, the centerpiece effect was an amphibious male creature, executed with a full-size head-to-toe creature suit, facial makeup, and computer-generated elements. On hand to discuss the project were visual effects supervisor Dennis Berardi, creature suit supervisor Shane Mahan, digital effects supervisor Trey Harrell, and animation supervisor Kevin Scott.
Leading off, Mahan explained how the concept of the unnamed creature developed. Conceptual artist and sculptor Mike Hill created the creature’s design, and technicians at Mahan’s Legacy Effects fabricated the custom appliances that would fit creature performer Doug Jones’ full body and face. “The idea is that it is a makeup,” Mahan said. “It was determined early on we would do augmentation via CGI with eyes, like an amphibian and a frog, things found in nature. Doug Jones’ performance was 90 percent of it, and then the final seasoning was done magically where we can’t in the world of physics do it.”
Berardi added that computer-generated imagery augmented Jones’ performance in the suit with “eye and facial performance, to subdermal bioluminescence, or fully CG underwater shots.” To that point, Scott offered, “The amount of research and study that we put into the eye blinks was extensive. We really wanted to capture what Doug was feeling inside as he was giving his performance, and trying to bring that to [the] surface for falling in love with Elisa [the lead female character, played by Sally Hawkins]. That study of micro-expressions and the blinks, we revisited numerous times, because they almost became a character themselves.”
For Alien: Covenant, Ridley Scott’s newest entry in the Alien saga he ignited in 1979, visual effects supervisors Charley Henley, Ferran Domenech and Christian Kaestner spoke about their work on the project, with special effects supervisor Neil Corbould unable to attend.
Henley noted his director’s disposition towards the effects in an Alien film. “Ridley Scott’s approach to this gritty science fiction horror was to use real world references for every single detail,” Henley recalled. “This kept all our fantastical creations grounded in reality. Ridley likes to shoot stuff for real, and employed a lot of special and practical effects on this film, but he also loves a creative freedom the effects gave him -- to never compromise on his vision, which he communicates so well through his own famous storyboards.”
Certainly, any Alien movie must feature many manifestations of hideous creatures, and Henley communicated Scott’s penchant to mix live-action elements with computer-generated material. “Conor O’Sullivan did an amazing job of creating real practical prosthetic creatures for everything,” Henley said. “We agreed at the beginning we’d go very CG for the amount of motion that was needed, but [Scott] also insisted on having his practical versions to shoot, which was kind of a luxury that we don’t normally get, because that allowed us incredible references.”
Henley and Domenech shared that the success of the computer-generated creature work was largely based on the practical elements photographed on set, with most of the practical work eventually being replaced by digital effects. “All the animation [was] key-frame animation,” Domenech said. “We had an excellent team of animators in MPC [Moving Picture Company]. We had a very developed process with Ridley, where we looked at animal references -- we really had to reference everything from nature.”
Next, Guardians of the Galaxy, Vol. 2, James Gunn’s sequel to his first Guardians of the Galaxy film, featured a highlights reel with numerous fantastical creatures, environments and otherworldly realms offering a complex mixture of effects styles and techniques. Visual effects supervisor Christopher Townsend presided over the segment, accompanied by Weta Digital visual effects supervisor Guy Williams, Framestore visual effects supervisor Jonathan Fawkner, and production special effects supervisor Dan Sudick.
Townsend explained how Weta built a contiguous 3D set to render Planet Ego, requiring about half a trillion polygons for full realization of that environment. He added that certain sets in Guardians Vol. 2 were meant to be photographed in camera but were completely digitally replaced in post-production. Townsend further noted how Framestore redesigned the character of Rocket from the ground up for this new film, and four different companies animated him in their shots: Framestore, Weta, Trixter and Method. “The challenge was to create a singular performance amongst the four vendors,” Townsend reported, “with about 100 different animators -- and keep it consistent. We shared whatever we could and constantly compared shots to maintain just one Rocket.”
In one striking moment in the film, Kurt Russell appears in a flashback, apparently some three decades younger than his current self in the present-day world of the film. “We cast another actor who had the right bone structure and skin, particularly the right jaw and the lips, as a stand-in,” Townsend described. “Kurt would do the performance first, we’d put dots on his face, and we would shoot the shot, and then immediately we would have our other actor be watching it on monitors at video village just to clarify. Then, he would step in and he would do it, and we would repeat the camera movements as best we could. The purpose of the other actor was primarily as a reference, so that we knew what a younger person would look like, and what their skin would look like.” In select shots, Lola VFX then worked frame by frame to take the chin and the neck of the younger actor and mold them into the Russell plate.
Following Guardians Vol. 2, Kong: Skull Island presented a new take on an 84-year-old titular character, albeit one of larger proportions than the leading man in the 1933 film. Directed by Jordan Vogt-Roberts, Skull Island highlighted effects work by senior visual effects supervisor Stephen Rosenbaum, visual effects supervisor Jeff White, ILM animation supervisor Scott Benza, and special effects coordinator Mike Meinardus.
Prior to major effects work taking place, Rosenbaum revealed that serious consideration was given to Kong’s appearance. “A sensitive challenge for us was to figure out how to update the design, while still paying homage to the 1933 Willis O’Brien creature,” he said. “We chose to modernize his body, but maintain a familiar bipedal silhouette, and movie monster features of disproportionately large head and eyes, elongated teeth, and those beautiful, bulbous brows.”
As with previous screen gargantuans, Kong’s scale flexed from sequence to sequence, despite early agreement among the effects team to hold him at his chosen 120-foot height. “Not even two months into the process, he was up to 300 feet,” quipped Benza, “just because of our director’s vision for a particular sequence, where he arrives at the lake, where he encounters the squid. Our director had a very specific shot design in mind for that entrance.”
Among the film’s prominent effects accomplishments was a helicopter attack sequence which occurs 30 minutes into the film. Kong is revealed, first only in pieces of his head and body, then nearly full frame against a sunrise before encountering the choppers in a frenetic melee. Such a sequence would have been a foremost challenge with the stop-motion techniques available in the early 1930s or with the creature-suit technology of the mid-1970s, when the film was first remade; the 2017 film’s digital techniques took the idea even further in scope and approach than the similar biplane sequence in Peter Jackson’s 2005 version.
Third in a new trilogy of Planet of the Apes films, War for the Planet of the Apes presented the newest in motion-capture applications, even more noteworthy as performances were captured on practical locations, as they had been for the previous Apes film. Presiding over this process again was senior Weta visual effects supervisor Joe Letteri, plus visual effects supervisor Dan Lemmon, animation supervisor Daniel Barrett, and special effects supervisor Joel Whist.
Matt Reeves’ film presents the latest in photorealistic computer-generated characters, a lineage arguably initiated in The Abyss in 1989 and honed since on many genre projects. Apes’ key team noted that motion-capture performances were key to their success. “This movie continues in the tradition of capturing actors’ performances on location,” Lemmon said. “In this case, it was in blizzards on the tops of mountains or in the freezing rain of Canadian winter; then we used those performances to bring our digital apes to life. The apes spoke with greater articulation than ever before, and the actors, they delivered these nuanced performances that had layered emotions.”
To serve the actors’ captured performances, Lemmon noted that Dan Barrett’s animation team had to “find a new balance creatively between facial expressions that would be readable to a human audience, but also credible as real ape anatomy and movement.”
Letteri added that the performance capture worked in concert with Reeves’ sensibilities. “One of the reasons we like to use it is we like working with actors,” said Letteri, stating that Reeves regularly took extra time to acquire his desired performances in-camera during principal photography. “He’s not one of those guys who says, ‘Well, we’ll fix it later.’ He treats everything like this is his only chance to get it -- as if his actors are his apes. We tried to make what we do an integral part of the filmmaking process, and having done this for three films now, everyone involved with him has kind of taken that on board.”
Luc Besson’s science-fiction epic Valerian and the City of a Thousand Planets was the seventh film of the evening presented at the Goldwyn. Based on the comic book series “Valerian and Laureline,” Valerian imagines a plethora of unique dimensions, characters, and worlds not previously explored in cinema. To that point, production visual effects supervisor Scott Stokdyk remarked that Besson’s vision for the film had grown exponentially in the 20 years since the director had imagined similarly unexplored visionary territory with his sci-fi feature The Fifth Element.
Also on hand to represent Valerian were Martin Hill, visual effects supervisor at Weta Digital, Philippe Rebours, visual effects supervisor at ILM, and François Dumoulin, visual effects supervisor at Rodeo FX. “Our job for the visual effects of the movie was to draw these images from inside Luc’s head and help bring them to the screen,” Stokdyk said of the 2,355 final visual effect shots created for the film. “Our mandate was that we could not use existing locations and enhance them; everything in the movie was created rather than curated, to make it not of our earth. That made it an asset-heavy show, both in practical builds and in CG, where we created hundreds of digital environments and characters that sometimes were only used in single shots.”
To visualize the many elements required in Valerian, the production created more than 1,500 pieces of concept artwork, drawing extensively from the comic books -- a combination of work from the film’s production design department and from each of the effects vendors, all of whom also had extensive art departments. While this artwork was being generated in pre-production, Besson shot a rough version of Valerian’s key sequences using 60 students from a film school he established.
The third Star Wars film since Disney reinvigorated the franchise, Star Wars: The Last Jedi transported legions of fans into new territories while retaining many of the classic tropes which have made Star Wars a cinematic revolution across the past 40 years. At the Goldwyn, visual effects supervisor Ben Morris was joined by visual effects supervisor Michael Mulholland, special effects supervisor Chris Corbould, and creature shop supervisor Neal Scanlan.
Written and directed by Rian Johnson, The Last Jedi continues the saga’s tradition of mixing new characters with those from the 1977-1983 films, all in wholly new environments, save some familiar spacecraft and props. Morris offered that the collective approach on the film was to achieve as many in-camera effects as possible. “That was a surprise for me,” he confessed, “but a nice surprise. Shooting predominately in real world locations and day-lit exterior sets, giving settings an authenticity sometimes hard to create believably on stage.”
With over 2,000 effects shots created by a global team of artists, The Last Jedi, the eighth Star Wars film in the primary nine-part franchise, combined Corbould’s practical effects and rigging and Scanlan’s practical puppets and props -- including a new Yoda -- with fully computer-generated characters carried over from the seventh Star Wars entry, including Snoke and Maz Kanata.
In the case of Snoke, performed by Andy Serkis, the character appeared 25 feet tall in holographic form in Star Wars: The Force Awakens, though that would change in The Last Jedi. “He actually looked incredibly gelatinous and like a zombie in The Force Awakens,” Morris remarked. “And Rian immediately said to me, ‘Bring it down to the real world. Make him look like a real person.’ We rebuilt the character from scratch. He had to be a human character, just like all the others. He came down to seven feet tall. Andy gave this amazing performance, as he always does, on set with the other actors, and that went into the cut.”
In the Korean fantasy Okja, a young woman strives to keep larger society from exploiting her exotic pet, a gigantic pig-like creature she raises in the countryside. For this film, directed by Joon-ho Bong, visual effects supervisor Erik De Boer addressed the audience, accompanied by animation supervisor Stephen Clee, visual effects supervisor Jeon Hyoung Lee, and visual effects supervisor Jun Hyoung Kim.
Okja contained 730 visual effects shots, with roughly 300 hero creature shots featuring the six-ton title character. De Boer described how Okja is on screen for a total of 45 minutes with an average shot length of nine seconds, with all effects completed in 4K. “As we know from our own pets,” De Boer said, “these human-animal relationships are deeply emotional and affectionate. We felt that to tell the story as realistically as possible, we had to embrace the fact that you cannot keep your hands off an animal so cute and so huge.”
To provide the best acting environment for Mija, the young woman in the story, Clee and his team puppeteered Okja props to work with actress Seo-hyun Ahn in front of the camera. “Mija’s emotional connection with the props really brings our work to life,” De Boer added. “Okja is engineered to produce a lot of pork, but we wanted to portray her as luxurious and not grotesque. The R&D and proprietary technology that went into this creature was concentrating on making that story point in the most appealing way. You can see this work when you watch how her armpit and groin area is resolved, how the thickness of her skin varies, and how her organs slosh and her muscles flex. We are very proud of Okja’s expressive skin and her anatomical and functional integrity.”
The final film presented at the Academy’s VFX Bake-Off was Denis Villeneuve’s Blade Runner 2049. Set 30 years after director Ridley Scott’s 1982 classic Blade Runner, the new film paid due homage to the original while introducing wholly new characters, environments, and effects approaches. Visual effects supervisor John Nelson led the proceedings at the Goldwyn, augmented by contributions from special effects supervisor Gerd Nefzer, production visual effects supervisor Paul Lambert, and Framestore VFX supervisor Richard R. Hoover.
“Blade Runner 2049 had 1,190 visual effects shots in a film that ran two hours and 43 minutes,” Nelson said. “We had an hour and 48 minutes of visual effects. We wanted to honor the first movie, but make a different movie. Denis had a really clear vision. We set up some visual effects rules: build enough set close to camera so the actors really feel everything is real. We basically built as much set as we could afford to build, but we really needed to do elaborate sets, so lots of it is big CG shots; virtually every wide shot in the movie is a visual effects shot -- it’s either pure CG or CG extension. We wanted the visual effects to appear very photographic.”
Nelson added that, given Villeneuve’s aesthetic, Blade Runner 2049’s artists aimed for restraint in the visual effects. “We wanted to shoot as much practical as we could,” he said. “We wanted to limit the use of green screen, because [Villeneuve] doesn’t like using it, but he also thinks that it affects the lighting. It’s also better for the actors to not be completely surrounded by process screens. We tried to do that as much as possible, using [practical] backings and replacing them when we needed to. Last thing for our rules was just to blend visual effects with photographic plates, and miniatures, and matte paintings.”
Where digital techniques were crude and cumbersome at the time of the first film, Nelson and the effects team gave Blade Runner 2049 a retro feel in many cases while adding photorealistic holograms of many sizes. In one striking scene, a holographic female presence physically intersects with an actual woman to create a third entity. “We shot both women separately, and mapped the plate on their CG geometry,” Nelson said of the moment in the film. “When the women merge, [their] eyes line up -- they form a third woman, and that third woman still acts.”
Another iconic moment in Blade Runner 2049 arises when Sean Young’s character from the 1982 film, Rachael, reappears in an eerily similar visage to her former self. “We did that by shooting a double on set and replacing everything from the neck up with a CG head and CG hair,” Nelson recounted. “The CG head had minute detail down to blemishes for realism, and flyaway hair for realism.”
With the main presentations concluded, and before dismissing the crowd, Knoll asked the Visual Effects Branch members present to vote for the five films from the evening they deemed most deserving of an Oscar nomination. All 2018 Oscar nominations will be formally announced on January 23.