The Academy’s annual voting event showcases the year’s top visual effects-driven films.
This past Saturday night, the visual effects branch of the Academy of Motion Picture Arts & Sciences held its 2015 “VFX Bake-off,” the annual event where members view presentations of the top 10 visual effects-driven films of the past year prior to voting for the final five nominees.
A long but fun night, the nearly four-hour event, held at the Samuel Goldwyn Theater in Beverly Hills, CA, featured five-minute presentations followed by 10-minute screenings of selected finished shots from each of the shortlisted contenders announced back in December.
The Bake-off traditionally has brought together some of the industry’s greatest luminaries and artists, all there either to promote their own work or support the work of others. This year was no different, with Oscar-winning and -nominated VFX supervisors such as Scott Farrar, Paul Franklin and Richard Stammers in attendance.
The running order was selected partly by lot, as has been the tradition, and partly by the technical requirements of the films being screened. This put the stereoscopic-3D movies at the front of the procession, starting with Peter Jackson’s The Hobbit: The Battle of the Five Armies, while the evening closed with a (well-used) 70mm print of Christopher Nolan’s Interstellar.
The kitchen scene with Quicksilver in Bryan Singer’s X-Men: Days of Future Past earned huge applause from the crowd, and a chiding from the event MC, who asked that all reactions be held until the appropriate breaks in order to not unduly influence the voting members.
And until the nominations are announced on Thursday morning -- in a live two-part ceremony featuring J.J. Abrams and Alfonso Cuarón, natch -- 10 films remain in the running in the Visual Effects category for the 87th Oscars.
Along with clip reels, trailers and behind-the-scenes featurettes detailing the creation of some of 2014’s most spectacular imagery, here are the 10 shortlisted films, in the order in which they were presented:
The Hobbit: The Battle of the Five Armies
Weta’s Eric Saindon presented the reel for the final installment of The Hobbit franchise, observing that the movie’s 1,836 visual effects shots, comprising the majority of the film, were shot natively in stereoscopic 3D at 48fps. The film’s three major villains, including the dragon Smaug, were completely digital, as were most of the backgrounds and environments, and were animated using a combination of key-frame animation and motion capture.
Earning a laugh, Saindon noted that Smaug was fully key-frame animated, despite what might have been depicted on television by The Colbert Report. He also remarked that the project was the first feature to fully leverage Weta’s new production rendering software, Manuka, which allowed artists to achieve a much higher level of detail without resorting to shortcuts, and allowed complex battle scenes to be rendered in a single pass.
Weta also employed its effects package, Odin, for sequences involving water, smoke and fire, such as the destruction of Laketown, and deployed its new crowd software, Army Manager, to place as many as 60 high-res motion capture characters and thousands of digital background extras within a scene.
Maleficent

Digital Domain’s Carey Villegas presented the reel for Disney’s Sleeping Beauty-inspired tale, Maleficent. Prompting a round of enthusiastic applause, he observed that the film’s director, Robert Stromberg, embraces the use of digital effects and “in fact is one of us.” Maleficent employed a range of CG characters and digital environments, along with “at least a million” matte paintings. Working with a small team that Villegas assembled, Digital Domain and MPC handled the bulk of the film’s effects, assisted by Method Studios and The Senate, with previs and postvis work from The Third Floor, and 3D stereo conversion by Legendary 3D and Prime Focus World.
“Overall, we really just wanted the visual effects to be simple and clean,” Villegas remarked. “We tried to maintain a beautiful look and a classic feel whenever possible.”
The first challenge for DD was to give Maleficent wings. “Her character starts as a young girl, so we had to make sure they worked with her as well as with Angelina later on,” Villegas said, noting that when fully extended her wings spanned 14 feet, and that when closed they rose above her head and shoulders. Other tasks included digital enhancements to Maleficent’s horns and eyes, and even the addition of blood flow inside the cheek implants worn by Angelina Jolie.
MPC created a variety of digital characters for the film, including digi-doubles for the three fairies also played by live-action actors within the film. MPC also created a CG version of Aurora, and the raven Diaval, who turns into a man and then into a fully CG dragon for the film’s climax.
X-Men: Days of Future Past
VFX Supervisor Richard Stammers presented the reel for 20th Century Fox’s X-Men: Days of Future Past. “To complete the 1,300 stereo shots we employed 12 vendors, with MPC and Digital Domain taking the lead,” he began, adding that the production also took a hybrid approach with 3D, shooting in both native stereo and mono, requiring 22 minutes of conversion.
MPC was responsible for the sections of the movie taking place in the future, including the fully-CG future Sentinels and their interactions with the mutant cast, their associated powers, and the environments in which they were placed. Digital Domain was responsible for the 1973 portions of the movie, comprising the 1973 Sentinel and all of the environment work based in Washington D.C., which included the destruction of RFK Stadium and the White House; DD also worked on the Mystique transformations and eyes.
The Pentagon kitchen scene where Quicksilver dispatches the guards under a deluge of fire sprinklers was a complex sequence blending action and humor. “This super slo-mo sequence started with a very detailed previs provided by The Third Floor followed by some extensive Phantom camera testing with the stunt and special effects teams,” Stammers recounted. “We maxed out at 3,200fps at 2K, but it still wasn’t slow enough. Camera movement and continuity demanded that we do the rest of the shots with dry backgrounds, with the team at Rising Sun placing hundreds of cascading CG water droplets, all driven by Quicksilver’s movements, meticulously choreographed for our simulation.”
Transformers: Age of Extinction
ILM Visual Effects Supervisor Scott Farrar had the most humorous presentation of the night, poking fun at director Michael Bay’s penchant for over-the-top explosions while pointing out how far the bar had been raised for the effects for the reboot of Paramount’s toy-based film franchise.
“Transformers: Age of Extinction represented a new beginning for the franchise in virtually every way: a new cast, new locations and new characters created by visual effects teams with the most difficult and complicated work so far in the series,” he began. “What I do love about this film is that we got to be innovative and creative with shots using every on-set tool and postproduction tool available within the motion picture business.”
The film introduced a new type of robot, capable of “hypno-transformations,” that breaks into thousands of pieces and flows to reassemble, driven by a complex fluid sim. Twelve hundred visual effects shots were completed by ILM, Base FX, Atomic Fiction, Whiskeytree and Method Studios.
“This was a movie of eights,” Farrar said. “There were eight new characters, eight months of shooting, and eight major locations, from Iceland to China. Eight acres of downtown Detroit dressed to look like Hong Kong. Eight formats: more than half the movie was shot in native stereo. And eight hours of dailies at the end of each day!”
The dense number of elements in the larger shots was “staggering,” Farrar said, with incredibly complex setups. “Months in the making, and there’d only be one take,” he commented, outlining the recipe for creating the large-scale destruction demanded by the script.
Guardians of the Galaxy
Marvel’s feel-good comedy-adventure delighted summer audiences with a range of effects, including extensive CG environments and two entirely CG characters brought to life with keyframe animation. Method Studios’ Stephane Ceretti, who served as the film’s overall VFX Supervisor, said the movie contained 2,400 shots, with roughly 95 percent of them relying upon visual effects.
“Rocket, a CG talking raccoon, and Groot, a CG walking and talking tree, had to be as real as any of the other live-action characters in the film,” Ceretti said, describing the two characters as the heart and soul of the movie.
Framestore was tasked with building Rocket, and MPC was responsible for building Groot. Given the differences in each facility’s pipeline, the most difficult challenge Guardians presented was blending the two characters together. Accounting for 550 shots of the final cut of the film, MPC and Framestore divided the Rocket and Groot shotwork, sharing assets as needed. Instead of using motion capture, the production decided early on that it would keyframe animate the two CG characters, relying on the artistry of the animators to bring the performances to life.
Space, space ships and space battles were heavily featured throughout the film, which was conceived as a space opera by writer and director James Gunn. “First we had to create a full universe where our story could take place,” Ceretti recounted. “James wanted all the environments to have lots of saturated colors, with distinct contrast between the ugly and the beautiful.”
Godzilla

Visual Effects Supervisor Jim Rygiel presented the reel for Warner Bros.’ remake of Godzilla, directed by Gareth Edwards. Rygiel quipped that he wasn’t sure which would be more difficult, creating a 600-foot monster or working under a former VFX Supervisor. “Some of Gareth’s first words, and his mantra throughout the film, were that he wanted to be able to stop on any frame and make a movie poster,” he said. “Right there I knew we were talking more about the art, and what it was going to look like, as opposed to the technicalities involved in building it.”
Major sequences included the destruction of the Janjira power plant, a tsunami, and the destruction of San Francisco’s Golden Gate Bridge. Originally filmed in a parking lot in Vancouver using only cars, the bridge sequence featured CG set extensions and the addition of hundreds of digital cars created by Guillaume Rocheron and MPC.
Rygiel described the Halo Jump sequence, a mix of practical effects and CG, as his favorite. To create the graphic, signature look of the sequence, skydivers equipped with helmet-mounted cameras were roto-ed out, the entire environment was replaced, and red smoke trails were added. Cloudscapes were created using multiple layers of clouds, and the smoke trails were formed using fluid simulations. A CG San Francisco and numerous digital soldiers helped finish the shot.
Captain America: The Winter Soldier
The reel for Marvel’s Captain America sequel, directed by Anthony and Joe Russo, was presented by VFX Supervisor Dan DeLeeuw. Captain America: The Winter Soldier features 2,500 VFX shots, with roughly 900 of them handled by Industrial Light & Magic. Other VFX houses that worked on the film include Scanline, Trixster, Rise and Luma.
The effects created for The Winter Soldier comprised numerous digital doubles -- including close-up shots of lead characters Falcon and Black Widow -- CG environments, CG helicarriers built from the ground up, and digital re-creations of Washington D.C., where much of the film’s action takes place. For the Quinjet scene on the Theodore Roosevelt Bridge, live-action sequences were filmed in Cleveland and then composited into an entirely CG version of the bridge created using textures from high-res reference photography.
“When I first met the Russos, it was a conversation not so much about superhero films; it was more about 1970s thrillers,” Deleeuw recounted. “We wanted to approach the film with more of a grounded feel. We had a common language between films like Three Days of the Condor, Marathon Man and The French Connection, and we wanted to apply that practical, grounded style to our film.”
The emotional scene toward the end of the film featuring a 92-year-old Agent Peggy Carter, played by (a very young) Hayley Atwell, also presented a major challenge. The effect was achieved by projection mapping the skin of an elderly woman onto Atwell’s performance, with digital artists lining up facial features such as the lips and cheeks.
Dawn of the Planet of the Apes
Weta Digital VFX Supervisor Dan Lemmon introduced the reel for director Matt Reeves’ Dawn of the Planet of the Apes. The success of the $170 million feature from 20th Century Fox was dependent on believable, emotive characters -- that happened to be entirely computer-generated.
“At its core, Dawn of the Planet of the Apes is a character-driven film,” Lemmon began. “Every major character faces a difficult decision that will determine the survival of their kind. Because many of those characters are apes, the visual effects team and the animators had to create apes that were as believable as the human characters…both their appearance and performance had to be 100 percent realistic.”
To help create believable characters, Weta artists studied ape physiology, going beneath the skin to create physically-based skeletons and muscle systems. Weta also developed a new dynamics-based fur system, retooling its pipeline in an effort to achieve more realistic fur.
To help get the best performances from the apes and the humans in the movie, Weta developed an on-set performance capture system and shooting methodology where the actors could work together on set regardless of whether they were playing apes or humans.
“Because the apes in this movie talked, we needed to create a lot of face shapes that apes wouldn’t normally make, and pushing that muzzle around in so many different shapes meant that the skin needed to bunch up and then release and pull out in a convincing way that didn’t lose volume or look rubbery,” Lemmon related. “Some of the most sophisticated skin animation happened around the eyes. We rebuilt the eyelids and skin around the eyes for all the major characters to include more detail and also to incorporate some of the actors’ likenesses into the apes’ eyes and brows.”
Night at the Museum: Secret of the Tomb
Visual effects supervisor Eric Nash presented the reel for the only other comedy feature to be shortlisted for the VFX Oscar, 20th Century Fox’s Night at the Museum: Secret of the Tomb, directed by Shawn Levy.
“Our biggest challenge on Night 3, but also what made this project so satisfying, was dealing with the tremendous range of types of visual effects,” Nash commented. At the top of the list was the multitude of creatures the VFX team had to bring to life. Most of the established creatures returned for the third installment of the franchise, including the T-Rex skeleton and capuchin monkey, but the team also had to create constellation characters that come to life in the planetarium wing during the first-act gala.
The bulk of the film takes place in the British Museum, which features its own dinosaur skeleton, a Triceratops. “As with the T-Rex, our animators gave the Triceratops its own distinct personality without any facial capability,” Nash said. Dozens of unique sculpted creatures made from innumerable different materials are brought to life for the film, including an enormous bronze statue of Xiangliu, a nine-headed serpent whose scales don’t warp or stretch.
In addition to numerous CG set extensions, a handful of fully-CG environments were created for the film. “The miniature characters, Jed and Octavius, needed a heating duct environment that was a dead match for the practical ductwork used for shooting Dexter the monkey,” Nash recounted. “Also for Jed and Octavius, we built a CG diorama complete with erupting Mt. Vesuvius and threatening lava flows. But because it was at their scale, it had to be modeled, textured and lit like a miniature.”
The VFX team chose to treat all the Jed and Octavius scenes differently for Secret of the Tomb than it had for previous installments: “Rather than going the macro photography route, we opted to treat their scenes as if shot by a camera crew of their scale with the corresponding depth-of-field,” Nash explained, adding that the look was achieved in practical environments with a technique called focus-stacking, which allows the focal plane to be adjusted during compositing in order to better integrate the characters into the scene.
Interstellar

After the sharp digital clarity of the preceding nine films, the reel for Christopher Nolan’s space epic felt almost comfortably worn. VFX supervisor Paul Franklin, a longtime collaborator of Nolan’s, made the presentation, commenting that the director wanted to do the film in as few shots as possible.
“But fortunately for all of us here tonight, the giant science fiction show about space ships, black holes and wise-cracking robots actually turned out to be quite a major coordinated effort between both visual and special effects,” he gently teased, adding that the physical reality of location shoots, practical sets, and in-camera techniques were at the heart of the process.
Special effects teams in Alberta created a swirling dustbowl for the storm sequence, while in Iceland a full-sized model of the spacecraft was lowered by crane onto the ocean surface. In California, the production built huge sets of spaceship interiors mounted on special effects gimbals “so large that they actually resembled sections of a major road bridge more than anything else,” Franklin said.
VFX extended the location photography to add mountainous waves of water on the water planet, and Icelandic glaciers were built out to create the harsh, frozen landscape of Mann’s planet, Franklin said. “Mother Nature also helped us out a little bit with a 100 mile-an-hour wind storm, which blew us off the ice and stripped paint from our cars and asphalt from the roads, and if it looks tough in the movie that’s because it actually was,” he added.
Other effects were added in-camera, setting aside all thoughts of green screens with the use of digital front projections of space, planets and other visuals. “The Earth’s environment was a very big hit with our cast, who didn’t just have to imagine the supermassive black hole and spacecraft -- they could actually see it -- but it did mean that VFX had to ramp up very early in preproduction in order to get content ready for filming,” Franklin said. Twin projectors were converged onto the same area of screen to get enough exposure, and were mounted on a 100-foot crane the team could move around to project onto the 300-foot screen they had built on stage outside the main set of the Endurance’s interior.
“As I explained to Chris, it usually takes a couple of days to get a projector adjusted and online, and after listening patiently he cut that to 20 minutes between setups, which amazingly the team actually managed to do,” Franklin said, adding that the effort was worth it, with many shots requiring minimal postproduction and 170 shots captured entirely in-camera. “Do you remember when people used to build model space ships and they looked really good?” Franklin asked. “Well, as it turns out, they still look really good.”
Oscar nominations will be announced on Thursday, January 15 at 5:30 a.m. Pacific at the Academy's Samuel Goldwyn Theater.
The 87th annual Academy Awards ceremony will be held on Sunday, February 22, 2015, at the Dolby Theatre at Hollywood & Highland Center in Hollywood.
Jennifer Wolfe is AWN’s Director of News & Content