Robert Zemeckis and Kevin Baillie talk about the VFX challenges of the director's return to live action.
Director Robert Zemeckis winces when you ask him about his return to live action with Flight after working in performance capture for the past decade. It's all about "truth and spectacle," as François Truffaut would say.
But it's the harrowing plane crash opening the riveting story of an alcoholic pilot (Denzel Washington) forced to confront his inner demons that required VFX enhancement. In fact, the shot count escalated from 130 to 400 when Zemeckis raised the jeopardy of this white-knuckler.
" It's like everything I'd been doing the last 10 years was the perfect setup for doing Flight in every discipline," suggests Zemeckis. "I think it's a giant digital stew and it's just gonna end up being moving images and nobody's gonna care where they came from and how they were created. It's all going to go back to storytelling. It's a live-action component where you create the image using a lens coupled with another virtual component where it's all completely done in the computer; portions of practical sets with digitally painted augmentation; and painted on lighting."
Indeed, that's the perfect description of Flight: a combination of old school and new school techniques, which is why Zemeckis entrusted the VFX to an old ally from his ImageMovers Digital days: Kevin Baillie, co-founder of Atomic Fiction, which was able to handle the work in only four months, thanks to cloud rendering. "It's the New World Order," Zemeckis proclaims. "That to me is the most exciting news, which very few people know about, because they seem to be obsessed by trying to keep everything in these boxes."
Atomic Fiction offers a lower-cost business model utilizing cloud computing and other measures. Atomic has been working with a company called ZYNC to utilize Amazon's EC2 cloud services. By moving rendering to the cloud instead of owning the computers, they treat rendering like a utility and only pay for what they use. This means that rendering can be scaled up to as many cores as a particular job needs, then back down to the Macs on the artists' desks between gigs.
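The economics behind that "rendering as a utility" idea can be sketched in a few lines. A minimal illustration follows; every number here is a hypothetical placeholder, not Atomic Fiction's or Amazon's actual pricing:

```python
# Illustrative sketch of the "pay only for what you use" rendering model.
# All figures are invented for the example, not real studio or EC2 costs.

def owned_farm_cost(cores: int, monthly_cost_per_core: float, months: int) -> float:
    """Fixed model: you pay for every core, every month, busy or idle."""
    return cores * monthly_cost_per_core * months

def cloud_cost(core_hours_used: float, price_per_core_hour: float) -> float:
    """Utility model: you pay only for the core-hours actually consumed."""
    return core_hours_used * price_per_core_hour

# A four-month show with bursty rendering: heavy near deadlines, idle between.
months = 4
peak_cores = 2000          # hypothetical burst capacity needed near delivery
core_hours_used = 600000   # hypothetical total core-hours actually rendered

fixed = owned_farm_cost(peak_cores, monthly_cost_per_core=50.0, months=months)
metered = cloud_cost(core_hours_used, price_per_core_hour=0.10)

print(f"owned farm sized for peak: ${fixed:,.0f}")
print(f"metered cloud rendering:   ${metered:,.0f}")
```

The gap between the two totals grows with how bursty the workload is, which is exactly why a four-month show with deadline spikes favors the metered model.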
"One thing I was really concerned with was that our numbers were predicated on being able to render all this stuff in the cloud, and I knew that with certain studios it was still a security taboo," recounts Baillie, who served as production VFX supervisor. "So I went prepared for my first meeting with Stephanie Allen, the visual effects executive at Paramount, armed with all this information, and , to my surprise, she was really excited about how much money it was going to save us. It's that kind of visionary and forward-thinking approach that helped make the visual effects for this film possible."
The initial challenge was shooting aerial plates in a helicopter on the first day of production without the benefit of the cockpit action to gauge against. "But we had to get the plates covered and we only had one day, which then turned into half a day because of bad weather," Baillie continues. "It was a real cart-before-the-horse situation. So DP Don Burgess and the helicopter mount vendor came up with the idea of mounting three synchronized Red Epic cameras, each with a 14mm lens and a 10-degree overlap. It gave us a 240-degree, movable panorama. So with one pass we could accomplish anything Bob threw at us down the line when he shot the cockpit."
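The numbers Baillie quotes hang together geometrically. A quick back-of-the-envelope check, assuming the 240-degree figure counts each 10-degree overlap once (and, as an assumption on our part, a roughly 27.7mm-wide Epic sensor for the lens cross-check):

```python
import math

cameras = 3
overlap_deg = 10.0
panorama_deg = 240.0

# Solve panorama = cameras * hfov - (cameras - 1) * overlap for the
# per-camera horizontal field of view implied by the quoted rig.
hfov_geo = (panorama_deg + (cameras - 1) * overlap_deg) / cameras

# Cross-check against the 14mm lens: hfov = 2 * atan(sensor_width / (2 * f)).
# The ~27.7mm sensor width is an assumption, not a figure from the article.
sensor_width_mm = 27.7
focal_mm = 14.0
hfov_lens = 2.0 * math.degrees(math.atan(sensor_width_mm / (2.0 * focal_mm)))

print(f"implied per-camera FOV from the rig: {hfov_geo:.1f} degrees")
print(f"FOV of a 14mm lens on that sensor:   {hfov_lens:.1f} degrees")
```

The two figures land within a few degrees of each other, which is consistent with the rig delivering the 240-degree panorama described.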
However, because they weren't able to get the full plate coverage, around 30% of the exterior crash sequence required extensive CG work. In addition, full CG cloud cover was required during the storm. "This points to one of our favorite tricks in conveying a final, cohesive product," Baillie adds, "layering matte painting with plate footage with CG clouds and dirt and fingerprints on the airplane window. Because you're not using just one technique, it keeps the audience guessing. I often equate that with a magician mixing up his act."
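At the pixel level, the layering Baillie describes comes down to repeated "over" compositing of premultiplied layers, back to front. A minimal sketch of that operation (the layer values below are invented placeholders, not production data):

```python
import numpy as np

def over(fg_rgb, fg_a, bg_rgb, bg_a):
    """Porter-Duff 'over' for premultiplied-alpha layers."""
    out_a = fg_a + bg_a * (1.0 - fg_a)
    out_rgb = fg_rgb + bg_rgb * (1.0 - fg_a)
    return out_rgb, out_a

# Back-to-front stack: matte-painted sky, plate footage, dirt on the window.
h = w = 2  # a tiny frame, just for illustration
sky_rgb, sky_a = np.full((h, w, 3), 0.8), np.ones((h, w, 1))
plate_rgb = np.full((h, w, 3), 0.3)
plate_a = np.full((h, w, 1), 0.6)  # plate premultiplied at 60% coverage
dirt_rgb, dirt_a = np.full((h, w, 3), 0.05), np.full((h, w, 1), 0.1)

rgb, a = over(plate_rgb, plate_a, sky_rgb, sky_a)  # plate over painted sky
rgb, a = over(dirt_rgb, dirt_a, rgb, a)            # dirt speckle layer on top
print(rgb[0, 0], a[0, 0])
```

Each additional layer (clouds, fingerprints, lens grime) is just another pass through the same operator, which is what lets disparate sources blend into one cohesive frame.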
Zemeckis's increased shot count demanded great flexibility during the second editorial turnover. The director would smile and say he was increasing the scope, coming up with great ideas such as the shots from outside the airplane showing the landing gear breaking away, the fuel being dumped and the fire in the engine being put out.
Aside from the storm, there was also the feat of the plane flying upside down. "Don Burgess had lights moving around the cockpit as the plane is moving through space," the VFX supervisor relates. "It really helped to sell it. Mike Lantieri, the special effects supervisor, put together several different rigs to achieve the most sickeningly realistic effect of the plane being in a dive. This shaker rig jostled the plane around, which lent a sense of realism. And for the cockpit, there were two other rigs: a motion base that could pitch the plane 40 degrees in any direction, and a rotisserie rig, which rotated the cockpit and fuselage separately 180 degrees upside down. So everybody's actually hanging upside down during these shots."
As for the exterior shots of the plane turning upside down, Atomic used Maya and V-Ray, along with 3ds Max and V-Ray for the matte work. The fire was done with FumeFX mixed with other particle chunks for the frozen water and CO2. They also put oil spots on the lens for added grit. "That was part of a creative directive from Bob, who wanted a realistic movie, not one where the visual effects are a spectacle. Even the shot where the airplane crashes onto the field contains little in the way of fireballs or flares. There's just enough to sell that the fumes caught fire, but they're gone very quickly. Most of it was just clouds of dirt and chunks of rock and smoke from the engines.
"Because of budget limitations, Mike Lantieri could only build a rig powerful enough to safely turn half of the fuselage of this plane upside down. This was a problem so we cut the fuselage in half and shot it in two passes. One where the camera was inside the back (we used that as a foreground plate), and another one where we had to back up the camera 40 feet and then shoot into the open end of a tin can for the distant part of the fuselage.
"And we had to seam the two in post without any motion control. We used some real time onset compositing to make sure the elements were close enough, but our compositors had to do a lot of work in Nuke with retimes and warps and tracking and stabilization to get everything to fit together. But at the end of the day, you never would've known that this plane was shot in three pieces during this crazy, rotating move. It's a great example of how special effects and visual effects work together."
Bill Desowitz is former senior editor of AWN and VFXWorld, the owner of Immersed in Movies (www.billdesowitz.com), a columnist for Thompson on Hollywood at Indiewire and author of James Bond Unmasked (www.jamesbondunmasked.com), which chronicles the 50-year evolution of 007 on screen, featuring interviews with all six actors.