Atomic Fiction Takes Flight
Director Robert Zemeckis winces when you ask him about his return to live action with Flight after working in performance capture for the past decade. It's all about "truth and spectacle," as François Truffaut would say.
But it's the harrowing plane crash opening the riveting story of an alcoholic pilot (Denzel Washington) forced to confront his inner demons that required VFX enhancement. In fact, the shot count escalated from 130 to 400 as Zemeckis raised the jeopardy of this white-knuckler.
"It's like everything I'd been doing the last 10 years was the perfect setup for doing Flight in every discipline," suggests Zemeckis. "I think it's a giant digital stew and it's just gonna end up being moving images and nobody's gonna care where they came from and how they were created. It's all going to go back to storytelling. It's a live-action component where you create the image using a lens coupled with another virtual component where it's all completely done in the computer; portions of practical sets with digitally painted augmentation; and painted on lighting."
Indeed, that's the perfect description of Flight: a combination of old school and new school techniques, which is why Zemeckis entrusted the VFX to an old ally from his ImageMovers Digital days: Kevin Baillie, co-founder of Atomic Fiction, which was able to handle the work in only four months, thanks to cloud rendering. "It's the New World Order," Zemeckis proclaims. "That to me is the most exciting news, which very few people know about, because they seem to be obsessed by trying to keep everything in these boxes."
Atomic Fiction offers a lower-cost business model built around cloud computing and other cost-saving measures. Atomic has been working with a company called ZYNC to tap Amazon's EC2 cloud services. By moving rendering to the cloud instead of owning the computers, the studio treats rendering like a utility and pays only for what it uses. Render capacity can be scaled up to as many cores as a particular job needs, then back down to the Macs on the artists' desks between gigs.
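To make the utility metaphor concrete, here's a minimal sketch of what burst-style render scaling on EC2 can look like, written against Amazon's boto3 Python SDK. The AMI, instance type, and throughput figure are placeholders, and ZYNC's actual service handles this orchestration behind the scenes:

```python
# Minimal sketch: scale EC2 render nodes to the size of the job,
# then back down to zero between gigs. Hypothetical values throughout.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

FRAMES_PER_NODE = 25  # assumed throughput: frames one node can render per run

def scale_up(frames_to_render: int) -> list[str]:
    """Launch just enough render nodes for this job -- pay only for what you use."""
    node_count = max(1, frames_to_render // FRAMES_PER_NODE)
    resp = ec2.run_instances(
        ImageId="ami-0render123",   # placeholder render-node machine image
        InstanceType="c5.4xlarge",  # placeholder compute-optimized type
        MinCount=node_count,
        MaxCount=node_count,
    )
    return [inst["InstanceId"] for inst in resp["Instances"]]

def scale_down(instance_ids: list[str]) -> None:
    """Terminate every node once frames are delivered; billing stops here."""
    ec2.terminate_instances(InstanceIds=instance_ids)

# Usage: burst to ~40 nodes for a 1,000-frame sequence, then back to zero.
nodes = scale_up(frames_to_render=1000)
# ... submit frames, wait for renders to land in storage ...
scale_down(nodes)
```

The design point is that the farm exists only while frames are rendering; between jobs there is nothing to own, power, or amortize.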
"One thing I was really concerned with was that our numbers were predicated on being able to render all this stuff in the cloud, and I knew that with certain studios it was still a security taboo," recounts Baillie, who served as production VFX supervisor. "So I went prepared for my first meeting with Stephanie Allen, the visual effects executive at Paramount, armed with all this information, and , to my surprise, she was really excited about how much money it was going to save us. It's that kind of visionary and forward-thinking approach that helped make the visual effects for this film possible."
The initial challenge was shooting aerial plates from a helicopter on the first day of production, without the benefit of the cockpit action to gauge against. "But we had to get the plates covered and we only had one day, which then turned into half a day because of bad weather," Baillie continues. "It was a real cart-before-the-horse situation. So DP Don Burgess and the helicopter-mount vendor came up with the idea of mounting three synchronized Red Epic cameras, each with a 14mm lens and a 10-degree overlap. It gave us a 240-degree, movable panorama. So with one pass we could accomplish anything Bob threw at us down the line when he shot the cockpit."
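The rig's arithmetic roughly checks out: three camera fields of view, minus the 10 degrees shared at each of the two seams, land near the quoted 240 degrees. A quick sketch of that geometry, where the ~27.7mm Red Epic 5K sensor width is an assumption (the 14mm lenses and 10-degree overlap are from Baillie):

```python
import math

SENSOR_WIDTH_MM = 27.7   # assumed Red Epic 5K sensor width
FOCAL_LENGTH_MM = 14.0   # focal length quoted in the article

# Horizontal field of view of one camera: 2 * atan(half sensor width / focal length)
fov = math.degrees(2 * math.atan(SENSOR_WIDTH_MM / (2 * FOCAL_LENGTH_MM)))

cameras, overlap = 3, 10.0
# Adjacent cameras share `overlap` degrees, so subtract it once per seam.
panorama = cameras * fov - (cameras - 1) * overlap

print(f"per-camera FOV: {fov:.1f} degrees")           # ~89.4
print(f"panorama coverage: {panorama:.1f} degrees")   # ~248, near the quoted 240
```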
However, because they weren't able to get full plate coverage, around 30% of the exterior crash sequence required extensive CG work. In addition, full CG cloud cover was required during the storm. "This points to one of our favorite tricks in conveying a final, cohesive product," Baillie adds, "layering matte paintings with plate footage, CG clouds, and dirt and fingerprints on the airplane window. Because you're not using just one technique, it keeps the audience guessing. I often equate that with a magician mixing up his act."
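The layering Baillie describes boils down to repeated alpha-over compositing, back to front. A toy NumPy sketch, with invented layer names and placeholder alpha values rather than anything from the actual Flight pipeline:

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Standard 'over' operation: foreground covers background where alpha is high."""
    a = fg_alpha[..., None]  # broadcast (h, w) alpha across the 3 color channels
    return fg_rgb * a + bg_rgb * (1.0 - a)

# Hypothetical four-layer stack, back to front, mirroring the mix Baillie describes.
h, w = 1080, 1920
matte_painting = np.zeros((h, w, 3)); matte_alpha = np.ones((h, w))
plate_footage  = np.zeros((h, w, 3)); plate_alpha = np.full((h, w), 0.6)
cg_clouds      = np.zeros((h, w, 3)); cloud_alpha = np.full((h, w), 0.4)
window_grime   = np.zeros((h, w, 3)); grime_alpha = np.full((h, w), 0.15)

frame = over(matte_painting, matte_alpha, np.zeros((h, w, 3)))  # backmost layer
frame = over(plate_footage, plate_alpha, frame)
frame = over(cg_clouds, cloud_alpha, frame)
frame = over(window_grime, grime_alpha, frame)  # dirt and fingerprints on top
```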