2012: The End of the World as We Know It
What was also essential in setting up shop at Sony Pictures was creating a digital workflow, since the production used a Panavision Genesis camera and needed a secure infrastructure for handling the myriad shots. "What we did was build a 400-terabyte server that would immediately (via a DVS Clipster) ingest any footage that was shot and converted to dpx files," explains Weigert. The system served the Uncharted crew and was also crucial in tracking footage coming in from the other vendors. To that end, Uncharted created a tool for the vendors to ensure that naming conventions were consistent and that the correct footage was received and tracked.
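The core of such a vendor-tracking tool is simple: parse each incoming filename against the agreed convention, group valid frames by shot, and flag anything misnamed for follow-up. A minimal sketch in Python, where the pattern and its fields (sequence, shot, version, frame) are hypothetical assumptions rather than the production's actual convention:

```python
import re

# Hypothetical naming convention: SEQ_SHOT_vNNN.FRAME.dpx
# (the real production's pattern is not documented in the article)
SHOT_NAME = re.compile(
    r"^(?P<seq>[A-Z]{2,4})_(?P<shot>\d{4})_(?P<version>v\d{3})\.(?P<frame>\d{4,7})\.dpx$"
)

def validate(filename):
    """Return the parsed name fields if the filename matches, else None."""
    m = SHOT_NAME.match(filename)
    return m.groupdict() if m else None

def track(filenames):
    """Group valid frames by (seq, shot, version); collect misnamed files."""
    shots, rejected = {}, []
    for name in filenames:
        fields = validate(name)
        if fields is None:
            rejected.append(name)
        else:
            key = (fields["seq"], fields["shot"], fields["version"])
            shots.setdefault(key, []).append(int(fields["frame"]))
    return shots, rejected
```

Running `track` over a delivery directory immediately shows which shots arrived with complete frame ranges and which files need to be bounced back to the vendor for renaming.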
"The unique thing about this production was that we not only had 100% realistic digital environments but also that they had to break 100% realistically as well," Weigert offers. "It's a huge challenge already to make sure that your textures, your shading and lighting and everything is correct so that it looks real. But then to make that react to a 10.5 earthquake makes it literally 10 times more difficult to do, because you have to build everything in multiple layers."
Engel says that Scanline was able to deliver a completely new photoreal version of each shot within a couple of days. "That was something they promised in the beginning and actually delivered. So for Roland, it was a great experience in regard to this interactivity. He didn't have to totally guess what it might look like in the end and had something to look at that had white water and just looked great."
"We allowed Roland to direct his waves and he was very excited about that," recounts Stephan Trojansky of Scanline. "We had to deliver 103 shots within three months, and we split them up 80/20 between the LA facility in Marina del Rey and the German facility."
With its acclaimed Flowline software, Scanline had learned on The Chronicles of Narnia: Prince Caspian how to control water during the River God sequence, where a polygonal rig controlled by a character animator drives the simulation, so the result feels like water yet is keyframed. "This time, it's not fantasy and can all be real-world simulation," Trojansky suggests. "Just let the simulator do the work. But very quickly we realized that in Roland's world, making such a movie with such gigantic dimensions, you couldn't follow nature. Otherwise, a shot would last 30 seconds that needs to last three seconds in a movie. So we applied our River God animation technique to what the tidal waves are doing. This time the character rig is the actual wave, so the simulator more directly controls the timing through that rig. This was very critical in being able to interact with Roland, because there were a lot of iterations of shots where they would change the frame rate."
Digital Domain, under the supervision of Mohen Leo, worked on two sequences: the second part of the earthquake and Washington D.C. after it has been hit by a cloud of volcanic ash.