
'2012': The End of the World as We Know It

Uncharted Territory and other key VFX companies tell us how they created the cataclysmic destruction in Roland Emmerich's 2012.

Check out the trailer and clip from 2012 at AWNtv!

Uncharted Territory not only served as a vfx production company/hub, but also handled the most complex earthquake destruction. All images courtesy of Sony Pictures.

Roland Emmerich's 2012 has proven to be a monster hit, racking up $225 million worldwide in its first weekend (including $65 million domestically). That's in large part due to the spectacular vfx turned in by Uncharted Territory and more than a dozen other companies, including Scanline VFX, Digital Domain, Double Negative, Sony Pictures Imageworks, Pixomondo, Hydraulx, Crazy Horse Effects, Evil Eye Pictures and Gradient Effects.

However, when Uncharted's Volker Engel and Marc Weigert wow the Academy and VES with their bakeoff presentations, their unique hook for overseeing 1,315 shots of CG mayhem will be the complex nature of the integration: Not only is there "the shake, break and tumble" of a gigantic earthquake, but also the secondary effects involving breaking glass, sparks, smoke, dust and debris. And then there's the tsunami to end all tsunamis and the pyroclastic ash clouds and lava bombs…

But perhaps the most significant part of 2012 is Uncharted's game-changing role as a new "production shingle" model.

Indeed, as Jenny Fulle pointed out earlier this spring, Uncharted's new model may be a driving force for the future of this industry: "hiring key talent, setting up space, hiring a core crew and then outsourcing the majority of the work" and acting as a hub.

Of course, it helped that Engel and Weigert were co-producers as well. "It's both an advantage and disadvantage," Engel suggests. "The one disadvantage is that you have to recruit new people every time. But on 2012, it was a big casting process -- we interviewed about 300 artists -- and most of it I did myself, talking to each one of them and then handpicking the best.

Some earthquake previs was also handled by Uncharted Territory.

"And then the rest are only advantages. You hire the exact people you want to hire, which has a huge impact on the budget. Believe me: Sony was reluctant in the beginning, having also Imageworks as a company. But it worked out great because we were able to give Imageworks 154 shots at the end for a big chunk of the third act."

However, Uncharted didn't want to give everything away and managed to act as the lead vendor, working on 433 shots: "So we set up a shop with 100 artists and we did the single most complex sequence in the whole movie: the first half of the Los Angeles earthquake (the second half was done by Digital Domain)," Engel continues. "But actually being right in the middle of the destruction, with the street ripping open and all the buildings falling to the left and right of them, is something we wanted to do in-house because we had no idea how to do it when we first sat together."

What was also essential in setting up shop at Sony Pictures was creating a digital workflow, since they used a Panavision Genesis camera and needed a secure infrastructure for handling the myriad shots. "What we did was build a 400-terabyte server that would immediately (via a DVS Clipster) ingest any footage that was shot and convert it to DPX files," explains Weigert. The system was used by the Uncharted crew and was also crucial in tracking footage going out to and coming back from the other vendors. To that end, Uncharted built a tool for the vendors to make sure the naming conventions were consistent and the correct footage was received and tracked.
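
Weigert doesn't detail the tool's internals, but the idea is straightforward: every incoming frame is checked against the show's naming convention and frame range before it enters the pipeline. As a rough sketch only, here is what such a delivery check might look like in Python, with a made-up naming pattern standing in for whatever convention the production actually used:

import re
from pathlib import Path

# Hypothetical convention: <seq>_<shot>_<element>_v<version>.<frame>.dpx
# e.g. "LA_0410_plate_v003.1047.dpx" -- the show's real convention isn't documented here.
NAME_RE = re.compile(
    r"^(?P<seq>[A-Z]{2,4})_(?P<shot>\d{4})_(?P<element>[a-z]+)"
    r"_v(?P<version>\d{3})\.(?P<frame>\d{4,7})\.dpx$"
)

def validate_delivery(folder: str) -> list[str]:
    """Return a list of problems found in a vendor delivery folder."""
    problems, frames = [], []
    for f in sorted(Path(folder).glob("*.dpx")):
        m = NAME_RE.match(f.name)
        if not m:
            problems.append(f"bad name: {f.name}")
            continue
        frames.append(int(m.group("frame")))
    if frames:
        # Flag any holes in the delivered frame range.
        missing = sorted(set(range(min(frames), max(frames) + 1)) - set(frames))
        if missing:
            problems.append(f"missing frames: {missing}")
    return problems

if __name__ == "__main__":
    for issue in validate_delivery("/deliveries/vendorX/LA_0410"):
        print(issue)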

"The unique thing about this production was that we not only had 100% realistic digital environments but also that they had to break 100% realistically as well," Weigert offers. It's a huge challenge already to make sure that your textures, your shading and lighting and everything is correct so that it looks real. But then to make that react to a 10.5 earthquake makes it literally 10 times more difficult to do because you have to build everything in multiple layers."

Scanline was able to provide quick turnaround as well as more directable water using the wave as a character rig.

Engel says that Scanline was able to deliver a completely new photoreal version of each shot within a couple of days. "That was something they promised in the beginning and actually delivered. So for Roland, it was a great experience in regard to this interactivity. He didn't have to totally guess what it might look like in the end and had something to look at that had white water and just looked great."

"We allowed Roland to direct his waves and he was very excited about that," recounts Stephan Trojansky of Scanline. We had to deliver 103 shots within three months and we split them up 80/20 between the LA facility in Marina del Rey and the German facility."

With its acclaimed FlowLine software, Scanline learned how to control water on The Chronicles of Narnia: Prince Caspian during the River God sequence, in which a polygonal rig controlled by a character animator drives the simulation so that it feels like water but is keyframed. "This time, it's not fantasy and can all be real-world simulation," Trojansky suggests. "Just let the simulator do the work. But very quickly we realized that in Roland's world, making a movie of such gigantic dimensions, you couldn't follow nature. Otherwise, a shot that needs to last three seconds in the movie would last 30 seconds. So we applied our technique from the River God animation to what the tidal waves are doing. This time the character rig is the actual wave, so the timing can be controlled more directly through that rig. This was very critical in being able to interact with Roland, because there were a lot of iterations of shots where they would change the frame rate."
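
FlowLine is proprietary and Trojansky doesn't describe its internals, but the principle he outlines -- a keyframed rig that the simulator follows, so the artist rather than physics owns the timing -- can be sketched generically. The toy Python/NumPy example below (not Scanline's code) blends free-falling particle velocities toward targets sampled from an animated rig, with a single guide_strength dial deciding how strictly the water obeys the animation:

import numpy as np

def guided_step(pos, vel, guide_target, guide_strength, dt,
                gravity=np.array([0.0, -9.81, 0.0])):
    """One integration step in which particle velocities are blended toward a
    keyframed guide target, so the artist's timing overrides natural timing.

    pos, vel: (N, 3) particle positions/velocities
    guide_target: (N, 3) positions sampled from the animated wave rig this frame
    guide_strength: 0.0 = pure simulation, 1.0 = follow the rig exactly
    """
    vel = vel + gravity * dt                 # free simulation
    rig_vel = (guide_target - pos) / dt      # velocity needed to reach the rig
    vel = (1.0 - guide_strength) * vel + guide_strength * rig_vel
    return pos + vel * dt, vel

# Toy usage: 1000 particles pulled toward a wave front advancing at an art-directed speed.
pos = np.random.rand(1000, 3) * 10.0
vel = np.zeros_like(pos)
for frame in range(1, 73):                   # roughly three seconds at 24 fps
    target = pos + np.array([0.5, 0.0, 0.0]) # stand-in for the keyframed rig sample
    pos, vel = guided_step(pos, vel, target, guide_strength=0.4, dt=1.0 / 24.0)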

Digital Domain, under the supervision of Mohen Leo, worked on two sequences: the second part of the earthquake and Washington D.C. after it has been hit by a cloud of volcanic ash.

Digital Domain built its rigid body dynamics tools around a core engine called Bullet to create its earthquake mayhem.

"In order to show how massive these disasters are, many of the shots in the Los Angeles airplane escape sequence showed huge office buildings and entire city blocks collapsing and falling into the crack that opens up right under the city," Leo explains. "That meant that these were all large environment shots, and order to make them feel believable and give them scale, there had to be an enormous amount of detail in them.

"To make a collapsing city block feel realistic, you need more than just buildings. You need trees, cars, people, lampposts, traffic lights, fire hydrants and hundreds of other details. Each one of them had to be simulated to shake, break or tumble. And it doesn't end there. Each one of those objects would need secondary effects. A tumbling car needs breaking windows with glass shards, sparks, smoke, dust and debris.

"Another thing was clear from the start: all of this destruction had to be created entirely in CG. There was no off-the-shelf rigid body solver in any software package that could do what we needed to do, so we knew that we had to build our own system. We decided to build our rigid body dynamics tools around a core engine called 'Bullet,' which is an open source project and a very simple but fast and stable engine. Our software team then built a new rigid body simulation system around Bullet, adding our own techniques for shattering objects, creating constraints between the pieces, adding material properties (so some parts could behave like wood, some like glass, others like concrete and so on), and then running the simulations. We called this new rigid dynamics tool 'Drop.'

"Drop also gave the artists intuitive controls to weaken constraints in areas where they wanted major breaks to occur. This allowed them to choreograph simulations, so they could determine where a building should break, how large or small the sections would be, in which direction it should fall, etc. Finally, another huge advantage of drop was that it was extremely fast and stable. It allowed us to run simulations with thousands, sometimes tens of thousands of colliding pieces of geometry in an hour or two, so artists could get several takes out in one day and try variations until they found the best setup."

Double Negative's most complex work (led by Alex Wuttke) involved pyroclastic ash clouds and lava bombs at Yellowstone. The solution was to use pre-simulated libraries of pyroclastic ash-cloud elements, which were then dressed into the shot. The individual simulated elements were generated by Double Negative's fluid dynamics expert, Jason Harris, using Squirt, Dneg's in-house fluids solver. After much testing, he produced the ideal conditions within the simulator under which the team could produce a variety of pyroclastic flows in different shapes and sizes.
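
The pipeline specifics aren't public, but a library-of-elements approach generally comes down to bookkeeping: pre-simulated caches are catalogued with their native sizes, then instanced, scaled and rotated onto art-directed placement points in each shot. A minimal illustration in Python, with hypothetical cache names, might look like this:

import random
from dataclasses import dataclass

@dataclass
class AshElement:
    cache: str          # path to a pre-simulated volume cache (hypothetical names)
    base_scale: float   # native size of the simulated element in meters

# A small library of pre-simulated pyroclastic elements in varied shapes and sizes.
LIBRARY = [
    AshElement("ash_plume_tall_v02.vdb", 120.0),
    AshElement("ash_roll_wide_v05.vdb", 300.0),
    AshElement("ash_burst_small_v01.vdb", 40.0),
]

def dress_shot(placements, seed=0):
    """Pick library elements for each art-directed placement point and return
    instancing records (cache, position, scale, rotation) for the renderer."""
    rng = random.Random(seed)
    instances = []
    for pos, desired_size in placements:
        elem = rng.choice(LIBRARY)
        instances.append({
            "cache": elem.cache,
            "position": pos,
            "scale": desired_size / elem.base_scale,
            "rotate_y": rng.uniform(0.0, 360.0),   # random spin hides repetition
        })
    return instances

print(dress_shot([((0, 0, 0), 250.0), ((500, 0, 120), 90.0)]))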

Double Negative tackled pyroclastic ash in different shapes and sizes.

Emmerich's brief for the lava bombs was disarmingly simple: they should vary in size from that of a football to that of a four-story house!

As a result, a lightweight rig was built in Maya for placement and timing approval of the bombs. Once approved, the data from these rigs was dispatched to Houdini, where all the rigid and soft body simulations of lava bombs impacting turf, soil and bedrock layers, along with all their interactions, were solved. Artists in Houdini would look at the trajectory of each bomb, calculate the point of impact on the ground, and then run several simulations of lava bombs breaking through the turf layer and into bedrock. In turn, this bedrock would break apart, rip back through the turf layer and throw up a concoction of dust, smoke and lava expelled from the impact. This was then parceled up and sent back into Maya for rendering, using Double Negative's proprietary DNB and RenderMan.
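
The article doesn't spell out how the trajectory and impact-point step was implemented, but under simple ballistic assumptions it reduces to finding where each bomb's parabolic path meets the ground, which is what tells the artists where to run the turf-and-bedrock impact simulations. A small stand-alone sketch (not Double Negative's code):

import math

def impact_point(p0, v0, ground_z=0.0, g=9.81):
    """Ballistic impact of a lava bomb launched from p0 = (x, y, z) in meters
    with velocity v0 in m/s, onto a flat ground plane at height ground_z.
    Returns (impact position, time of flight, impact speed)."""
    x0, y0, z0 = p0
    vx, vy, vz = v0
    # Solve z0 + vz*t - 0.5*g*t^2 = ground_z for the positive root.
    a, b, c = -0.5 * g, vz, z0 - ground_z
    t = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
    hit = (x0 + vx * t, y0 + vy * t, ground_z)
    speed = math.sqrt(vx ** 2 + vy ** 2 + (vz - g * t) ** 2)
    return hit, t, speed

# e.g. a large bomb lobbed from high on the caldera rim
hit, t, speed = impact_point(p0=(0.0, 0.0, 800.0), v0=(60.0, 10.0, 40.0))
print(f"impact at {hit} after {t:.1f} s, arriving at {speed:.0f} m/s")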

Meanwhile, Sony Imageworks was primarily responsible for the interior of the station inside the Himalayan Mountains, along with modeling and texturing the Arks where the lucky survivors get to start a new civilization.

Sony Imageworks focused on the Arks, utilizing the ray tracing richness provided by the Arnold renderer.

"We modeled everything in Maya and texture painted and passed on huge Photoshop files," says Peter Nofz, Sony Imageworks' vfx supervisor. "The biggest challenge was to actually fit all that geometry that was built almost into memory because it was quite a bit, and we tried to use matte paintings wherever we could to get our poly count down. We couldn't use too much proprietary tools because we had to pass our stuff on. Then we did use Arnold, our in-house ray tracer, as our renderer. And we threw more geometry at Arnold than ever before, so we had quite extensive render times. But at the same time, it allowed us to do all these little incandescent lights and an overall richness."

And what has been the impact of 2012 on Emmerich? According to Engel & Weigert, the director has a better understanding of how vfx can be art directed and controlled in a much more interactive way. This will certainly help in his next project: Soul of the Age, which tackles the controversial authorship of William Shakespeare's plays by suggesting that Edward de Vere, the 17th Earl of Oxford, was the true author. This may be a more intimate film with a lot less vfx, but Engel & Weigert are already hard at work trying to figure out how to virtually recreate 16th century London in Berlin. But that's another story.

Bill Desowitz is senior editor of AWN & VFXWorld.


Bill Desowitz, former editor of VFXWorld, is currently the Crafts Editor of IndieWire.