
'10,000 BC': To Furtility and Beyond

Alain Bielik tames the prehistoric beasts in 10,000 BC after speaking with the vfx vets at Double Negative and MPC.


Double Negative and MPC created more than 300 vfx shots for 10,000 BC. Double Negative, known for its environments and invisible vfx, was thrilled to delve into creature animation on this one. All images courtesy of Warner Bros.

The last time Visual Effects Supervisor Karen Goulekas collaborated with director Roland Emmerich, her team ended up garnering an Academy Award nomination. The Day After Tomorrow eventually lost to Spider-Man 2, but the hundreds of vfx artists who have been working on 10,000 BC for the past two years certainly hope that Emmerich's new epic movie (opening March 7 from Warner Bros.) will enjoy a similar fate. Once again, the director's demanding vision led the team to push the boundaries of visual effects technology, especially in terms of fur simulation. Double Negative (170 shots) and MPC (150 shots) shared the bulk of the workload. Machine and The Senate Visual Effects provided additional effects, while Nvizage handled previsualization.

At Double Negative, Visual Effects Supervisor Jesper Kjolsrud oversaw a team that included Compositing Supervisor Andy Lockley, CG Supervisors Alex Wuttke and Rodney McFall, and VFX Producers Dominic Sidoli and Stuart McAra. "One of the most exciting things about this project was that it allowed us to delve into creature animation on a very large scale," Wuttke recalls. "We were in charge of creating the saber tooth tiger and the terror birds. Double Negative is mainly known for its work on environments and invisible effects. So, 10,000 BC came as a perfect opportunity to stretch our muscles in the animation realm. Coincidentally, while working on this movie, we were also creating Grawp for Harry Potter and the Order of the Phoenix. Both movies were a turning point for us."

The first step was to build a creature pipeline that could withstand the needs of the project. Maya, augmented with many proprietary tools, formed the basis of the platform, along with RenderMan and Shake. "We invested a lot of time into developing a proprietary fur system from the ground up," Wuttke notes. "We wanted to have the flexibility to groom the fur exactly the way we needed."

Taming the Tiger

Double Negative started by scanning detailed maquettes provided by Patrick Tatopoulos Studios. The design was then adapted to better fit the requirements of the story. The team did some research on actual saber tooth tigers, but found that the creature design deviated noticeably from their true anatomy. "The tiger had to look absolutely photoreal, but it was definitely a fantasy creature," Wuttke comments. "Our saber tooth tiger is actually a cross between a lion and a tiger. Parallel to the fur challenge, the development of the creature required an extensive amount of R&D to be able to simulate the movement of the skin sliding over muscles and bones, a distinctive feature of the big cats. On Grawp, we handled it very much as a rigging problem. On 10,000 BC, we approached it more as a simulation problem. In effect, we used a very high-resolution cloth simulation to get the right roll and slide of the skin over the musculature and skeleton. Our muscle TDs went to great lengths to study the proper physiology and anatomy of real tigers to see how the muscles worked together, what happened when certain muscles fired."
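
To make the skin-sliding idea concrete, here is a minimal Python sketch of treating skin as a cloth-like layer that slides over a rigid body, with soft anchors standing in for the underlying musculature. The one-dimensional setup and all constants are illustrative assumptions, not Double Negative's implementation.

```python
import math

# A toy version of "skin as cloth": a chain of skin points, joined by springs,
# slides tangentially over a rigid body while soft anchors pull each point
# back toward its home position.
N = 32                       # skin vertices along the chain
REST = 2 * math.pi / N       # rest spacing between neighbours (arc length)
K_SPRING = 40.0              # neighbour spring stiffness
K_ANCHOR = 5.0               # soft pull toward each vertex's anchor on the body
DT, DAMP = 0.01, 0.98

theta = [i * REST for i in range(N)]   # sliding coordinate on the body surface
vel = [0.0] * N
anchor = list(theta)                   # where each vertex "belongs"

def step(muscle_bulge):
    """Advance one step; a firing muscle displaces anchors and the skin lags."""
    for i in range(N):
        target = anchor[i] + muscle_bulge * math.sin(anchor[i])
        f = K_ANCHOR * (target - theta[i])
        if i > 0:
            f += K_SPRING * (theta[i - 1] - theta[i] + REST)
        if i < N - 1:
            f += K_SPRING * (theta[i + 1] - theta[i] - REST)
        vel[i] = (vel[i] + f * DT) * DAMP
    for i in range(N):
        theta[i] += vel[i] * DT        # motion stays on the surface: pure slide

for frame in range(100):
    step(muscle_bulge=0.3 * math.sin(frame * 0.1))
print("skin slide at vertex 0:", theta[0] - anchor[0])
```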

Animation Supervisors Rob Hemmings and Eamonn Butler and their team extensively studied tiger behavior to reproduce typical big cat movements on the CG tiger. Rendering was handled procedurally through custom-written RenderMan DSO plug-ins that were tightly integrated with the fur system. The amount of fur being generated in each frame turned out to be enormous. Instead of writing it all out to temporary files, the team wrote out compact follicle files that described the general characteristics of the areas being shaded, including the groom.
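
The follicle-file approach can be illustrated with a small sketch: rather than caching every hair, a compact record per skin patch stores position, density and groom parameters, and the hairs are grown procedurally when needed. All field names below are hypothetical, not Double Negative's format.

```python
import json, random

def write_follicle_file(path, patches):
    """Write one compact record per skin patch instead of per-hair geometry."""
    with open(path, "w") as f:
        json.dump(patches, f)

def grow_hairs(record, seed=0):
    """Expand a follicle record into hair roots and directions at render time."""
    rng = random.Random(seed)
    cx, cy, cz = record["center"]
    for _ in range(record["density"]):
        # Jitter roots inside the patch; the groom gives length and lean.
        root = (cx + rng.uniform(-1, 1) * record["radius"],
                cy + rng.uniform(-1, 1) * record["radius"],
                cz)
        yield root, record["groom_direction"], record["hair_length"]

patches = [{"center": [0.0, 0.0, 0.0], "radius": 0.5, "density": 1000,
            "groom_direction": [0.1, 0.0, 1.0], "hair_length": 0.08}]
write_follicle_file("follicles.json", patches)
with open("follicles.json") as f:
    loaded = json.load(f)
print(len(list(grow_hairs(loaded[0]))), "hairs grown from one compact record")
```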

The team was able to control the level of detail in each shot. "We could preview the fur at low resolution or render it at maximum resolution for tight shots," Wuttke says. "But we noticed that, when the tiger was further away in a shot, the full resolution no longer looked like fur at all. So, we had to adjust the level of detail every time in order to get the right look for the fur. The most challenging shots were part of a scene in which the tiger moves in a watery pit. We had to combine fur simulation and water simulation, two of the most difficult things to do in CG. This sequence required a huge R&D effort in itself. Basically, we had a base simulation for the water, which was done with the furless tiger animation. Then, when we added the fur, it was dynamically moved about by the water simulation. So, in effect, the fur was driven by the simulation of the water. It was kind of complex at times!"
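
A toy version of the fur-in-water coupling might look like the following: each guide hair's tip is advected by a sampled water velocity while a spring pulls it back toward its groomed rest shape. The swirling velocity field here is a stand-in for the cached water sim; none of this is Double Negative's code.

```python
import math

def water_velocity(x, y, t):
    """Hypothetical swirling flow standing in for the cached water sim."""
    return (math.sin(y + t), math.cos(x + t))

K_GROOM, DRAG, DT = 8.0, 2.0, 1.0 / 24.0   # illustrative constants

class GuideHair:
    def __init__(self, root, rest_tip):
        self.root, self.rest_tip = root, rest_tip
        self.tip = list(rest_tip)
        self.vel = [0.0, 0.0]

    def step(self, t):
        """Water drags the tip around; a spring pulls it back to the groom."""
        wv = water_velocity(self.tip[0], self.tip[1], t)
        for a in range(2):
            spring = K_GROOM * (self.rest_tip[a] - self.tip[a])
            drag = DRAG * (wv[a] - self.vel[a])
            self.vel[a] += (spring + drag) * DT
            self.tip[a] += self.vel[a] * DT

hair = GuideHair(root=(0.0, 0.0), rest_tip=(0.0, 0.1))
for frame in range(48):                      # two seconds at 24 fps
    hair.step(frame * DT)
print("tip after two seconds underwater:", hair.tip)
```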

The watery pit shots were rendered in several passes. Instead of writing out a single beauty pass for the water and another one for the fur and trying to put them together, the team wrote out data passes for the water that included channels for refraction, reflection, dissipation and so on. Then, they wrote out fur passes that included many different specular models. The various passes were then assembled within Shake. "It gave us a lot of flexibility to adjust the final composites," Wuttke notes. "For instance, if the director wanted to see more fur underwater, we could just change that parameter within Shake."
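
The benefit Wuttke describes can be sketched as a weighted recombination of separate passes, so that a note like "more fur underwater" becomes a dial change at comp time rather than a re-render. Each pass is reduced to a single float standing in for a whole image, and the pass names are assumptions.

```python
passes = {
    "water_refraction": 0.42, "water_reflection": 0.31, "water_dissipation": 0.05,
    "fur_spec_broad": 0.20, "fur_spec_tight": 0.12, "fur_diffuse": 0.35,
}
weights = {name: 1.0 for name in passes}    # one comp dial per pass

def composite(passes, weights):
    """Weighted recombination of the separately rendered passes."""
    return sum(passes[n] * weights[n] for n in passes)

before = composite(passes, weights)
weights["fur_diffuse"] = 1.4                # the director's "more fur underwater"
weights["fur_spec_broad"] = 1.2
print(f"beauty before: {before:.3f}, after the note: {composite(passes, weights):.3f}")
```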

Double Negative also worked in more familiar territory with the creation of the construction site of the great pyramids at Giza, Egypt. A gigantic miniature was used for live-action plates and formed the basis for all of the shots. 

Birds with an Attitude

Parallel to this effort, another huge development process took place for the terror birds, fictitious creatures that look like ostriches on steroids. The team combined the fur system with a feather system to create the creatures' specific look. Animation was mainly based on ostrich behavior, but the articulations of the bone structure obliged the creatures to move in a certain way, which turned out to be quite a challenge for the animators.

Double Negative also worked in more familiar territory with the creation of two different environments: the Nile River and the construction site of the great pyramids at Giza, Egypt. The great river was generated in a shader and integrated into live-action plates of the desert. Digital boats with CG sailors and cloth-simulated sails were also added to the shots. For the Giza shots, Double Negative was provided with live-action plates featuring a gigantic miniature set built by Magicon in Germany and filmed on location in Namibia. "The miniature was absolutely spectacular," Wuttke marvels. "It formed the basis for all of our shots. We added thousands of digital extras generated within Massive. We did an extensive motion capture shoot that provided us with a library of all the movements that we needed. We also added a large number of digital props to the filmed environment. They comprised various pieces of scaffolding and blocks around which our digital action could take place. This also helped us to sell the scale of these shots, as did the addition of digital dust all over the place. We created it using DnB, our proprietary volumetric renderer."

One Location, Two Vendors

Visual effects duties on the Giza sequence were shared with MPC. Visual Effects Supervisor Nicolas Aithadi's team there included CG Supervisor Guillaume Rocheron, Compositing Supervisors Steve Sanchez and Richard Baker, Animation Supervisors Greg Fisher and Adam Valdez and Visual Effects Producer Christian Roberton. "The decision on who was doing the shots was based on the mammoths," Aithadi explains. "If they were small enough in the frame not to require fur simulation, Double Negative did the shots. We provided them with mammoth animation files, models, final renders, etc. When the mammoths required fur simulation, we did the shots, crowd included."

MPC started as early as 2005 by creating a mammoth animation test for production. Software Development Supervisor Damien Fagnou and his team went straight into a major R&D effort to craft the creatures' unique fur. "We developed a new fur simulation tool called Furtility," Aithadi says. "It was integrated into our Maya/RenderMan/Shake pipeline. We had a lot of difficulty nailing down the exact fur look that the director was after. The mammoths had to be totally different from elephants, but not look fluffy at all either. We needed to emulate the properties of the matted, tangled and dirty hair of a creature living in a rough environment. We ended up using over 600 different textures to obtain the right look! Many of the hairs reached seven feet (two meters) long, which was very difficult to control."

MPC went straight into a major R&D effort to craft the woolly mammoth's unique fur. They developed a new fur simulation tool called Furtility.

A basic mammoth model was built from a scan of a Tatopoulos maquette, and then completely redesigned in the computer. Head of Modeling Andrew McDonald and his team built a male, a female, a young mammoth and a baby, before creating four variations of each model: enough to build up a whole herd. Previs had determined that no more than 110 mammoths would be featured in any single shot. The herd was animated using MPC's in-house crowd system ALICE. "One of the big evolutions of ALICE on 10,000 BC was its capacity to simulate crowds using hero character assets," Rocheron explains. "We were able to use ALICE with our hero mammoth assets (models, rigging, skinning, textures), while taking advantage of ALICE's advanced layout and motion blending features. Some shots with fewer mammoths on screen could then be done entirely in ALICE. And in case we needed to go further in terms of animation or control over the muscle simulation, we were able to extract those ALICE mammoths as individual hero assets that could be hand-tweaked. In effect, Furtility was able to deal with those ALICE mammoths in exactly the same way as with the hero ones. We could then use our hero mammoth fur grooms and textures on those ALICE mammoths, exactly as we were doing for models and textures." Alban Orlhiac headed the texturing department.
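
The crowd-to-hero workflow Rocheron describes can be sketched as agents that all reference one shared hero asset bundle, any of which can be promoted to an individually animated hero. The class and attribute names below are invented for illustration; this is not ALICE's API.

```python
class HeroAsset:
    """Stands in for the full hero bundle: model, rig, skinning, groom, textures."""
    def __init__(self, name):
        self.name = name

class CrowdAgent:
    """An ALICE-style agent that references the shared hero asset."""
    def __init__(self, asset, position, clip):
        self.asset, self.position, self.clip = asset, position, clip

    def promote_to_hero(self):
        # Same asset and groom, now driven by hand animation instead of
        # motion-blended clips.
        return {"asset": self.asset.name, "position": self.position,
                "animation": "hand-keyed", "fur": "hero groom reused"}

mammoth = HeroAsset("mammoth_male_v4")
herd = [CrowdAgent(mammoth, (i * 6.0, 0.0), clip="walk_cycle") for i in range(70)]
print(herd[0].promote_to_hero())
```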

Optimizing Render Times

Four different levels of detail were generated for the mammoths, allowing for faster render times. Other optimization tricks included selectively computing fur dynamics in the crowd according to distance to camera, visibility on screen and motion. If too many mammoths required dynamics in a shot, the team could reduce some settings on the accuracy of the dynamics, like the number of control points per hair. The difficulty then was to not change the look of the mammoth while reducing the density of fur. "Conceptually, a mammoth that took 1/3 of screen space had 10 hairs per pixel, which translated into 100 hairs per pixel for a mammoth that was 1/30 of screen space," Rocheron says. "So, for a mammoth 10 times smaller, you were able to reduce the fur density by 10 if you increased the width 10 times. Of course, it was just conceptual, but it gave us a base to find the right balance between density and width changes according to the size of a mammoth on screen."
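
Rocheron's density/width trade reduces to simple arithmetic: when a mammoth gets 10 times smaller on screen, cut the hair density by 10 and widen each hair by 10 so the apparent coverage (density times width) stays constant. A sketch, using the numbers from his example:

```python
def fur_lod(base_density, base_width, screen_fraction, ref_fraction=1/3):
    """Scale density down and width up by the same factor as the subject shrinks."""
    scale = ref_fraction / screen_fraction   # 10x smaller on screen -> scale = 10
    return base_density / scale, base_width * scale

density, width = fur_lod(base_density=10.0, base_width=1.0, screen_fraction=1/30)
print(f"density scaled to {density}, width scaled to {width}")
# Apparent coverage (density * width) is unchanged: 10 * 1 == 1 * 10.
```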

The Giza mammoths featured an extra layer of complexity as they all needed to wear a harness. MPC quickly realized that the net itself needed to be made of fur. 

One of the advantages of writing Furtility was that the team could make sure it worked in combination with Giggle, MPC's scene rendering format. The tool controls how geometry is stored in memory depending on what is being rendered. It also enables each element -- mammoths, grass, crowd, environment, particles, etc. -- to have its own render script optimization: a crowd shot requires very different optimizations than, say, a single hero creature.
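
The per-element idea can be sketched as each element type carrying its own render settings, so a crowd and a single hero creature are handled differently. Giggle is the real format name, but these particular knobs and values are invented for illustration.

```python
# Hypothetical per-element render profiles; none of these settings are MPC's.
RENDER_PROFILES = {
    "hero_creature": {"fur_lod": "full", "keep_geo_in_memory": True,  "shadow_rays": 64},
    "crowd":         {"fur_lod": "low",  "keep_geo_in_memory": False, "shadow_rays": 8},
    "grass":         {"fur_lod": "mid",  "keep_geo_in_memory": False, "shadow_rays": 4},
}

def render_script(elements):
    """Map each scene element to the optimization profile for its kind."""
    return {name: RENDER_PROFILES[kind] for name, kind in elements.items()}

shot = render_script({"mammoth_hero": "hero_creature",
                      "herd_background": "crowd",
                      "plain": "grass"})
print(shot["herd_background"])
```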

Meanwhile, the Giza mammoths featured an extra -- and major -- layer of complexity, as they all needed to wear a harness. Looking at the many close-ups in the previs, the team realized that the net itself also needed to be made of fur. In order to keep control of the simulation without having to deal with extravagant render times, the CG team broke the process down into three main stages. "First, we simulated a proxy model of the net using cloth simulation to create the general motion and interaction with the mammoth," Rocheron adds. "Then, an actual net model used to grow fur was warped onto this proxy model. Once we had a nice interaction between the furless mammoth and this volume net model, we started working on the interaction between the net and the mammoth fur. We opted for a technique based on occlusion and motion vectors. The occlusion gave us the distance between the mammoth and the net. Then, we extracted the motion vectors of the net sliding over the skin. Combined with the occlusion, we could define for each hair which direction the net was going in, so we could give a proper direction to the bent hairs. It took quite a bit of time to get the method to be efficient and to produce good results."
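
Here is a rough sketch of the occlusion-plus-motion-vector trick: the occlusion-derived distance says how hard the net presses on a hair, and the net's motion vector says which way to comb it. Geometry is collapsed to a few scalars, and the threshold is invented; nothing here is MPC's actual code.

```python
def bend_hair(hair_dir, net_distance, net_motion, max_dist=0.05):
    """Comb a hair under the sliding net.

    net_distance -- occlusion-derived skin-to-net distance (units invented)
    net_motion   -- motion vector of the net sliding over this point
    """
    if net_distance >= max_dist:
        return hair_dir                      # net too far away: keep the groom
    press = 1.0 - net_distance / max_dist    # 0..1, stronger when net is close
    # Blend the groomed direction toward the net's sliding direction.
    return tuple(h * (1.0 - press) + m * press
                 for h, m in zip(hair_dir, net_motion))

groom = (0.0, 0.0, 1.0)                      # hair standing along +z
print(bend_hair(groom, net_distance=0.01, net_motion=(1.0, 0.0, 0.0)))
```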

In many Giza shots, the harness had the added complexity of having to work on four mammoths at once. The mammoths walked in groups of four, and each harness was connected to the other three mammoths. Head of Rigging Tom Reed and his team had to develop a rig that let the animator know when a mammoth was straying too far from the others and would break the harness rig. For the collisions, the same system was used as for the net. Some parts of the net were simulated in SyFlex, while others used rigid body dynamics (i.e., chains).
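
The stretch warning built into Reed's rig amounts to a distance check across the team of four. A hypothetical sketch (the 9-meter limit and the positions are invented):

```python
import math

def harness_warnings(positions, max_span=9.0):
    """Flag mammoth pairs whose separation would over-stretch the harness.

    positions -- mammoth name -> (x, y, z) in metres; max_span is invented.
    """
    names = list(positions)
    warnings = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            d = math.dist(positions[a], positions[b])
            if d > max_span:
                warnings.append(f"{a}-{b}: span {d:.1f} m exceeds harness limit")
    return warnings

frame_pose = {"m1": (0, 0, 0), "m2": (8, 0, 0), "m3": (0, 8, 0), "m4": (8, 8, 0)}
for w in harness_warnings(frame_pose):
    print(w)
```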

MPC's work was made more complicated by the mammoth herd's environment. It was impossible to rotoscope every blade of grass captured in the location plates, so Furtility was used to create more than three million blades of grass.

The Grass Issue

MPC's work was made even more complicated by the nature of the environment in which the mammoth herd appears early in the movie. Plates had been captured at a location entirely covered with high grass moving in the wind -- very nice looking on screen, but nightmarish from a rotoscoping point of view. "It was just impossible to rotoscope every single blade of grass to comp our mammoths in," Aithadi comments. "So, we basically reproduced in CG the area in which we had to integrate the mammoth herd. The grass was generated within Furtility to precisely match the actual grass. Many times, though, we ended up replacing the whole environment, as it was too difficult to blend the digital grass in the midst of real grass."

On some shots from the mammoth hunt sequence, MPC needed to create more than three million blades of grass. "If you combine that with all the mammoths, with their fur that has to cast shadows on the grass or be matted out by it, you end up with a gigantic volume of data to render," Rocheron adds. "The Giza sequence also required significant render times. We had a 1,000-frame shot featuring some 70 mammoths and 15,000 humans, on which we were going from long shot to tight shot; it took about 10 days on a couple of hundred CPUs to render! Most of the Giza shots were like that."

The Giza environment was based on the same miniature that was used by Double Negative. However, MPC elected to rebuild it digitally. MPC's Gary Brozenich went to Namibia and shot tens of thousands of digital stills of the miniature, varying angles, exposures and times of day. Those stills were then re-projected onto 3D geometry reproducing the environment, which allowed CG characters and mammoths to be integrated behind scaffolding and rocks that were sometimes no larger than a handful of pixels in long shots. Whenever the Nile River was in frame, the shot was handed over to Double Negative as a final composite into which the CG river could be inserted.
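
The re-projection technique is standard camera projection: each point on the rebuilt geometry is projected through the camera that took a still to find the pixel to sample. A bare-bones pinhole sketch, ignoring the lens distortion and blending between stills that a real setup needs:

```python
def project(point, cam_pos, focal, width, height):
    """Project a world point into pixel coords of a z-aligned pinhole camera."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        return None                     # behind the camera: try another still
    u = width / 2 + focal * x / z       # perspective divide
    v = height / 2 + focal * y / z
    if 0 <= u < width and 0 <= v < height:
        return (u, v)                   # sample the still photograph here
    return None                         # outside this still's frame

# A point on the rebuilt Giza geometry, seen through a 4000x3000 still.
print(project((2.0, 1.0, 10.0), cam_pos=(0.0, 0.0, 0.0),
              focal=2000.0, width=4000, height=3000))
```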

The Wow Factor

At Double Negative and MPC, the work greatly benefited from the unusually long post-production schedule. The teams were able to develop, in parallel, fur simulations that set new benchmarks in the field, prompting the assessment: "When you work on a project for so long, you kind of lose track of what you are doing from the viewer's perspective. You only see your work from a technical point of view. And some day, you look back at the shots with an open mind, and you think: 'Wow! Look at what we did!'"

Alain Bielik is the founder and editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications, both print and online, and occasionally to Cinefex. In 2004, he organized a major special effects exhibition at the Musée International de la Miniature in Lyon, France.