
Putting a New Face on 'Mars Needs Moms'

Read how Mars Needs Moms takes performance capture to the next level.

The Kabuki Mask was the big breakthrough in achieving better facial performance. Courtesy of ImageMovers Digital.

Say what you will about performance capture, but Mars Needs Moms (opening tomorrow from Disney) is a definite advancement, particularly for Robert Zemeckis' now defunct ImageMovers Digital studio. Indeed, Zemeckis' style of performance capture (best described by production designer Rick Carter as "portraiture") has come a long way since The Polar Express.

The first thing you notice in this fantasy about a boy who goes to Mars to save his mom and seek redemption for wishing her out of his life is how much better the facial animation is. The skin, the eyes, the mouth and the movement, even allowing for the stylization, are far more believable.

That's because IMD went to great pains to improve the facial capture, the rigs, the lighting and the rendering through higher resolution, greater polygonal counts and overall accuracy.

For starters, IMD introduced a new system called the Kabuki Mask, which blends the video images of an actor from the four helmet cams and projects them onto a 3D polygonal mask, allowing unprecedented access to the actor's facial performance. That result is then applied to the body motion or the animation of the character when delivered to director Simon Wells' layout.
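For the curious, here is a minimal Python/NumPy sketch of the general idea behind that kind of projection step. This is not IMD's code: the pinhole camera model, the angle-based blending weights and all the function names are assumptions, and a real system would also handle self-occlusion and temporal filtering.

```python
import numpy as np

def project_to_camera(points, K, Rt):
    """Project 3D points (N,3) through extrinsics Rt (3,4) and
    intrinsics K (3,3); returns pixel coords (N,2) and camera depth."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    cam = (Rt @ homo.T).T                      # world -> camera space
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3], cam[:, 2]

def sample_frame(frame, uv):
    """Nearest-neighbor sample of a video frame at pixel coords (N,2)."""
    h, w = frame.shape[:2]
    u = np.clip(uv[:, 0].astype(int), 0, w - 1)
    v = np.clip(uv[:, 1].astype(int), 0, h - 1)
    return frame[v, u].astype(float)

def bake_mask_colors(vertices, normals, cams):
    """Blend a per-vertex color for the polygonal mask from several
    helmet cams, weighting each cam by how directly it faces the surface."""
    color = np.zeros((len(vertices), 3))
    weight = np.zeros((len(vertices), 1))
    for cam in cams:   # each cam: {"K", "Rt", "frame", "view_dir"}
        uv, depth = project_to_camera(vertices, cam["K"], cam["Rt"])
        facing = np.clip(-(normals @ cam["view_dir"]), 0, None)[:, None]
        w = facing * (depth > 0)[:, None]      # ignore points behind the cam
        color += w * sample_frame(cam["frame"], uv)
        weight += w
    return color / np.maximum(weight, 1e-6)
```

The facing-angle weighting is one plausible way to reconcile four overlapping helmet views; a production solver would calibrate the cams against the mask geometry before blending.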

Improvements in rigs gave animators more iterative opportunities.

"Bob always wanted to create a 3D polygonal mask of each actor but we didn't have the fire power, we didn't have the software, we didn't have the technology to implement it," suggests production designer Doug Chiang, whose greatest challenge on Mars was delineating the difference between a warm and friendly home on Earth and a cold, gray world on the Red planet."So, Bob had to resort to old school technology of a video clip. It was always enough for him to work with but it was never enough to do accurate editing and eye lines. So, on this one, we actually got that to work."

Animation supervisor Huck Wirtz, who recently launched his own studio, Bayou FX, in San Rafael and Louisiana, adds that the greater fidelity is a breakthrough. "The guys working on the MoCap side experimented and determined the best poly count and resolution that we could have," Wirtz suggests. "But we couldn't do more than around four characters in a scene at once. It would bog things down in projecting the image back onto the polygonal mask."

There's an improved facial rig as well, which has gotten incrementally better since Polar Express. On Mars, the rig is more robust and has greater controls, allowing dramatic improvements around the eyes and mouth. "It's about paying attention to little details," Wirtz continues. "Everyone would always say the eyes are dead. I would look at the animation and the eyes were really doing the right motion, but I noticed that the area around the eyes, including the eyelids, wasn't mimicking what actually happens. So we really worked hard. Eyeballs are still keyframed because you can't track them. We hand animate the eyes and that's one of our first passes. Another benefit of the Kabuki Mask was that it provided perfect reference. We ran mouth and skin in a FACS session through our solver. Beyond that, we have a really good muscle-based rig.

"The fact that we weren't going for photorealism is a big advantage. But it still has to look lifelike. For me, if you're going to do motion capture, you should make it look crazy -- stylize it for whatever the director wants. It's about finding that level and taking everything -- the proportions and textures -- to that same level."

The use of point cloud-based rendering improved skin textures as well as the lighting of environments.

From an animation perspective -- and performance capture is still very much animation intensive since the data is nowhere near dense enough to drive a face right out of the gate -- it's all about fast iteration to hone the best possible look.

But improvements in lighting and rendering were crucial as well. "Christmas Carol was a crazy experience because we had to build a studio and finish a movie in two years, and we had to use brute-force technology," suggests Kevin Baillie, the visual effects supervisor and co-founder of the new company, Atomic Fiction. "Mars, on the other hand, was chock full of new technology. We looked at the concept art of Mars and said there's no way we can do this with traditional lighting methods. So we developed entirely new workflows with RenderMan that are heavily point-cloud based. We used point clouds for reflections to do all the shiny surfaces; we used point clouds to cast light on the walls and throughout the ancient underground; and most of the light cast on the characters is coming from these point clouds; even the subsurface scattering of the skin was point-cloud based. It actually allowed us to have lower render times. Traditionally, this is extremely expensive to do in PRMan, and, again, we needed to get that iteration going, so we were using indirect bounce lighting for everything, and a lot of the time it was the primary light source, as when characters run down the hallway corridors. In a lot of those shots, the scene is being lit from the actual strips on the floors or in the walls through point-cloud-based illumination. So it was going from '90s technology to stuff that hasn't been done before on this scale."
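As a rough illustration of what point-based lighting means in practice, here is a brute-force Python sketch: the scene's surfaces are baked into a cloud of disk-shaped "surfels" carrying direct radiance, and each shaded point gathers indirect light from them as if from tiny area lights. Everything here is a simplification and an assumption; PRMan's actual point-based GI uses octree clustering and far more careful geometry.

```python
import numpy as np

def gather_indirect(shade_pts, shade_nrm, cloud_pts, cloud_nrm,
                    cloud_rad, cloud_rgb):
    """Gather indirect light at each shade point from a surfel cloud.
    cloud_rgb holds the direct radiance baked into each surfel.
    Brute-force O(N*M) for clarity; production renderers cluster
    distant surfels with an octree and level-of-detail."""
    out = np.zeros((len(shade_pts), 3))
    for i, (p, n) in enumerate(zip(shade_pts, shade_nrm)):
        d = cloud_pts - p                         # vectors to surfels (M,3)
        dist2 = np.maximum((d * d).sum(1), 1e-6)
        dirs = d / np.sqrt(dist2)[:, None]        # unit directions
        cos_r = np.clip(dirs @ n, 0.0, None)      # receiver cosine
        cos_e = np.clip(-(dirs * cloud_nrm).sum(1), 0.0, None)  # emitter
        solid = np.pi * cloud_rad**2 * cos_e / dist2  # approx. solid angle
        out[i] = (cloud_rgb * (cos_r * solid)[:, None]).sum(0) / np.pi
    return out
```

The appeal Baillie describes follows from this structure: once the cloud is baked, emissive set pieces like the light strips are just more points in it, and re-gathering after a lighting change is cheap compared with full ray-traced bounces.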

In terms of compositing, IMD was solely Nuke-based, so it custom-built a bridge to PRMan, allowing compositors to do more of the reflections, for instance, instead of going back to lighters. Animation was done in Maya; hair and cloth in Maya, Maya nCloth and Disney's proprietary hair and cloth systems; particle effects in Maya, Maya Fluids, Houdini, 3ds Max and FumeFX. Water was done in partnership with Stanford using the latest version of the PhysBAM engine.

The procedurally generated glowing lichen walls also benefited from point clouds.

But to bust out of the cold, gray, oppressive world on Mars, they devised an ancient underground with colorful lichens on the walls. This turned out to be a challenge as well. "We were designing it as we were executing it as we were lighting it, so it became this massively parallel design phase," adds Baillie.

"One of our look dev leads, Robert Marinic, sat with it for a month. At the end of the day, most of the glowing lichen is procedurally placed: it falls in the cracks and other areas that it would make sense for it to grow; and the color aspect of it was a procedural shader that fell into a point cloud, which also lit the entire scene. So it was this perfect harmony of technology and aesthetics that's truly beautiful.

Finally, IMD fully embraced 3-D on Mars, taking shots into stereo layout earlier and putting eyes on every shot before it hit animation. In fact, one of the most stunning moments is the climactic shattering of a helmet, achieved through simulation. It took until the last minute to get right, and it differed from Wells' plan, but it was a happy accident that worked marvelously. "Everybody gasped during the audience screening I was in," recounts Baillie. "That's a great feeling, to have an effect that evokes a physical response in people."

Bill Desowitz is senior editor of AWN & VFXWorld.
