Getting Bullish on 'Knight and Day'

No talking animals for Rhythm & Hues, but the running of the bulls posed a different kind of challenge on the latest Tom Cruise actioner.

R&H's photoreal animation had to match the plates for this thrilling sequence. All images courtesy of Twentieth Century Fox.

For Knight and Day, the lighthearted espionage thriller starring Tom Cruise and Cameron Diaz, director James Mangold naturally wanted real-looking bulls for the Spanish bull-run stampede. For Rhythm & Hues, it was a wonderful break from the usual talking animals.

"We had CG trains and cars and set extensions and greenscreen work and got to blow up a plane, but the biggest challenge was the CG bulls," says Greg Steele, the visual effects supervisor of Rhythm & Hues. "We went to Spain, blocked off a street and shot some footage with some running bulls in it, and we had to augment that and increase the number of bulls. And actually some shots didn't have bulls, so we had to add them. There were usually nine to 15 bulls in each shot.

"It was actually nice not to do a talking animal for once. And that was Jim's really big thing: he wanted it to look realistic, down to the biomechanics of how it moves and the weight of it."

The bulls were too dangerous and unpredictable to motion capture, but R&H shot as much reference footage as it needed to send back to the animators in LA. "We went out to the ranch where they were keeping all the bulls for this running stuff because we did have a couple of shots where we actually did put the bulls in there and Tom [Cruise] rode with them," Steele continues. "It was crazy and I can't believe that he did it.

R&H built the bulls from the inside out to get to the core of their motion.

"At the ranch, we set up a situation where we shot with three HD cameras and surveyed the place: a bull charging at someone, stopping and turning, the bulls skittering and bumping into each other. Then we also had an area where they ran along some fences to get some good biomechanical locomotion footage. We took some select pieces and tracked all of those and had some animators go in and rotomate to it. And that was important because it wasn't just a matter of replicating what the skin was doing; we had to also replicate what the bones were doing… and had to get to the core of what their motion was. So we had the animators go in and work the bone animation because we had simulations on top of all of this -- we built it from the inside out so the skin would simulate and would have all the right harmonics to the muscles as they moved through space and had the right gravity."
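The "inside out" idea Steele describes, rotomated bone animation driving a skin simulation that trails the bones with its own harmonics, can be illustrated with a toy one-dimensional spring-damper follower. This is a drastic simplification of R&H's muscle and skin setup, not their actual solver; the function name and all constants here are invented for illustration:

```python
def simulate_jiggle(bone_positions, stiffness=80.0, damping=12.0, dt=1.0 / 24.0):
    """Toy spring-damper: a skin point lagging behind its driving bone.

    bone_positions is a per-frame list of the bone's position along one
    axis; the returned list is the simulated skin point, which overshoots
    and settles rather than copying the bone exactly. Semi-implicit Euler
    integration, 24 fps. Purely a sketch of the concept.
    """
    x = bone_positions[0]  # skin starts on the bone
    v = 0.0
    out = []
    for target in bone_positions:
        a = stiffness * (target - x) - damping * v  # pull toward bone, damped
        v += a * dt
        x += v * dt
        out.append(x)
    return out
```

Feeding in a step (the bone lurching forward one unit) shows the skin lagging on impact and then settling, which is the kind of secondary motion that copying only the skin surface from reference would have missed.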

Steele says it was a fairly quick job: R&H had five months but didn't get most of the footage until mid-way through, which gave the studio about two months to build all of its assets.

"We started dropping these vignettes in and mixing things up," Steele adds. "Once the shot was locked in the way we wanted, we would go in and add nuance. For instance, two of the bulls are bumping into each other and we added some head rotations. We didn't want it to look too mechanical. The reference was great for all the texture detail and fur grooming that we needed to do."

Indeed, the lighting of the bulls proved challenging as well since the fur was so reflective. That meant they couldn't rely on the usual specular hit. "It had to be a reflection that we could adjust with the HDRIs we had shot on location," Steele says. "Thankfully, our rendering guys and Josh Breyer, one of our CG supes for lighting and rendering, took it upon themselves to find a way to render the fur with a ray-traced methodology, which made it look a lot more realistic, and it dropped right in next to the real stuff perfectly."
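The core lookup behind replacing a painted specular hit with a true HDRI reflection can be sketched in a few lines: reflect the view ray about the surface normal, then index that direction into an equirectangular (lat-long) environment map. This is a generic sketch, not R&H's shader; the lat-long convention shown is one common choice, and theirs is not specified in the article:

```python
import math

def reflect(d, n):
    """Reflect incoming direction d about unit normal n (both 3-tuples)."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def latlong_uv(direction):
    """Map a unit direction to equirectangular HDRI coordinates in [0,1]^2.

    u wraps around the horizon (azimuth), v runs pole to pole. Assumes a
    y-up, -z-forward convention chosen for this sketch.
    """
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi
    return (u, v)
```

A ray tracer would evaluate this per fur sample, which is why the result reacts correctly to the HDRIs shot on location instead of needing a hand-placed highlight.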

Because the fur was so reflective, R&H built a ray tracing technique.

R&H built the original maquette of the bull in ZBrush, then took it into Maya, cleaned it up and set up "a little pipeline because we wanted a super high-resolution model that we could use to run the simulation of the skin on. We were more than happy to use a 250,000 poly mesh for the final simulation that we were doing to get all the detail in the skin."

For the rest of the pipeline, R&H relied on its proprietary software: Voodoo for animation and Wren for rendering, into which it could fold the new ray tracing technique.

Meanwhile, other VFX work was turned in by Weta Digital, Soho VFX, Hydraulx and Wildfire VFX. Eric Durst served as overall visual effects supervisor.

Weta, for instance, created a snowy alpine environment to be composited outside train windows. The two scenes were both greenscreen shoots, one in the dining car, and the other in the kitchen carriage. The New Zealand studio completed 139 shots in five weeks.

R&H used ZBrush, Maya, Voodoo and Wren.

"Normally, faced with a scene like this at Weta, we'd immediately set about generating a full digital environment, with snowy terrain, trees, villages, mountains at the horizon and a sky dome," explains Charlie Tait, Weta's visual effects supervisor. "However, we were sent reference footage shot from inside a train travelling through the Swiss Alps, and we noticed pretty quickly that there were periods in this journey that could easily be replicated using a painted background, and 2D trees on cards passing the train. We had a tight turn-around, so we started with this approach, to get the scenes underway."

And the approach worked. Weta had the camera department do a layout setup, in which the train's speed would be determined by a single axis node in Nuke (allowing Weta to adjust it at any time), and the cameras were tracked quickly, in most cases just for their rotations. "One of our compositors, Jean-Luc Azzis, made a Nuke gizmo, which was used to place tree cards in groups parallel to the axis the train was travelling along," Tait continues. "The gizmo took a single directory full of tree images as its input, and randomly selected from them to place up to 30 trees per node along the tracks, with their position and spacing adjustable by the compositors. We found that by using many of these nodes, a compositor could quickly spread around 400 trees along the track, for any given shot. Although the trees individually were 2D images, spacing them out away from the train, with enough room between them for the camera to see far back through them, gave a great sense of space and depth."
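The placement logic Tait describes, random selection from a directory of tree images, up to 30 cards per node, adjustable spacing and distance from the train, can be sketched in plain Python. This is a stand-in for the actual Nuke gizmo, not its code; every function and parameter name here is illustrative:

```python
import random

def place_tree_cards(tree_images, start, length, max_trees=30,
                     spacing_jitter=0.5, lateral_offset=20.0, seed=None):
    """Scatter 2D tree cards along one axis, one 'node' worth at a time.

    tree_images: list of image identifiers to pick from at random.
    start, length: extent of this node along the track axis (z).
    lateral_offset: minimum distance to push cards back from the train (x),
    so the camera can see far back through the spaced-out cards.
    Returns a list of dicts with the chosen image and its x/z placement.
    """
    rng = random.Random(seed)
    n = rng.randint(max_trees // 2, max_trees)  # up to 30 trees per node
    base_spacing = length / max(n, 1)
    cards = []
    for i in range(n):
        # even spacing plus per-card jitter, as a compositor-adjustable knob
        z = start + i * base_spacing \
            + rng.uniform(-spacing_jitter, spacing_jitter) * base_spacing
        x = lateral_offset + rng.uniform(0.0, lateral_offset)
        cards.append({"image": rng.choice(tree_images), "x": x, "z": z})
    return cards
```

Chaining a dozen or so such nodes end to end along the track is how a compositor could spread roughly 400 cards through a shot, with each node's spacing and offset tuned independently.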

Bill Desowitz is senior editor of AWN & VFXWorld.
