
Deconstructing 'Watchmen' -- Part 2

Sony Pictures Imageworks, Intelligent Creatures, MPC Vancouver, CIS Hollywood and Rising Sun Pictures discuss their contributions to Watchmen in the final part of our coverage.


The most visible vfx challenge on Watchmen was the all-CG Dr. Manhattan. All images courtesy of Warner Bros. Pictures. ™ & © DC Comics. 

Zack Snyder's Watchmen is a far cry from 300. It had to be in order to achieve its gritty realism, with 200 sets built in Vancouver and as much work done in camera as possible before resorting to CG. And yet vfx technology made this physical spectacle more affordable to pull off, with 1,100 shots (a quarter of which were CG) divided among such vendors as Sony Pictures Imageworks, Intelligent Creatures, MPC Vancouver, CIS Hollywood and Rising Sun Pictures.

Dr. Manhattan, of course, is an all-CG creation that couldn't have been achieved five years ago. So is Doc's Mars sojourn, with his perfectly constructed Glass Palace. Ozymandias' Karnak in Antarctica also relied heavily on CG, and Rorschach's inkblot-stained mask demanded a fair amount of animation as well. The intricate layering of pop cultural information (particularly in the main title sequence, which sets up the history of the masked superheroes from 1939 to 1985) required some deft CG enhancements, too.


For the bloody alt-reality Vietnam sequence, Sony Pictures Imageworks had Dr. Manhattan cresting a hill and blowing up Vietcong from the inside. Courtesy of SPI. 

Tricks for Dr. Manhattan, Mars & Vietnam

Not surprisingly, the majority of the vfx revolves around Dr. Manhattan. Thus, with its 400 shots, Imageworks served as the lead vendor, under the supervision of Pete Travers (Click, Zathura). "We did everything associated with Doc," Travers explains, "including the Martian environment, the Glass Palace, the destruction of New York City and [the rapid victory in] Vietnam."

Travers adds that it was "a dense 400 shots." In other words, there were no easy ones. "I came on in the summer of 2007. DJ [DesJardin], the overall visual effects supervisor, approached us and after reading the graphic novel, my first reaction was, 'We gotta do Doc!' DJ already had discussions with Snyder about the approach as a CG character that emits its own light, and I just reinforced that view.

"So a lot of it stems from theories that were successful or unsuccessful. But they wanted an actor in frame that the director could work with to frame the performance and then, with a low footprint on set, we could capture that performance. It quickly threw out any conventional motion capture techniques. The design of the suit had multi purposes: it was a capture suit for Billy [Crudup] with tracking markers and dots on his face, but the main purpose was to serve as a light source. We called it 'above the line lighting', with Billy serving as a blue light for everyone. He was such a strong light source for scenes that [DP] Larry Fong had to be onboard with it. Chris Gilman [founder and president] of Global Effects built the suit based on a lot of the specs that we came up with. Billy was covered head to toe with these LEDs and he wore a helmet and it gave a strong light throw in the environment. We talked about trying to get that in post, but we never would've made it look as good as it turned out using the LEDs because of the hue and hue changes, the way the light reflected and the small pieces of glass going everywhere. The advantage of shooting it in frame was they got all the light for free."

But maximizing the performance and preserving as much of it as they could in CG was the most significant achievement. Everything was in context with the other actors. During the capture sessions they used up to four Sony F900 witness cameras in conjunction with the high-speed film camera as a tracking source, along with Travers' own separate video assist. He admits that it was somewhat similar to ILM's Davy Jones for the Pirates of the Caribbean sequels. "No matter what method you use, animators at a certain point have to dive in and make it look right," Travers maintains. "We had to make sure that Billy and the other actors were not encumbered by us. It was a case of wherever we could find a spot." Then they brought the data back to Sony, matchmoved all of the footage and constructed the scene from that.

Dr. Manhattan had to look photoreal, maximizing the performance and preserving as much as possible in CG. 

They utilized an automated capture technique for the face, using multiple cameras to triangulate the tracked points and convert the data to 3D, which in turn activated a bevy of face shapes that drove the performance. All of the animation was done in split-screen against Crudup's performance. The neck was crucial and driven by a lot of controls to match the actor's movements. But despite the Tron-like suit, Travers says the actor was very committed and cooperative. Then, aside from the neck, the next biggest challenge was animating a ripped, naked body and figuring out how that kind of body behaved. It all had to be re-targeted to Crudup's performance. Fortunately, most of the shots were in closeup with little movement, so they were mainly concerned with matching the position of the CG head with the actor's so that the eye lines worked perfectly.
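The pipeline Travers describes, triangulated marker data activating a library of face shapes, is essentially a blend-shape fit. Below is a minimal sketch of that idea in Python with NumPy; the marker counts, shape library and least-squares solve are invented for illustration and are not Imageworks' actual tools.

```python
import numpy as np

# Hypothetical example: solve blend-shape ("face shape") weights that best
# reproduce triangulated 3D marker positions for one frame of capture.

n_markers = 12          # tracked facial markers (after triangulation to 3D)
n_shapes = 4            # library of sculpted face shapes

rng = np.random.default_rng(0)
neutral = rng.normal(size=(n_markers, 3))            # neutral marker positions
deltas = rng.normal(size=(n_shapes, n_markers, 3))   # per-shape marker offsets

true_w = np.array([0.6, 0.0, 0.3, 0.1])              # the "performance" for this frame
captured = neutral + np.tensordot(true_w, deltas, axes=1)  # triangulated markers

# Linear model: captured - neutral = sum_i w_i * delta_i, solved per frame.
A = deltas.reshape(n_shapes, -1).T                   # (3 * n_markers, n_shapes)
b = (captured - neutral).ravel()
w, *_ = np.linalg.lstsq(A, b, rcond=None)
w = np.clip(w, 0.0, 1.0)                             # keep weights in a usable range

print("recovered weights:", np.round(w, 3))          # ~ [0.6, 0.0, 0.3, 0.1]
```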

Also, Dr. Manhattan's entire internal structure was a volume render in keeping with the graphic novel's depiction of the character's interior being made of the cosmos. "We built these internal structures... that would light up the areas inside him," Travers continues. "They would charge up when he experienced a build-up of energy that would shoot through his arm. The internal structure was rendered using the volumetric renderer Svea and the outer skin was rendered in RenderMan." On some occasions, such as the TV studio scene, they had to make his skin look opaque. But they used subsurface scattering through the back of the ears or any of the cartilage areas on his face. Imageworks used Maya for the character animation and texture painted in BodyPaint 3D. Houdini was used for procedural animation.
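Svea's internals aren't documented here, but the emission-absorption model that any volumetric renderer evaluates, accumulating light emitted inside the volume and attenuated by the material in front of it, can be sketched generically. The density and emission fields below are made up to stand in for the glowing internal structures.

```python
import numpy as np

# Minimal emission-absorption ray march, the general model behind a
# volumetric render of a self-lit interior. Illustrative fields only.

def emission(p):            # radiance emitted per unit length at point p
    return np.exp(-np.dot(p, p) * 4.0)          # bright core, fading outward

def density(p):             # absorption coefficient at point p
    return 2.0 * np.exp(-np.dot(p, p) * 2.0)

def march(origin, direction, length=4.0, steps=256):
    """Accumulate emitted light, attenuated by everything in front of it."""
    dt = length / steps
    radiance, transmittance = 0.0, 1.0
    for i in range(steps):
        p = origin + direction * (i + 0.5) * dt
        radiance += transmittance * emission(p) * dt
        transmittance *= np.exp(-density(p) * dt)
    return radiance

ray_origin = np.array([-2.0, 0.1, 0.0])
ray_dir = np.array([1.0, 0.0, 0.0])
print("accumulated radiance:", round(march(ray_origin, ray_dir), 4))
```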

Imageworks utilized a peak crew of 100. But unlike Beowulf or Monster House or The Polar Express, the advantage here was that they only had to focus on one character, and one who was not very expressive. However, there was an additional challenge in that Dr. Manhattan had to look photoreal and share plates with real people. Thus, he was treated like a real human being, which had to be worked out during the look development phase. Even the skin textures were borrowed from the actor.

And even Travers admits that Dr. Manhattan's naked body could not be avoided. "He had to be naked because that's what Snyder wanted. He's lost all sense of vanity and is completely apathetic to the human race. We were all giggling as we worked on his private parts, but then, after a while, we got used to it. Certainly, in American society, nudity is so much more taboo than violence, but here's a movie where exploding and tearing and hatcheting people is [talked about less] than whether or not you can see Doc's [blue] penis."

Sony created the Mars Glass Palace, which was built in 3D, procedurally animated in Houdini and rendered in Arnold, which is optimized for ray tracing. 

As for the virtual Mars, Imageworks utilized a lot of the Martian photography from JPL, starting with building the Martian sky from very polarized-looking photos and then constructing the landscape in everything from 2D matte paintings to full 3D geometry, depending on the camera targets. The trickiest part was tearing the environment apart when the Glass Palace comes out of the ground or when it's shattered. This involved some pretty intense simulation.

In terms of the Glass Palace, which serves as Dr. Manhattan's perfectly constructed quantum clock, having two gears intertwine with each other and figuring out the mathematics involved was easy enough. However, when the pieces come close together they flip, and that became more complicated every time a new ring had to be added. The palace was built in 3D, procedurally animated in Houdini and rendered in Arnold, which is optimized for ray tracing. Shattering all that glass was a new experience for Sony, and it proved a tricky rendering environment because the Glass Palace had to be connected to everything, with light refracting through the glass along with the dust and debris.
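The gear mathematics Travers alludes to follows classic gearing: two meshed rings turn at angular rates inversely proportional to their tooth counts, and the driven ring spins the opposite way. Here is a toy version of the kind of per-frame procedural expression one might drive in Houdini; it is illustrative only, not the Glass Palace rig.

```python
import math

# Toy procedural "clockwork": given a driving ring's rotation per frame, each
# meshed ring down the chain rotates at a rate set by the tooth-count ratio,
# alternating direction at every mesh.

def ring_angles(frame, drive_deg_per_frame, tooth_counts):
    angles = []
    angle_rate = drive_deg_per_frame
    for i, teeth in enumerate(tooth_counts):
        if i > 0:
            # meshed gears: w2 = -w1 * (teeth1 / teeth2)
            angle_rate = -angle_rate * tooth_counts[i - 1] / teeth
        angles.append((angle_rate * frame) % 360.0)
    return angles

# Three intertwined rings with 24, 36 and 60 teeth, driven at 3 degrees/frame.
for f in (0, 12, 24):
    print(f, [round(a, 1) for a in ring_angles(f, 3.0, [24, 36, 60])])
```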

The destruction of New York, meanwhile, went beyond any simulation Imageworks had achieved for the Spider-Man franchise. In tearing everything apart, they literally had to build the internal structures as part of the simulation.

Finally, for the money shot of Dr. Manhattan walking over the hill in Vietnam, which was shot on a bluescreen with a buck of rice paddies and actors playing Vietcong soldiers running toward the camera, Crudup was placed on a platform way in the back so they could get the direction right. Imageworks ended up having the Vietcong look like they were blown up from the inside. "We literally built 3D versions of those actors from the inside out: organs, blood, bone, skin, cloth, all individually," Travers explains. "They start out as the actors in the plate and as they're exploding, you're transitioning to their 3D versions of themselves. In fact, it was so gory that a pared-down version had to be made for the trailer."

Intelligent Creatures replaced Rorschach's head through motion tracking, then used a match animation process. Courtesy of Intelligent Creatures. 

Giving Rorschach an Inky Face

Meanwhile, Intelligent Creatures (under the supervision of CEO Lon Molnar) was given the responsibility of creating the dynamic and continuously changing inkblot concealing the face of the vigilante hero Rorschach. This included animating the inkblot's movement to reflect the mood and emotion of the actor.

One of Intelligent Creatures' first assignments was replacing the character's live-action head through complex cranial and facial-motion tracking, with the CG head seamlessly positioned between the live-action scarf and fedora.

To duplicate the mouth and facial movements produced by the mask when the actor (Jackie Earle Haley) spoke or moved, Intelligent Creatures used a match animation process. The development of an elaborate animation rig driven by the facial tracking markers formed the foundation of this movement.

To keep the pipeline flexible, they created pose-to-pose animation on a 2D plane for an entire sequence, which was then wrapped around the 3D model of the mask. 

Intelligent Creatures' look development team reconstructed the cloth fabric of the practical mask right down to the cross-hatching and hair-like fibers. These details were created through the use of complex shaders (including fur and cloth) and 2D workflows to emulate the rim lighting captured in the live-action plates.

Molnar says it was critical that the inkblot animation pipeline retain flexibility and speed to promptly address any changes required by Snyder and DesJardin. For this task, they enlisted seven classically-trained animators to facilitate the "pencil drawing approach," which was achieved by creating pose-to-pose animation on a 2D plane for an entire sequence, then that surface was wrapped around the 3D model of the mask. This process preserved the continuity of the inkblot within the entire sequence and allowed the lighting team to pre-light each shot prior to applying the inkblot treatment.

Lighting the CG head and mask was also a challenge. In order to sell the mask as a non-CG element, the studio had to reproduce all of the shadows cast by the fedora and other objects from the source lights. Together with the shadow creation, the off-white fabric produced a curious palette of subtle variations and layers of color that needed to be captured in every frame and then blended into the final shot.

Lighting created problems, and shadows from the fedora and other objects had to be reproduced. 

"We had developed a two-stage animation process despite the shortage of motion capture equipment," Molnar explains. "The first stage was the animation of the 3D geometry to match the performance of the actor. The second was the animation of the inkblot patterns in the 2D texture space of the UV layout of the mask geometry. We had advanced our knowledge base by developing techniques such as producing faux 3D facial performance using only one camera for capturing in-camera performance. We augmented the faux 3D data with a keyframe animation technique to improve the quality and details for better effects. After analyzing and attempting to customize off-the-shelf packages such as Flash and Toon Boom, we developed our own 2D animation software to augment Maya for maximum performance and flexibility. Finally, the shapes were trimmed using closed NURBS curves for optimal effects."

Intelligent Creatures also delivered various New York City street environments, set piece extensions, blood, gore, snow and breath throughout the film.

A greenscreen New York City was replaced with work from Moving Picture Co. Courtesy of MPC. 

Bits and Pieces from MPC

MPC's work out of its new Vancouver office consisted of 250 shots, including full digital doubles, digital cityscapes, 3D environments and the CG Owlship.

VFX Supervisor Jess Norman oversaw a team of more than 35 artists. They used a Maya pipeline augmented by a range of proprietary tools; shots were composited in Shake and rendered in RenderMan. This integrated approach, with assistance from the London headquarters, allowed the team to meet its vfx challenges.

MPC's invisible effects begin with the death of the Comedian, which features intense fighting in a high-rise condo. For this, MPC created the completely digital cityscape seen outside the smashing window. The shattering of the glass was simulated with the PAPi physics system developed by the MPC team. The company also removed the wires and rigs necessitated by the live-action shoot and added digital doubles.
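PAPi itself is proprietary, but the broad strokes of a glass-shatter pass, fragments inheriting velocity that radiates from the impact point and then falling ballistically, can be illustrated generically. The fragment counts and forces below are invented and only gesture at what a production rigid-body system does.

```python
import numpy as np

# Very rough stand-in for one step of a shatter sim: fragments burst outward
# from the impact point, then integrate under gravity. Not PAPi.

rng = np.random.default_rng(7)
impact = np.array([0.0, 2.0, 0.0])                        # impact point on the window
fragments = impact + rng.normal(scale=0.3, size=(50, 3))  # fragment start positions

speed = 4.0
velocities = fragments - impact
velocities /= np.linalg.norm(velocities, axis=1, keepdims=True)
velocities *= speed * rng.uniform(0.5, 1.0, size=(50, 1))

gravity = np.array([0.0, -9.8, 0.0])
dt, frames = 1.0 / 24.0, 24
positions = fragments.copy()
for _ in range(frames):                                   # one second of Euler integration
    velocities += gravity * dt
    positions += velocities * dt

print("mean fragment height after 1s:", round(positions[:, 1].mean(), 2))
```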

During the "Riot Control" flashback dealing with civil unrest in the '70s, MPC modeled and animated the CG Owlship, added digital doubles, CG crowds and enhanced the turbulent environment.

Other work included building on the MPC 3D fire effects library to re-create the rooftop rescue sequence set on top of the burning tenement building and creating a barren snowscape for the climactic Karnak sequences. The team created realistic ice, snow and water effects as well as a digital ocean environment.

MPC's invisible effects for the Comedian's death sequence included the CG cityscape and shattering glass, as well as rig and wire removal and digital doubles. 

CG Enhancements Set to Dylan

CIS (with early support from its previous London facility) concentrated primarily on CG enhancements for the six-minute opening title sequence set to Bob Dylan's "The Times They Are A-Changin'." VFX Supervisor Bryan Hirota oversaw 120 shots with a crew of 40. "We did a lot of those shots: Dollar Bill thwarting the bank heist; Night Owl grabbing the guy with the Tommy Gun; the Kennedy inauguration and assassination; the lesbian kiss in New York on VJ Day; the Moon landing," he explains.

CIS did 120 total shots, mostly CG-enhancement on the opening title sequence. Courtesy of CIS. 

"This helped establish the world of the Watchmen. They were all done similarly: live-action actors, minimal amount of set and some CG backgrounds. The Kennedy assassination was staged with a limo and was shot in a parking lot in Vancouver. And we matched that with the original Zapruder film. What's interesting is that Zapruder is actually in the shot. That was fun to do because we got to do a lot of R& D and look up all of these conspiracy sites for stuff to match to, figuring out what cars were there and where, what trees to match to, what was going on in Deeley Plaza that day beyond the Zapruder film, which has a limited view. It's a nice historical Easter Egg hunt.

CIS had a little fun with the Kennedy assassination scene, matching a recreation shot in a Vancouver parking lot to the original Zapruder film. 

"We built the environments [using Maya as the primary 3D package] and did the final composites [using Shake and Inferno]. The rendering was done in RenderMan. The shots with Dr. Manhattan, such as the Moon landing, were done by Sony and we would do the final comps.

"We also worked on the archival footage on the monitors during the Karnak sequence. Surprisingly, it ended up being quite a lot of work because what would go where was never set in stone and then they wanted to art direct static and video glitches for all of the monitors as well. It was a big juggling act and a mad dash [to get the licensing rights]."

Hirota also singled out miscellaneous "gore work," including the assassination attempt on Ozymandias; these shots were achieved with fluid dynamics and blood simulations.

Rising Sun Pictures' involvement in Watchmen centers on the transition from live-action 1985 Manhattan to the cel-animated 'Black Freighter.' 

Rising Sun Goes off on a Pirate Adventure

And, finally, Rising Sun handled 31 shots in support of "The Black Freighter" story-within-the-story pirate saga. This is a self-contained animated segment produced for the upcoming DVD and Blu-ray release, Watchmen: Tales of the Black Freighter & Under the Hood (Warner Home Video, March 24).

The transitions into and out of the animated story take place at the newsstand and were done using Shake and Nuke to construct pseudo-3D backgrounds of Manhattan streets. "We also applied a CMYK four-color process that recreates Dave Gibbons' original artwork, allowing us to transition from printed comic page to traditional cel animation," explains VFX Supervisor Dennis Jones.
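The "CMYK four-color process" Jones mentions is the standard print separation. As a rough illustration of the underlying conversion (RSP's actual treatment, including any halftoning, isn't documented here), an RGB pixel separates into cyan, magenta, yellow and black plates like this:

```python
# Standard RGB -> CMYK separation behind a "four-color process" look.
# This only shows the basic plate math, not RSP's full stylization.

def rgb_to_cmyk(r, g, b):
    """r, g, b in [0, 1] -> (c, m, y, k) plate values in [0, 1]."""
    k = 1.0 - max(r, g, b)
    if k >= 1.0:                      # pure black: avoid dividing by zero
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

# A comic-panel yellow separates into mostly Y with a touch of M and K.
print([round(v, 3) for v in rgb_to_cmyk(0.95, 0.80, 0.10)])
```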

"RSP's involvement in Watchmen centers on the transition from live-action 1985 Manhattan to the cel-animated 'Black Freighter,'" Jones continues. "This involved two distinct types of vfx work; Set extensions and comicbook stylization transitions. Set work involved extending a single street intersection to include the distant streets and urban skyline. The live-action component of RSP's work is crucial in setting up and developing the backstory and introduction to the 'Black Freighter' material. All the shots involve the newsstand and associated characters; the two [Bernies]. 'The Black Freighter' material involves seamless transitions from live-action plates of Bernard reading the comic to full-frame single panels of the artwork. As the camera zooms into these panels they transition from a halftone Gibbons' look to the final clean animated material.

"The two main challenges were continuity and a short time-frame," Jones says. "Primarily for the extended director's DVD, which will contain the film as seen in [theaters] and the 'Black Freighter' material, RSP's shots will become the bridge that links these two separate elements. RSP came onto the show near the end of the main production and had to reference shots from MPC and Sony that had already established much of the geography of Watchmen's Manhattan. The shots had to evolve slowly to bridge lighting and environmental differences between the other vendors; this added a different continuity emphasis and meant every shot had its own quirks to resolve."

Recreating the look and feel of the visceral graphic novel took 1,100 shots, a quarter of which were CG. 

The biggest technical achievement was utilizing Nuke for the first time for complete 3D tracking (it was used on Australia solely for 2.5D background elements). "One of the shots, a street intersection at night, featured a complex Steadicam move, foreground characters crossing in front of the camera and greenscreen elements. Nuke was chosen for its ability to construct virtual 3D environments within a 2D platform, thus allowing a 2D artist to control the final layout of scene elements. It was impressive to watch a single artist take a camera track and some matte paintings and create a complete shot within the one application."
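What makes a full 3D track usable inside a compositor is the pinhole projection relating a solved camera and a 3D scene point to a 2D pixel. The sketch below shows that relationship with generic math and invented numbers; it is not Nuke's implementation.

```python
import numpy as np

# Minimal pinhole projection: given a solved camera (position, orientation,
# focal length), a 3D scene point lands at a predictable 2D pixel. This is
# the relationship a 3D camera track recovers for a shot.

def project(point_world, cam_pos, cam_rot, focal_px, center_px):
    """Project a world-space point through a camera to pixel coordinates."""
    p_cam = cam_rot.T @ (point_world - cam_pos)      # world -> camera space
    if p_cam[2] <= 0:
        return None                                  # behind the camera
    x = focal_px * p_cam[0] / p_cam[2] + center_px[0]
    y = focal_px * p_cam[1] / p_cam[2] + center_px[1]
    return x, y

cam_pos = np.array([0.0, 1.7, -10.0])                # camera 10 units back, at eye height
cam_rot = np.eye(3)                                  # looking straight down +Z
corner = np.array([2.0, 3.0, 5.0])                   # a building corner in the set extension
print(project(corner, cam_pos, cam_rot, focal_px=1200.0, center_px=(960.0, 540.0)))
```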

Bill Desowitz is senior editor of AWN and VFXWorld.

