Alain Bielik concludes his two-part report on The Dark Knight with Double Negative, Framestore and Buf.
In comic books, superheroes generally battle one villain at a time. But on the big screen, it seems that two major baddies are the minimum needed to satisfy audiences. The Dark Knight, now playing from Warner Bros., is no exception, as Batman is opposed by the Joker (a haunting performance by the late Heath Ledger) and by Harvey Dent, a.k.a. Two-Face (played by Aaron Eckhart). Both characters had already made an appearance in the Batman film saga. In 1989, Jack Nicholson portrayed an unforgettable Joker in Batman, wearing striking make-up designed by Nick Dudman. In 1995, Tommy Lee Jones played Two-Face in the not-so-well-received Batman Forever, wearing colorful make-up created by Rick Baker.
This time, though, director Chris Nolan wanted a completely fresh approach. The Joker was realized with evocative facial paintwork combined with subtle prosthetic work. For Two-Face, Nolan decided to break new ground. "This character was one of our major vfx challenges," recalls overall Visual Effects Supervisor Nick Davis. "Chris was not interested in going the traditional make-up route. He felt that it would be an additive effect, rather than the subtractive effect that he felt the character required. So, instead of adding a layer of material to the actor's skin, we actually removed the skin digitally. It allowed us to reveal the tendons, the cheeks, the eyeballs and to create unique textures. The challenge here was that we were dealing with one of the main characters, and that the digital make-up would be seen in full close-up, including in dialogue scenes…"
Since the technique was introduced in such movies as Deep Rising and The Mummy, digital prosthetic make-up has become increasingly popular with directors looking for innovative character designs. The task of pushing it one step further on The Dark Knight was assigned to Framestore in London. In-house VFX Supervisor Tim Webber oversaw the project with VFX Producer Lorna Paterson, CG Supervisor Ben White and 2D Supervisor Jonathan Fawkner. The team used a variety of software, including Maya, XSI, Mudbox and in-house tools for modeling and animation, PRMan for rendering, as well as Shake or Nuke for compositing. The all-important tracking and matchmove work was carried out in RealViz MatchMover and Movimento.
The design process started with concept art. "As soon as we agreed on a general direction, we hired sculptors to create full size maquettes that allowed us to study the design in three dimensions from all angles," Davis says. "Once approved, the maquettes were sent to Framestore to be digitized and further refined in 3D."
Framestore's White notes that the key to the Two-Face project was to get enough detail into the CGI to give it realism. "In doing so, we worked at much higher texture resolution than we normally use. We also rendered our CG work at 4K, even for the regular 2K anamorphic shots. A very large number of texture layers were needed, and displacement maps from Mudbox were combined with bump maps and displacement maps painted in Photoshop. It really took a significant amount of work to get it right."
The task of capturing Eckhart's performance was a major consideration, as the crew needed to do it on set as shot by the main unit camera, rather than in a separate motion capture session. After testing, Webber and his team opted for a video-based motion capture technique, shooting at 48 fps from several digital witness cameras positioned on set. Later on, when an individual marker was tracked from each witness camera, the position in 3D space could be triangulated. By doing this for all the markers, the performance could be recreated in 3D.
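Framestore's actual solver isn't described, but the triangulation step it relied on can be sketched as a standard linear (DLT) solve: once a marker has been tracked in two calibrated witness cameras, its 3D position is the point whose projections best match both 2D observations. The camera matrices and marker coordinates below are purely illustrative.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one marker seen by two witness cameras.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: 2D positions of the same tracked marker in each view.
    Returns the estimated 3D position of the marker.
    """
    # Each view contributes two linear constraints on the homogeneous 3D point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The best solution is the right singular vector with the smallest
    # singular value, i.e. the (near-)null space of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

With more than two witness cameras, each extra view simply appends two more rows to `A`, which is why additional cameras make the solve more robust.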
Two different types of tracking markers were used. "The larger primary markers were made of retro-reflective material to be visible in low light levels, while smaller secondary markers were applied as black make-up dots," White explains. "Because of the logistics, it was sometimes impossible to get many witness cameras in place -- for example, when the action was taking place in the back seat of a car. This made the job of tracking and positioning our CG face even harder. Even so, several elements still needed to be animated by hand, including the subtle muscle motions when tensing up, as well as the animation of the CG eyeball."
While part of the team was handling Two-Face, another part was hard at work on digital environments and CG doubles. Much of this work focused on a sequence set in Hong Kong, where Batman jumps from the very top of a building and glides through the air using his cape. In addition to a CG Batman, Framestore needed to create two full CG skyscrapers that would be seen at IMAX resolution and from close up. "This presented a particular challenge, as the room interiors would be seen at such close range that the use of 2D texture cards behind windows would not suffice," White says. "Instead, we used photogrammetry techniques to help model and texture the room interiors. We then created a shader toolkit, allowing us to control the position of each office within the building and adjust the lighting, making sure there was enough variation to create a believable effect. Tiled vista plates of the surrounding city at night were shot from helicopter, and these were stitched together to make a moving panorama."
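Framestore's shader toolkit is proprietary and not described in detail, but the "controlled variation per office" idea can be sketched by hashing each office's grid address into repeatable lighting parameters, so the same office looks identical in every frame and from every angle. All names, thresholds and ranges below are invented for illustration.

```python
import hashlib

def office_lighting(building, floor, office):
    """Deterministically derive an office's lighting from its grid position.

    Hashing the office's address yields repeatable pseudo-random values,
    giving believable variation across the building without any stored data.
    """
    seed = hashlib.md5(f"{building}/{floor}/{office}".encode()).digest()
    lit = seed[0] / 255.0 < 0.6                   # ~60% of offices lit
    intensity = 0.5 + seed[1] / 255.0             # 0.5 .. 1.5 relative brightness
    kelvin = 3000 + int(seed[2] / 255.0 * 3500)   # warm to cool white
    return {"lit": lit, "intensity": intensity, "kelvin": kelvin}
```

The appeal of this kind of scheme is that nothing needs to be authored or cached per office: the building's look is a pure function of its geometry.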
The sequence ended up as a combination of partial and full replacements of the main buildings and surrounding city, with several shots fully digital.
Visualizing the Invisible
Meanwhile, across the English Channel, Buf Compagnie had been called in, once again, to create highly stylized visual effects. On Batman Begins, Pierre Buffin's company had produced all the hallucinatory effects. This time, the Parisian team was assigned an even more intriguing concept: the "Spy Vision." In the movie, Bruce Wayne develops a new technology that is able to detect the electromagnetic waves emitted by cell phones. By scanning the waves' behavior in space, the device basically allows Wayne to visualize in three dimensions any environment in which a cell phone is turned on.
Since nothing like this exists in the real world, it was up to Pierre Buffin, Visual Effects Supervisor Dominique Vidal and their team to figure out what this technology could look like. "We started with the concept of the sonar: a device generates a wave that reveals all the volumes around it," Vidal remarks. "Except that you can't see a sonar wave. So, we did tons and tons of tests. We tried waves, metaballs, smoke, particles, etc. This process, partly supervised by Xavier Bec's research and development team, lasted eight months. In comparison, the actual creation of the shots took four months…"
Chris Nolan wanted the waves to bounce back off obstacles, but also to partly travel through them. In addition, the device needed to feel harmless, and not look like an X-ray. "We tried so many things, like how fast the wave should go, how much of it should bounce back, how fast it should dissipate, how it would dissipate, etc.," Vidal continues. "In the end, we managed to find the right combination of wave frequency, speed and rhythm. It was basically a CG wave onto which we added some noise. All our work on the movie was created using proprietary software."
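Buf's proprietary tools are not documented, but the parameters Vidal lists -- speed, bounce fraction, dissipation -- can be sketched as a simple 1D pulse model. Every constant and function name here is an invented placeholder, not Buf's implementation.

```python
import math

def pulse(d, t, speed=340.0, width=2.0, decay=0.15):
    """Amplitude of an expanding pulse at distance d from the phone at time t."""
    if abs(d - speed * t) > width:
        return 0.0  # the wavefront is a thin expanding shell
    return math.exp(-decay * t) / max(d, 1.0)  # dissipation plus 1/r falloff

def scanned(d, t, obstacle=10.0, bounce=0.4):
    """Direct wave plus a partial bounce; the rest travels through the obstacle."""
    if d <= obstacle:
        # In front of the wall: direct wave plus the reflected fraction,
        # which has traveled out to the obstacle and back.
        return pulse(d, t) + bounce * pulse(2 * obstacle - d, t)
    # Behind the wall: only the transmitted fraction gets through,
    # matching the note that the wave both bounces back and partly penetrates.
    return (1 - bounce) * pulse(d, t)
```

Tuning the look then amounts to exactly the search Vidal describes: varying `speed`, `bounce` and `decay` until the combination feels right.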
Working with VFX Producer Alain Lalanne, Buffin, Vidal and Nicolas Chevalier also designed the environments in which the wave animation was going to take place. The Spy Vision plays a key role in the Prewitt building sequence, in which Batman uses it to locate hostages. The device is also featured in the Batcave monitor room, where dozens of monitors visualize different locations -- an environment reminiscent of the Architect Room that Buf had created for The Matrix Reloaded.
The first step was to build each environment. Most of the time, that meant modeling layer upon layer of geometry since, in the Spy Vision, all volumes are translucent except where hit by a wave. "We had to see across the rooms all the way through, including the city in the background with traffic in the streets, etc.," Vidal remarks. "It was amazingly complex. And we had to do it for each one of the Spy Vision scenes in the monitor room. It was a huge endeavor for something that was only meant to be part of the background. In order to get enough material, we filmed our families, our apartments, our offices, etc., using video capture to have references for the keyframe animations. The amount of work that those monitor room shots required was almost insane!"
Buf also created the Batman logo animation that opens the movie in spectacular fashion.
Pioneering IMAX VFX
Just like Double Negative and Framestore, Buf had to deal with the fact that some of its shots were going to be rendered at IMAX resolution. For the three vendors, this unique requirement added an enormous challenge to an already very challenging project. Visual effects are usually created at a 2K resolution, very rarely at 4K, but full IMAX resolution exceeds 8K x 6K, with a single uncompressed frame representing around 200 MB of data…
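The 200 MB figure is easy to sanity-check under one plausible assumption (not stated in the article): that "8K x 6K" means 8192 x 6144 pixels scanned as 10-bit DPX, a format that packs the three 10-bit RGB channels of each pixel into a single 32-bit word.

```python
# Back-of-envelope check of the "around 200 MB per frame" figure.
# Assumption: 8192 x 6144 pixels, 10-bit RGB DPX (one 32-bit word per pixel).
width, height = 8192, 6144
bytes_per_pixel = 4
frame_mb = width * height * bytes_per_pixel / 1e6
print(round(frame_mb))  # -> 201
```

At 24 frames per second, that works out to roughly 5 GB of scan data per second of footage, which puts the storage and render-farm upgrades described below in perspective.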
Double Negative produced the largest number of IMAX shots. "Early in the project, we took our crew along to our local IMAX screen to see The Dark Knight bank heist prologue in order to give everyone an idea of just what IMAX footage looks like," recalls Double Negative 2D Supervisor Andy Lockley. "Most people had seen upscaled IMAX films, such as Harry Potter and the Order of the Phoenix, but few had seen actual sourced IMAX footage. And when the first shot came onto the screen, there was an audible intake of breath… and a few worried faces in the audience."
At Double Negative, Jeff Clifford and the R&D department compiled a 64-bit compositing system with a streamlined set of tools, which allowed the team to access the extra memory that Shake required to be able to handle the much heavier frames. "The amount of detail in the IMAX scans was frightening," Lockley continues. "You could see everything! It was like working on nine 2K shots tiled together. Looking on a monitor, you could keep zooming in, and more and more detail would reveal itself, so we really had to make sure that we didn't miss anything in the composites. A little edge that might seem insignificant on the monitor could potentially end up being five meters (16 feet) long when projected on an IMAX screen! After a series of tests, we determined that for most of the shots, we could work at 5.6K and save the 8K for a few specific moments that would benefit from the extra resolution. At 5.6K, an average .exr file would have a size of 80 MB per frame. It made the compositing scripts very heavy, sometimes working with 30 to 40 passes of CG all at 5.6K, but I think it was worth the extra pain."
Since there was no facility capable of playing back 5.6K frames in real time, each shot was split into 2K tiles that could be checked with real-time playback on Double Negative's FilmLight system.
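The exact tiling scheme isn't described; a minimal sketch with NumPy, assuming film-style 2K tiles of 2048 x 1556 pixels and a 5.6K working frame of 5632 x 4096 (both dimensions are guesses), would look like this:

```python
import numpy as np

def split_into_tiles(frame, tile_w=2048, tile_h=1556):
    """Split a full-resolution frame into 2K-sized tiles for real-time review.

    frame: H x W x C image array. Returns a dict mapping (row, col) grid
    positions to tile arrays; edge tiles may be smaller than tile_w x tile_h.
    """
    h, w = frame.shape[:2]
    tiles = {}
    for row, y in enumerate(range(0, h, tile_h)):
        for col, x in enumerate(range(0, w, tile_w)):
            tiles[(row, col)] = frame[y:y + tile_h, x:x + tile_w]
    return tiles
```

Each tile can then be written out as an ordinary 2K sequence and reviewed at speed, one region of the frame at a time.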
Upgrading on All Fronts
To handle the IMAX format, Double Negative increased its render farm from several hundred to over 2,000 CPUs, and added significant amounts of high-speed, high-bandwidth storage. It also made key improvements to its pipeline, including comprehensive integration of all assets into an easily accessible published database. The team developed a new lighting toolset that accurately modeled a huge range of real-world lighting fixtures and illumination types. This was then combined with a point cloud illumination baking pipeline, which resulted in detailed lighting setups that could be managed and rendered with great efficiency. Another important new tool was dnSpangle, a real-time hardware renderer for lighting previews, tightly integrated with the company's proprietary Rex/PRMan render pipeline. Combined with the point cloud illumination tools, this enabled the artists to get results very quickly without having to hit the render farm with requests for preview renders.
"Once we had committed to building this setup, we concluded that it would be just as easy to do all the standard 2.35:1 work on the show at 4K rather than 2K," Double Negative VFX Supervisor Paul Franklin observes. "We eventually delivered around 370 finished shots for The Dark Knight, but the data created was sufficient for a show with well over 3,000 shots at standard 2K resolution."
Framestore also treated IMAX resolution as either 5.6K or 8K, depending on the shot, with 3D renders and 2D elements created at one or the other of these resolutions. Buf, for its part, worked at either 5.6K or 4K anamorphic: "It is much heavier to handle, but in the end, you really obtain a quality that clearly sets the movie apart from anything that one can see on a DVD at home," Vidal notes.
PacTitle Digital and Cinesite also contributed extensively to the film's visual effects, but were not involved in the IMAX effort.
The Ultimate Challenge
The great paradox of the movie is that most of the visual effects work is seamless, but those shots will be shown in greater detail than any visual effects work in mainstream film history! "I'm very proud of the work that we did," Davis concludes. "A lot of people won't even know what we did. All the houses did a fantastic job, and they really took on the IMAX challenge with great enthusiasm."
Alain Bielik is the founder and editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications, both print and online, and occasionally to Cinefex. In 2004, he organized a major special effects exhibition at the Musée International de la Miniature in Lyon, France.