'Superman Returns': The Passion of the VFX — Part 2

In the final part of our Superman Returns coverage, Bill Desowitz reports on the challenges and accomplishments met by several participating vfx studios.

For the 1,400 final vfx shots in Superman Returns, the work was divided among 11 companies, including The Orphanage, which created the stunning eye shot sequence above. All images courtesy of Warner Bros. Pictures. 

"Superman Returns is certainly different from the X-Men," admits supervisor Mark Stetson, who oversaw 1,500 vfx shots (if you include the deleted opening sequence on the dying planet Krypton), reduced to 1,400 for final release. "There is a lot of respect for the Superman character from all of the filmmakers. There's a lot of story and he's deeply enmeshed in U.S. and world culture. I think we all worked within that context of iconography. We didn't want to mess with him too much and deviate from the storyline. It was more like taking the cultural context and bringing the story forward to 2006."

It's not surprising, then, that a franchise reboot of this magnitude would be divided among 11 companies (including Sony Pictures Imageworks, Rhythm & Hues, Framestore CFC, Frantic Films, The Orphanage, Rising Sun Pictures, Photon, Pixel Liberation Front and Eden FX). But what was unusual, according to Stetson, was the level of cooperation amid the challenges and pressures. He partly credits the first-time use of the Genesis digital camera. With the Genesis as a digital hub, courtesy of Scott Anderson and his company, Digital Sandbox, all the vendors were thrown into new territory together.

"That set us from the beginning in being collaborative and not competitive," Stetson contends. "The camera's terrific. It gives beautifully crisp and sharp images. It was designed to be an effects camera, to pull mattes very cleanly so the edges are very sharp. And so what happens is you do your first greenscreen and the edges are so sharp you think it's wrong. It looks like a bad pull and you want to soften the edge. The difference is that film has a natural roll-off between one color value and the next. And the Genesis does not. One pixel is one color and the next pixel is bluescreen or greenscreen. It's very precise, which is great for hair."
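
The edge behavior Stetson describes maps directly onto how a matte gets pulled in compositing. The sketch below is illustrative only, not the production keyer used on the film: a toy color-difference green key in NumPy/SciPy, with an optional sub-pixel soften of the kind a compositor might add when a Genesis key looks too crisp.

```python
# Illustrative only, not the production keyer used on Superman Returns.
# A toy color-difference green matte with an optional sub-pixel edge soften,
# the sort of film-like "roll-off" a compositor might add to a razor-sharp key.
import numpy as np
from scipy.ndimage import gaussian_filter

def pull_green_matte(rgb: np.ndarray, soften_px: float = 0.0) -> np.ndarray:
    """rgb: float image in [0, 1], shape (H, W, 3). Returns alpha in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Classic color-difference key: how much greener a pixel is than its other channels.
    spill = np.clip(g - np.maximum(r, b), 0.0, 1.0)
    alpha = 1.0 - spill                      # 1 = foreground, 0 = pure greenscreen
    if soften_px > 0.0:
        # Mimic the gentle edge transition film would have given for free.
        alpha = gaussian_filter(alpha, sigma=soften_px)
    return np.clip(alpha, 0.0, 1.0)

# Usage: alpha = pull_green_matte(plate, soften_px=0.5)  # half-pixel soften
```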

"We had to create our own post-production pipeline for getting shots in and out and getting them converted to Genesis native format files that the vendors could use, creating standards for color space amongst the various facilities and figuring out how to check shots on the way in, because we weren't filming out. We worked really hard with Digital Sandbox, which contributed to that process and provided all the technology needed to conform, as well as the shot I/O. Tom Sigel, our cinematographer, was very aggressive in using the camera and capitalizing on its strengths. He tested the camera, figured out what its capabilities were and pushed them. You look at it and say, 'Nice movie.' It's a viable camera system and there are about a dozen movies using it. It's not a magic bullet. It has its limitations, just as film cameras do. In some cases, we've seen the limitations and had to go to film for high-speed work and for underwater housing, because you had to come up with a cooling system for it quickly, but we did figure out how to use it for motion-control speed shooting as well as cable flying all over the place. We tortured that camera."

Chris Batty's work on the Listening Post sequence for PLF pushes the use of previs in terms of the look and feel and emotional resonance.

Previs with PLF and RSP

Veteran previs firm Pixel Liberation Front (mostly an XSI house) began working with Singer on Logan's Run and switched over to Superman Returns when that project heated up. PLF then collaborated in Australia with Rising Sun Pictures on previs. The difference in the PLF previs work on Superman Returns is quality as well as look and feel, according to lead previs artist Kyle Robinson. "We didn't just do visual effects shots but also the filler shots around them (including some dialog), so you'd have complete sequences from head to tail. We were brought on in September 2004. I worked with Chris Batty, the first lead. When the team moved to Australia, they supervised down there. At one point we had 20 previs artists and worked on 17-20 sequences, most lasting more than five minutes: everything from dialog, to setup, to action sequences, to post-action sequences. We were prevising up until November 2005."

"We were fortunate enough to get the cyber-scan of Brandon [Routh] from Sony Imageworks, so he was our main character that we got to use, and he brought a level of realism to our previs. The challenge was that, with the new technology and cinematography styles, we were trying to tell a classic story of Superman today: how he moves, how he flies, and how to maintain that persona without making it too gimmicky with camera moves."

One previs sequence that stands out for Robinson is the Listening Post, in which Superman floats above the Earth; Batty worked on it solo and composited it in After Effects. "It's very emotional and one of the more beautiful sequences, which pushed the look and feel. There are lots of clouds and filters put on. This pushes previs in that it's being used to tell the story and show the cast and crew the vision of the director. And it helps the collaborative effort in bringing the crew together early, so that's easier on the budget. It also helps with the acting. Instead of having stiff Gumbys walking around, they are actually walking and gesturing and nodding their heads and reacting to each other's dialog, moving into the action and visual effects sequences. Bryan was reintroducing the character of Superman to a generation of filmgoers that doesn't know him. It's a continuation of the first two Richard Donner movies, but with more of an emotional attachment and [an exploration of] how life changes."

The Bank Robbery sequence, which had been cut early on, was reinstated with the help of PLF's previs.

One sequence that was originally cut during budget planning, but later reinstated with the help of more previs firepower, was the Bank Robbery (with vfx supplied by The Orphanage). Because this was a showpiece with minimal production design groundwork, the previs team had to guess at sets, gun design and city layout. Thus, even though the final sequence turned out differently, the previs provided a solid cinematic foundation for the movie, according to PLF lead previs artist Colin Green, and serves as a further instance of previs as a powerful sales tool.

Meanwhile, according to RSP's previs lead, Leo Baker, "Each day, Mark [Stetson] would look at the previs cut and would come around to each of our desks, working directly with us, briefing us on sequences and discussing work in progress. Bryan would have the updated sequences presented to him generally around once a week, or more regularly if his feedback was required. Bryan was really particular about getting the previs quite finessed, as an accurate representation of the shooting plan. This extended to such details as temp music and sound effects adding to the drama of the building suspense in a scene. The sequence previs process became its own microcosm of the actual film. A finished previs sequence would clearly outline Bryan's intentions, so his ideas could be communicated effectively to the studio and subsequently the rest of the shooting crew."

"Working with PLF was cool. Previs work is their main focus, so they were able to give us guidelines for how much attention to detail is required, and what areas of the process are the most important: camera details, and placing the camera at practical shooting positions."

"Discussing sequence shot coverage was really quite inspiring. It's like you're putting on the DP's hat and running with your ideas for covering the action of a scene. All the work we did was achieved within a very fast turnaround, so we learnt lots of little tips and tricks for achieving good-looking previs quickly. This included fast animation techniques, both manual and using animation presets, and also rendering techniques. Ninety-nine percent of the work we did was achieved with OpenGL texturing and capturing. Some of the assets had to be modelled and textured, and we learnt a lot about how to get the most out of the realtime OpenGL texturing. Some of the assets required preparation for the OpenGL texturing. This often meant ambient occlusion or even global illumination calculations being baked into texture maps (using mental ray). The PLF guys had developed some quite effective realtime shaders for water and particles. These elements were used throughout many of the sequences, so it made a big difference that they looked good and could be captured immediately; there was no need to render using any other render engine."

For Rising Sun, the Kent Farm flashback sequence required the use of almost every technique in the book to complete. Every shot needed digital work, and creating a digital double of young Clark Kent was also a challenge.

RSP Returns to the Kent Farm

Tom Crosbie, who supervised visual effects for Rising Sun, says they focused on the Kent Farm flashback, in which Clark Kent recalls his first flights of fancy, and came up with the concept and design for the X-Ray vision. "The Kent Farm flashback sequence required the use of almost every technique in the book to complete. Every shot needed digital work to some degree, ranging from fairly complex rig removals with optical flow-based retiming, to complete digital rebuilding for some of the shots. We built these from a combination of the live action with extensive plates and pulled these all together from the survey data we'd captured. There was also the challenge of creating a digital double of young Clark Kent."

"To achieve an X-Ray vision look was equally challenging, but in a different way. On the farm, we had the target of making it real, so we knew what we had to match to. But to visually realize a concept, which everyone has his or her own ideas about, required a lot of thought. Our final delivered shot count was 106, and at our peak we had 30 artists with 10 production and support staff making sure that everything was running smoothly. We ended up working on approximately 7.5 minutes of the film, which comes out at around 11,500 frames."

David Scott, Rising Sun's art director, produced some fantastic concept work for Superman's X-Ray vision, which convinced director Bryan Singer to use the effect.

"For the Kent Farm, we made extensive use of HDRI and standard image-based techniques for young Clark. Stephan Bender, who plays the character, spent time with us getting a full body and head scan, as well as standing in a hot studio while we took many hundreds of digital stills to ensure we could get his clothing, hair and skin perfect."

"Initially, this was only going to be needed for some of the wider shots, but it became obvious early on that the double would have to work at the highest resolution. It's definitely not a trivial process pulling all these elements together. Dan Kripac (senior TD) continued to improve the textures and rigging, as well as the cloth pipeline, throughout much of the production as we progressed to higher-resolution renders. We created the cornfields using a combination of techniques. Pasting a series of bluescreen corn stalks onto simple 3D cards and then seeding the mid and distant fields within Maya (using the set surveys to match to the terrain) was one of the simpler ones, but in some of the shots we needed to create fully digital corn to ensure that young Clark's wake had the right dynamics within the shot. Colin Doncaster (R&D, senior TD) did a fantastic job creating the Houdini pipeline for this."

"A corn digital asset built the leaves, stalk, flowers and any other elements that went into describing one stalk of corn. This was then fed a spine that the corn was grown on. Any number of spines could be animated and then fed into the digital asset to create the animated corn rows."
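
As a rough illustration of that spine-driven setup (not RSP's actual Houdini asset; the names here are hypothetical), the sketch below computes a transform per segment along an animated spine, so that stalk, leaf and flower geometry attached to those transforms follows whatever motion the spine is given.

```python
# A rough sketch of the spine-driven idea described above, not RSP's asset:
# a "corn stalk" is grown along an animated spine curve, one stalk per spine.
import numpy as np
from dataclasses import dataclass

@dataclass
class CornStalk:
    segment_frames: list  # one 4x4 transform per segment, base to tip

def frames_along_spine(spine_pts: np.ndarray) -> CornStalk:
    """spine_pts: (N, 3) points of one animated spine at the current frame.
    Returns a transform per segment that stalk/leaf/flower geometry can be
    attached to, so animating the spine animates the whole stalk."""
    transforms = []
    up = np.array([0.0, 1.0, 0.0])
    for i in range(len(spine_pts) - 1):
        origin = spine_pts[i]
        tangent = spine_pts[i + 1] - spine_pts[i]
        tangent = tangent / (np.linalg.norm(tangent) + 1e-8)
        side = np.cross(up, tangent)
        side = side / (np.linalg.norm(side) + 1e-8)
        normal = np.cross(tangent, side)
        m = np.eye(4)
        m[:3, 0], m[:3, 1], m[:3, 2], m[:3, 3] = side, tangent, normal, origin
        transforms.append(m)
    return CornStalk(transforms)

# Rows of corn: feed any number of animated spines through the same asset.
# stalks = [frames_along_spine(spine_at_frame(t)) for spine_at_frame in row_of_spines]
```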

Rising Sun had to create a process that ensured that all the X-Ray vision sequences had a level of continuity, including this look into Lois' body.

"On the X-Ray side of things, David Scott, our art director, stepped up and produced some fantastic concept work for Superman's X-Ray vision. There was a moment in there when Bryan wasn't sure whether using an X-Ray effect was necessary, but as soon as he saw David's work he loved it and we were given the green light. David spent time working with Dan Bethel (R&D, senior TD) so that Dan knew exactly what tools to build to manipulate and cut through the volumetric shaders we knew we had to write. We then went through the process of working closely with Mark Stetson and Bryan Singer to transform David's initial concepts into some really beautiful shots. One of the trickier parts of this process was ensuring that there was a level of continuity between each of the X-Ray looks. Two of the shots involved cutting through hard solid objects to reveal an interior, whilst the third had to be more organic because we're looking inside a human body."

Other tools used by RSP included Shake, boujou, 3Delight (a RenderMan-compliant renderer), Hype, The Foundry's Kronos for retiming and Liquid (originally written by Colin Doncaster for The Lord of the Rings: The Fellowship of the Ring) to export the RIBs from Maya to 3Delight.

Sony Imageworks built volumetric clouds, sky domes, skies, stars and a ground layer of ocean and the peninsula for the Shuttle Disaster environments.

Imageworks Conjures Metropolis and Daily Planet

In addition to the creation of the digital Superman, the Sony Pictures Imageworks team created the intricate city of Metropolis, which included not only background and foreground buildings, but also the iconic Daily Planet building and Metropolis Park. All told, Imageworks did 317 vfx shots, including the digital double flying animation (described in Part One).

"Our principal environments were Metropolis and the sky and the landscape for the Shuttle Disaster," explains visual effects supervisor Richard Hoover. "Metropolis was used for dialog between Superman and Lois for their flying scene. For the Shuttle Disaster environments, we built volumetric clouds for all that flythrough that were all CG, sky domes and skies and stars and a ground layer of ocean and the peninsula where the stadium was. All that was a combination of digital matte painting environments. The stadium was shot at Dodger Stadium and then altered to look different. Most of that was done with rear projections onto a LIDAR scan of the stadium, so we could do the camera moves we wanted. A few of the shots were plates. The plane and shuttle were all CG. We did research online. All the wings were constructed with actual structural support inside, and all the sheet metal was like the real plane, so that when he went through it and it broke apart, it would break where the natural seams would be. We also had some great footage of Boeing stress tests on the wing, where they caved. We used that as reference. They bend the wing until it almost goes 90 degrees before it breaks."

"For Metropolis, some of the buildings were from Spider-Man, although the textures and colors were altered. Some of the buildings we built from scratch, namely The Daily Planet. The art department gave us Rhino models of The Daily Planet, and then we designed the layout of the city blocks around The Daily Planet, with the park in the middle being an extension of the grassy area where Stage 1 is on the Fox lot in Sydney, because the ground floor and the entryway of The Daily Planet were built outside on the lot, covering up the front façade of two of the stages, the doorway being positioned between the two stages."

The other principal environment built by Sony Imageworks was Metropolis, seen here in the background during a scene between Superman and Lois.

"So we started building what The Daily Planet actually looks like by extending that location. We positioned buildings we had from Spider-Man all around the park. And then on both sides of The Daily Planet we took buildings and made them taller and changed their textures, so the front street façade of The Daily Planet was unique. And we worked on layout. We looked at maps of New York. The art department modified Manhattan Island: the shorelines and where the street grids were and where the bridges were. We found a good place for several city blocks surrounding The Daily Planet while adhering to the grid of Manhattan. It allowed us to change the streets. We altered the Brooklyn shore to be where Lois lived and we altered the Jersey shore somewhat to be where Lex's mansion is. And then we took out Staten Island and Long Island to open up the end of Metropolis to the ocean."

Houdini was used to develop a wide range of effects, including fire, smoke and dust in the Shuttle and Metropolis Disaster sequences, Superman's flight into orbit and Superman going for a final spin around the globe.

Rhythm & Hues Performs Sea Rescue and Brando Recreation

Rhythm & Hues was tasked with creating 120 vfx shots for the Sea Rescue, involving Lois Lane and her family, and for Superman's Fortress of Solitude. The Sea Rescue entailed giant crystal growth; extreme water simulations (volumetric simulations of foam particles, waves, swirling whitewater and ocean mist); digital stunt doubles, including Superman; matte paintings; and environments using painted and scenic textures. Plus Gertrude, the CG glass-bottom yacht impaled by crystal.

The Fortress of Solitude work comprised interior digital set extensions with translucency to augment the production-built set, where Lex Luthor (Kevin Spacey) discovers some of Superman's secrets, including the mysterious reappearance of Superman's father, Jor-El, played by Marlon Brando. Footage from Brando's original 1978 performance was repurposed using a combination of 3D modeling, facial animation and textures from both partial projection and those generated by mouth phoneme shapes.

For the Fortress of Solitude sequence, Rhythm & Hues built interior digital set extensions with translucency to augment the practical set. Original footage of Marlon Brando's performance was repurposed.

"We took our proprietary tools and pushed them further," suggests visual effects supervisor Derek Spears. "The open ocean was done with WaveTools, which is for simulating coherent surfaces, or what they call ambient waves. They have the ability to generate foam on top of that for whatever wave crest, and that's automatically generated as well. Also, in the vicinity of objects that cause white water, we mapped various elements that we shot in practical form, elements along the base of the crystals or the yacht per se. And then we created green water splashes up against an object and bubbles that are trapped beneath the surface. WaveTools got pushed to a higher level of realism."

"FELT is a new tool. It's a vector description language that allows us to describe very complex systems of motion and simplify them and add high-level detail on top of that. For instance, when waves are crashing against crystals or the boat, sometimes we used FELT instead of practical elements, or we'd use a combination to render the hero elements in FELT and support that with 2D elements underneath."

"The Gertrude model breaking in half was all-digital. We used rigid body simulation and particle debris for high-level detail. Houdini was used primarily in the Sea Rescue sequence. It was used entirely for animation and rendered in Mantra. Our water toolkit is based on using Houdini. Ahab is the in-house fluid simulator."

Spears says the Fortress of Solitude was an artistic as well as a technical challenge, rendered through the internal toolkit called Wren. "We collected a bunch of scans from Warner Bros. of the original Marlon Brando footage and then we began to use that very similarly to our talking-animal pipeline. We would take select footage similar to the performance that we wanted, track that, project the texture back onto that from the original Brando footage and then go back and reanimate the mouth to match the performance that editorial selected. It's important that we selected footage that was very similar to the facial performance work, because we couldn't use anything outside the mouth. And sometimes we had matching audio for what Brando would say, and other times we had to completely come up with the animation based on how we thought Brando would react. This allowed us to review from any of the three directions or add completely new dialog as the scene allowed us to. We had to create a lot of additional texture based on his original facial texture to fill in the gaps. It was also modeled in Maya. The pipeline was a refinement. It was a difficult challenge because we were dealing with a famous actor and we wanted to capture the essence of his performance."

Frantic Films created the crystal growth pipeline, integrating the late Richard Baily's Spore effects into 3D crystal geometry.

Frantic Creates Crystal Growth

Frantic Films, under the supervision of Chris Bond, created the crystal growth pipeline, integrating the late Richard Baily's Spore effects into 3D crystal geometry. Frantic also created the explosion of the Red Sun at the beginning of the film, the undersea birth of Luthor's evil island with the growing crystals and the deleted Return to Krypton opening. Overall, Frantic was responsible for 140 vfx shots.

Frantic used a lot of tools, including 3ds Max, Splutterfish's Brazil rendering system, mental ray (ocean surface) and eyeon Digital Fusion (compositing). In addition, Frantic used the following proprietary software:

  • Flood: Fluid dynamics, silt dynamics, sun explosion, surf, ocean surface dynamics

  • Krakatoa: Particle rendering for silt, sun explosion elements etc.

  • CMS: Crystal management system (developed around Baily's Spore work)

  • Deadline: Renderfarm management

  • Project Flow: Asset and project management

"We started on Superman with McG and Boyd Shermis in August of 2003, doing previs for an entirely different script," Bond recalls. "During that period, we had done tests for kryptonite, which we showed to Mark and Bryan, and received the contract for R&D for Superman Returns in December of 2004. We spent approximately five months on look development and shader writing to nail down looks for the crystal ship, the Fortress of Solitude, how crystal grows and the look of Spore, and we were awarded a number of contracts for final shots, as well as development through our software side to support facilities working with crystal."

"We ported our crystal software (CMS, the crystal management system) to Linux and Maya (it was originally written for Windows and 3ds Max). This tool enabled the user to place Spore elements in their application and warp, distort and align them so that they could be refracted through the crystal. In addition, we wrote shaders for the crystal, including virtual boolean shaders to hide/unhide interior geometry, and created a toolset to assist with crystal growth."

"We wrote a lot of software to do this project, and the biggest expansion to our toolset was likely the ability to generate, simulate, track and render billions of particles per frame for fluid simulation, particulates, debris and smoke."

The Orphanage used Houdini for the Bank Robbery sequence, where Superman thwarts the bank robbers by blocking the bullets from a mini-gun. 

"In terms of collaboration, we have an office in Sydney that we opened to assist with local collaboration while shooting Superman Returns; in addition, we used cineSync by Rising Sun Research [the tool of choice for all vendor reviews] to synchronize QuickTimes and have production meetings to evaluate our look. Once in Los Angeles, we spent a lot of time reviewing shots projected with Bryan and Mark. For the [deleted] opening sequence [on planet Krypton], we had approximately 4.5 weeks to deliver nearly 2,000 frames of full CG from concept to completion. That required a lot of Bryan and Mark's time in meetings to review our elements and assets, and was likely the most challenging aspect of the project for us in terms of rendering, creative and tight turnaround. When the sun explodes there are billions upon billions of rendered particles for every frame."

A Shot to the Eye From The Orphanage

The Orphanage created a total of 144 vfx shots supervised by Jonathan Rothbart. The company developed fluid sim, digital fire, smoke and muzzle flashes. Two of the shots are in the trailer, one of Superman being shot directly in the eye and one of him being shot multiple times from a Gatling gun on the rooftop.

The Orphanage used Houdini for the Bank Robbery sequence, where Superman thwarts the bank robbers by blocking the bullets from a mini-gun, and for a crowd shot used in the pivotal Metropolis Hospital scene.

For the Bank Robbery, the rooftop set, actors and Gatling gun were primarily live-action elements. The team used Houdini to add tracer fire, sparks and bullets hitting the ground and generating debris. Muzzle flashes and shell casings were also done in Houdini.

The other shot, where the camera pans up over the crowd on the street in front of Metropolis Hospital, required the team to create a full CG crowd extension. The plate consisted of roughly 300 people in the foreground; the team used Houdini to integrate more than 1,000 fully CG people mingling with the live-action crowd.
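
A minimal sketch of that kind of crowd extension follows; it is not The Orphanage's Houdini setup, and the function name, region bounds and spacing are invented for illustration. It simply scatters CG agent positions across the street while rejecting spots too close to the matchmoved live-action extras, so the two crowds appear to mingle.

```python
# Illustrative sketch only (not The Orphanage's Houdini setup): scatter CG
# crowd agents across the street, skipping spots already occupied by the
# matchmoved live-action extras so the two crowds read as one.
import random

def scatter_crowd(region, plate_extras, count=1000, min_dist=0.8, seed=7):
    """region: (xmin, xmax, zmin, zmax) in metres on the ground plane.
    plate_extras: list of (x, z) positions matchmoved from the plate."""
    random.seed(seed)
    xmin, xmax, zmin, zmax = region
    placed = list(plate_extras)
    agents = []
    attempts = 0
    while len(agents) < count and attempts < count * 200:
        attempts += 1
        p = (random.uniform(xmin, xmax), random.uniform(zmin, zmax))
        # Reject candidates that would intersect an existing person.
        if all((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 >= min_dist ** 2 for q in placed):
            agents.append(p)
            placed.append(p)
    return agents  # feed these points to the instancer as agent root positions

# agents = scatter_crowd(region=(-40, 40, 5, 120), plate_extras=extras_xz)
```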

"Digital smoke was a big development for us," says Rothbart. "The Bank [Robbery] sequence was a real challenge. We do a number of shots at real speed, but when it switches to slow-mo, our smoke and muzzle flashes had to be timed down by 300%, so what was normally a single frame had to be stretched out to 20 to 30 frames. That required a whole other level of detail. So we wound up doing it through fluid dynamics. Just to work out the sims took two to three days. And we'd do five different versions and put them on five different sim machines, and two days later we'd look at them all and come up with the best. The flame went through comprehensive R&D three months before production. The muzzle flash was one level, and we did that in a very specific way with Houdini for geometry and textures, whereas the slow-mo was done with proprietary fluid simulation. And then the smoke was done with proprietary fluid simulation. The most difficult part about the smoke was getting it at the right speed and then maintaining that level of detail. It's always a battle to get the look that Bryan or Mark wanted. On top of that, we had whole spark systems, and everything's driven off the bullets, and every four bullets was a muzzle shell, so they'd have to time that and then they'd hit Superman's chest, so based on the topology of his chest, every bullet would be flying off in different directions. Although it was based on one large system, you need to leave room for when you get the comment about one specific bullet or one specific spark, or where to make this grouping over here. There were constant discussions about where we wanted the smoke so we could see the S or his face in particular frames."
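
For context on the retiming problem Rothbart describes, the naive approach would be to interpolate between cached simulation frames, which is easy but smears away exactly the detail he mentions; that is why the slow-motion smoke was ultimately simulated with fluid dynamics rather than simply stretched. The sketch below, with hypothetical cache and frame names, shows only the naive sub-frame resampling for comparison.

```python
# A minimal sketch of sub-frame resampling, assuming a per-frame cache of
# simulation fields (e.g. smoke density grids). Not The Orphanage's
# proprietary fluid pipeline: just the retiming idea, where one real-time
# frame is stretched into 20-30 slow-motion frames.
import numpy as np

def resample_slowmo(cache, src_frame, stretch=25):
    """cache: dict mapping frame number -> ndarray field. Yields `stretch`
    interpolated fields covering the interval [src_frame, src_frame + 1)."""
    a, b = cache[src_frame], cache[src_frame + 1]
    for i in range(stretch):
        t = i / stretch
        # Linear blend: cheap, but it softens detail, which is why a real
        # slow-mo shot is better served by simulating at the slower timescale.
        yield (1.0 - t) * a + t * b

# for field in resample_slowmo(density_cache, src_frame=1042, stretch=25):
#     render(field)
```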

"We worked on the Superman Falls sequence from outer space. We went into the lower atmosphere and where he falls into the city. For us, it involved some digital Superman work and then actually doing Metropolis from high above and a little bit into the city and then down into the park, and then we had to create the crater that he falls into as well. We did traffic systems and city layouts and a combination of CG buildings and matte paintings. We strived for a consistent photoreal look."

Framestore delivered 313 shots, including the final climactic showdown between Superman and Luthor, and the CG environments of oceans, crystal rocks, water interaction, seaplane, helicopter and Superman himself. 

Framestore to the Rescue

Framestore CFC was the only London vfx facility on Superman Returns. Under the supervision of Jon Thum, Framestore delivered 313 shots, which spanned the entire length of the final reel of the movie. This involved the final climactic showdown between Superman and Luthor. The work encompassed huge CG environments of oceans, crystal rocks, water interaction, seaplane, helicopter and Superman himself, all mixed with 2D elements of mist, waterfalls, layered skies and various greenscreen elements. There was only one partial set built for all of this action, so the contribution was substantial. Arguably the most exciting sequence involves a seaplane falling off the top of a waterfall, as well as the small matter of lifting an entire island out of the ocean and into space.

"Our CG pipeline consisted of Houdini and Maya feeding into RenderMan, then composited in Shake," Thum explains. "Houdini was used for all the tricky stuff such as oceans, dynamics and particles. Maya was used for modeling, layout and animation. One of the main areas of R&D was the ocean. Led by CG supervisor Justin Martin, we used Gerstner waves for the mathematical model, and then developed mapping techniques for live-action foam onto the surface, combined with procedural foam and particle effects. In addition, we had to break up the (CG) set, smash crystal columns and break up rocks from the rising island. For this we pushed Houdini's dynamics to the limit, expanding its choreography abilities and building on previous techniques developed for earlier shows such as Blade 2, Harry Potter 2 and Thunderbirds."
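
Gerstner (trochoidal) waves are a standard ocean model, and the sketch below shows the textbook form rather than Framestore's Houdini implementation: each wave displaces surface points horizontally as well as vertically, which is what pinches the crests into convincing peaks. The wave list and parameter names are illustrative.

```python
# The textbook Gerstner (trochoidal) wave model Thum refers to, sketched in
# Python for illustration; not Framestore's setup. Z is the vertical axis here.
import math

def gerstner(x, y, t, waves, g=9.81):
    """waves: list of dicts with amplitude A, wavelength L, steepness Q (0..1)
    and unit direction (dx, dy). Returns the displaced (X, Y, Z) position of
    the grid point originally at (x, y)."""
    X, Y, Z = x, y, 0.0
    for w in waves:
        k = 2.0 * math.pi / w["L"]          # wavenumber
        omega = math.sqrt(g * k)            # deep-water dispersion relation
        phase = k * (w["dx"] * x + w["dy"] * y) - omega * t
        X += w["Q"] * w["A"] * w["dx"] * math.cos(phase)  # horizontal pinch
        Y += w["Q"] * w["A"] * w["dy"] * math.cos(phase)
        Z += w["A"] * math.sin(phase)                      # vertical heave
    return X, Y, Z

# surface = [gerstner(x, y, t, waves) for x in xs for y in ys]
```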

"Over the course of eight months, we ramped up from a small core team to approximately 20 TDs, 20 compositors and six matte painters, with support from approximately four matchmovers and four paint/roto artists. Because of the huge scope of our images, our matte painters (led by matte painting supervisor Martin Macrae) played a crucial part, especially for shots that were completely CG, in which we were starting with a blank canvas. Their artistic eye helped create compositions and lighting schemes that instantly worked. Photoshop and Maya were their main tools."

"Because the film was shot on the Genesis camera, the first thing I did was install a digital projector on our floor, turning our Avid suite into an ad hoc projection room. Color space was an issue at the start of the project. Based on the graded dailies we were seeing from production, we developed our own LUT to approximate the look whilst keeping all blacks and highlight information visible. Toward the end, I would spend most of my day in that room screening dailies. The benefits of this are enormous, as I could see 2K projected on a large screen in the final delivery format mere minutes after the renders were complete. One of my initial worries about HD was that of sharpness. With film, we would usually be able to integrate our CG with blur and grain, but the Genesis is so sharp, and with so little noise, that this would not be possible. The main effect of this was that greenscreen keys had to be that much more precise, and for CG we had to increase the filter settings on our renders to cope with aliasing problems that would otherwise be lost in the blur."
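
As a loose illustration of the kind of viewing LUT Thum describes (the actual Framestore LUT is not public, so the curve shape, names and parameters below are assumptions), a simple 1D LUT can approximate a graded look while keeping shadow and highlight information visible:

```python
# Purely illustrative: a simple 1D viewing LUT with a lifted toe and a soft
# highlight roll-off, of the general kind described above. This is not the
# LUT Framestore built for the Genesis dailies.
import numpy as np

def build_viewing_lut(size=1024, toe=0.02, shoulder=0.85, gamma=1.0 / 2.2):
    x = np.linspace(0.0, 1.0, size)
    y = np.power(x, gamma)                        # basic display gamma
    y = toe + (shoulder - toe) * y                # lift blacks, pull down whites
    y += (1.0 - shoulder) * (1.0 - np.exp(-4.0 * x))  # gentle shoulder into highlights
    return np.clip(y, 0.0, 1.0)

def apply_lut(image, lut):
    """image: float array in [0, 1]; lut: 1D array from build_viewing_lut()."""
    idx = np.clip((image * (len(lut) - 1)).astype(int), 0, len(lut) - 1)
    return lut[idx]
```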

The island flyover shots by Framestore combined several techniques, including procedural textures for the crystal rock, 3D waterfalls, projected 2D elements and matte paintings projected onto the rendered geometry to create extra detail.

"The island flyover shots were done with a combination of techniques that were repeated throughout our shots. The first, pioneered by effects supervisor Mark Hodgkins, involved procedural textures for the crystal rock, 3D waterfalls and projected 2D elements where possible. The second, led by Martin, involved projecting matte paintings onto the rendered geometry to create extra detail, a technique that produces better results but only works for small camera moves. Our compositing supervisor, Gavin Toomey, was on set along with me to capture all the elements we might need for our shots, and these were layered on top. These methodologies defined our approach to all our shots. A partial set was built for the Superman/Lex confrontation, and for this lengthy sequence we used a mixture of matte-painted backgrounds, 2D mist and waterfall elements and, in some shots, a complete 3D reconstruction to extend the set upwards and outwards. Lightning flashes throughout the sequence made our work that much harder, as we had to light the CG to match each flash."

"The scenes of Superman drowning were shot in a greenscreen tank, and for the underwater look we adapted the set-ups we had established for Harry Potter 4. This involved adding foreground particulates to the water, background CG crystal rock, overall murk and chromatic aberration effects. This look was then picked up by the other houses for their underwater scenes. The seaplane was also shot in a tank, and the surrounding water had to be replaced with CG water and blended where necessary with the real tank water. Some cloning of the real water foam was used to help this integration. The canyon itself is (of course) CG, as is the water in the canyon. As the seaplane tries to take off through the CG canyon, we used some real seaplane elements mixed with CG seaplane where necessary. Although some elements were shot with a real seaplane, it did not match the set seaplane precisely, so for any close-up shots we would replace the top half of the seaplane with CG, whilst keeping the real pontoons with their white-water interaction."
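
Two of the cheaper ingredients of that underwater recipe, distance-based murk and a slight chromatic aberration, are easy to sketch in compositing terms. The example below is illustrative NumPy, not Framestore's Shake setup, and the colors and densities are placeholder values.

```python
# Sketch only (not Framestore's Shake setup): depth-based "murk" plus a slight
# chromatic aberration from scaling the red and blue channels differently.
import numpy as np

def underwater_look(rgb, depth, murk_color=(0.05, 0.18, 0.20),
                    murk_density=0.08, aberration=0.002):
    """rgb: (H, W, 3) float image; depth: (H, W) distance from camera in metres."""
    h, w, _ = rgb.shape
    out = rgb.copy()
    # Chromatic aberration: push red slightly outward and blue slightly inward.
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    cx, cy = w / 2.0, h / 2.0
    for ch, scale in ((0, 1.0 + aberration), (2, 1.0 - aberration)):
        sx = np.clip((xx - cx) * scale + cx, 0, w - 1).astype(int)
        sy = np.clip((yy - cy) * scale + cy, 0, h - 1).astype(int)
        out[..., ch] = rgb[sy, sx, ch]
    # Murk: exponential fog toward a green-blue color with distance.
    fog = 1.0 - np.exp(-murk_density * depth)[..., None]
    return out * (1.0 - fog) + np.array(murk_color) * fog
```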

The crystal island was created from a mixture of procedurally textured geometry, with additional matte painting in some areas to create more detail. Framestore's greatest challenge was to interact the CG ocean with the island.

"As Superman flies above the clouds, we generated several cloud layers to provide movement, and this cloud-layering technique is repeated as Superman shoots down to the island and into the sea. We created a molten rock seabed for Superman to crash into by combining a CG model and matte painting. Particle effects and 2D elements were used for the impact and water-boiling effect. Then, to destroy the set that contains Lex's lair, we used techniques developed in Houdini as mentioned above. With a matte-painted set as a backdrop, the falling column smashes into crystal rocks and crushes them as it falls. We did many simulations of this effect before we found one we liked, as the simulation had to work across about six shots for continuity issues. To break up the floor of the set as Lex tries to take off, we used similar techniques, and the result combines 3D geometry break-up with matte-painted floor and wall, CG helicopter, CG particle debris, 2D mist and smoke elements."

"The crystal island itself that rises from the sea is created from a mixture of procedurally textured geometry, with additional matte painting in some areas to create more detail. The greatest challenge here was to interact our CG ocean with the island. The CG ocean itself was simulated in Houdini as described above, and each shot had to be choreographed to suit the action. The final interaction was painstakingly created using 2D mist and splash elements, as well as 3D particle effects and projected elements that were targeted to specific areas of the composition. For the shots where rocks are breaking away from the island, there were many elements involved in addition to the CG break-up effect, such as 3D water effects, smaller particulate debris and 2D water elements. The Kryptonite shards that break through the rock were rendered with two shaders, one rocky and one Kryptonite, so that we could matte through from one to the other to create a cracked/veined effect. Once in the clouds and in space, the island is all CG, and for this we had to do a lot of optimization to the model in order to render it. As the island goes through the clouds, we used a combination of 2D layered and 3D clouds to have the island push them out of the way."
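
The dual-shader trick for the Kryptonite shards boils down to a simple matte operation at the composite stage: the same geometry rendered twice, once rocky and once glowing, blended through a crack or vein mask. A minimal sketch, with invented pass names, might look like this:

```python
# A minimal compositing sketch of the dual-render idea described above: the
# same shard geometry rendered once with a rock shader and once with a
# Kryptonite shader, then matted together through a crack/vein mask.
# The pass names and mask source are illustrative, not Framestore's setup.
import numpy as np

def veined_kryptonite(rock_pass, kryptonite_pass, crack_mask):
    """rock_pass, kryptonite_pass: (H, W, 3) floats rendered from the same
    camera; crack_mask: (H, W) floats, 1.0 where glowing veins show through."""
    m = np.clip(crack_mask, 0.0, 1.0)[..., None]
    return rock_pass * (1.0 - m) + kryptonite_pass * m

# comp = veined_kryptonite(rock_beauty, kryptonite_beauty, vein_mask)
```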

"There was some collaboration with other houses, mainly with crystals, which are a theme throughout the film. Although there were plenty of discussions, we all came more or less independently to a method of growing our crystals. We used a technique where we grew the crystals from the outside in, translating outer shards upwards to increase the girth, and with a minimum of scaling in order to preserve the texture. In addition, other houses supplied some models (seaplane, helicopter, Superman) and we had a few shared shots with Sony where we would supply the background and they would supply the hero (large on-screen) Superman. There were some looks that we led the way with, such as the underwater look and seaplane interiors, and in these cases our shots were passed to other houses for them to match to. In return, we matched to Sony for their heat vision effect."

However, the biggest hurdle Framestore encountered was when the end sequence of the island rising out of the ocean was recut and the island put back to concept. This happened some six weeks from the deadline and required some radical reorganization.

"It's hard to list all the changes," Thum continues. "It affected each shot in a different way. The main change was to make the island less spiky and thus reveal more column and rock detail that was previously hidden by the spikes. Another was to flatten the top of the island completely, where before we had more of a hump in the middle. In addition, Bryan created two completely new shots to help the recut, and then wanted a transition from one of our biggest shots through to a new plate shot six weeks from our deadline."

"It was more important that our look matched Rhythm & Hues' work, because they were doing the same crystal rocks as us. We were both working in parallel on the look and more or less independently came up with the same idea, largely because it is the best solution to the problem. The problem was how to grow the spikes without stretching the texture. Because of the way the spikes are made up of several shards that form a larger piece, it was possible to grow them by translating shards up around the perimeter to increase the girth. The translation itself increased the length, and because we never see the base, we don't have to worry where the translation comes from."

"The feel of the growth animation was something that we all converged on, and this had to be the right mix of order and randomness. Generally, the crystals would need to arrive at a predetermined look, so the animation was rewound and randomly translated back toward that position. In reality, each shot had different speeds of animation to suit the story, but there was a definite feel that was developed, and because we were all seeing what the other houses were doing, it was easy to maintain this."
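
A rough sketch of that rewind-toward-a-predetermined-look idea follows. It is not the actual Framestore or Rhythm & Hues rig; the shard list, frame counts and easing are illustrative. Each shard starts retracted along its own axis and arrives at its final position on a staggered, semi-random schedule, so the growth reads as ordered but not mechanical.

```python
# Illustrative only: given the final crystal layout, generate per-shard
# translation curves along each shard's own axis (no scaling, so textures
# never stretch), with randomized start offsets, all converging on frame N.
import random

def shard_growth_offsets(shards, frames=48, max_lag=12, seed=3):
    """shards: list of (shard_id, length) describing the final crystal.
    Returns {shard_id: [translation along the shard axis per frame]}, ending
    at 0 so every shard arrives at the predetermined final look."""
    random.seed(seed)
    anim = {}
    for shard_id, length in shards:
        lag = random.randint(0, max_lag)              # staggered, semi-random starts
        curve = []
        for f in range(frames):
            t = (f - lag) / float(frames - 1 - max_lag)
            t = min(max(t, 0.0), 1.0)
            t = t * t * (3.0 - 2.0 * t)               # smoothstep ease-in/out
            curve.append(-length * (1.0 - t))         # fully retracted -> final place
        anim[shard_id] = curve
    return anim

# growth = shard_growth_offsets([("shard_01", 4.2), ("shard_02", 7.5)])
```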

Bill Desowitz is editor of VFXWorld.
