'Superman Returns': The Passion of the VFX — Part 1

In the first part of our Superman Returns coverage, Bill Desowitz explores how Sony Pictures Imageworks advanced its animation techniques to make the Man of Steel fly.

The Shuttle Disaster sequence in Superman Returns represented challenging vfx work for Sony Pictures Imageworks. All images courtesy of Warner Bros Pictures. Credit: Sony Imageworks. 

When director Bryan Singer undertook Superman Returns (which opened June 28 from Warner Bros. Pictures), his first priority was to make sure the critical flying scenes were as believable as possible. To that end, visual effects supervisor Mark Stetson chose Sony Pictures Imageworks, a studio he was very familiar with, having worked there as supervisor of Peter Pan and Charlie's Angels: Full Throttle.

"Imageworks really stepped up with the animated Superman work," Stetson suggests. "The image-based render approach is very good, especially the close-up, high-res work; the cape sim is very good too: it's very fluid and very consistent. [Overall] the animation is head and shoulders above what Imageworks did on Spider-Man 2, which was remarkable. And the Shuttle Disaster [the main action set piece] will be a real landmark for them."

Building a digital version of Superman's famous red cape that could be directed as if it were a character in the film was one of Imageworks' challenges. 

Imageworks artists, led by visual effects supervisor Richard Hoover (Armageddon, Unbreakable, Reign of Fire) and animation supervisor Andy Jones (I, Robot, Godzilla, Final Fantasy: The Spirits Within), were tasked with creating a digital double of Brandon Routh that allowed viewers to see Superman's death-defying acts of courage not only from a distance, but, more crucially, in close-ups as well.

Among the Imageworks team's other challenges was building a digital version of Superman's famous red cape that could be directed as if it were a character in the film, reacting to all that was going on around it, independent of natural forces.

Building a better digital human, in the form of Superman, was not an unusual task for Imageworks. Many of the artists on the project had been part of innovative digital human work on both the groundbreaking Final Fantasy and the Oscar-winning Spider-Man 2.

Body Men: Animation supervisor Andy Jones (left) helped create a digital double of Brandon Routh, while digital effects supervisor Alberto Menache was in charge of the facial rig.

Creating a realistic digital double of an actor is one of the greatest challenges in visual effects: the digital double needs to look the same as the real actor from every viewpoint, in any lighting, with any facial expression. To help ensure that Routh's digital double looked as realistic as possible, he was scanned in the Light Stage 2 device at the University of Southern California's Institute for Creative Technologies (USC ICT) in Marina del Rey, California. Light Stage 2 consists of 30 Xenon strobe lights arranged on a rotating semicircular arm 10 feet in diameter.

The device illuminated Routh's face from 480 lighting positions, covering every direction light can come from, in just eight seconds. A specially designed rig built by Imageworks' Nic Nicholson allowed his hands to be photorealistically captured as well. As this happened, he was filmed by six synchronized Arriflex movie cameras.

The resulting frames were digitized at 4K resolution and texture-mapped onto a laser-scanned geometric model of Routh's face. Since the texture maps encoded all possible lighting directions, they could be digitally remixed to show the virtual actor photorealistically reflecting light in any environment the story called for. In this way, the complexities of how his skin reflects light (its pigmentation, shine, luster, shadows and fine-scale surface structure) were synthesized directly from light reflected by the actor himself. This data was given to the Imageworks artists, who painstakingly created the fully realized walking, talking and flying superhero.
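The relighting described here rests on the linearity of light transport: because light adds linearly, an image of the face under any new environment is a weighted sum of the basis images captured one light direction at a time. The sketch below is a hypothetical, minimal illustration of that principle in NumPy; the function name and array shapes are assumptions for illustration, not Imageworks' actual pipeline.

```python
import numpy as np

def relight(basis_images, weights):
    """Image-based relighting sketch.

    basis_images: (n_lights, H, W, 3) array of captures, one per
    known strobe direction. weights: (n_lights,) intensities of the
    target lighting environment sampled at those same directions.
    Returns the relit (H, W, 3) image as a weighted sum of bases.
    """
    basis = np.asarray(basis_images, dtype=np.float64)
    w = np.asarray(weights, dtype=np.float64)
    # Contract over the lighting-direction axis (linearity of light).
    return np.tensordot(w, basis, axes=1)

# Toy example with two lighting directions on a 2x2 "face":
a = np.full((2, 2, 3), 0.2)   # face lit only from direction A
b = np.full((2, 2, 3), 0.5)   # face lit only from direction B
both_on = relight(np.stack([a, b]), [1.0, 1.0])  # both lights at full
```

With both weights at 1.0 the result is simply the sum of the two captures, which is exactly what a photograph under both lights would record.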

In Superman Returns, his cape is as much a character as the Man of Steel himself. To create a cape that could be directed in the same way an actor would be, Singer asked the Imageworks team to build a digital cape that would be worn by both the digital Superman and the real actor in every scene. However, the Imageworks crew soon discovered that not all cloth software programs are created to bring this amount of directing flexibility to fabric. The Syflex cloth simulator was used to tackle the cape challenge: the software proved an ideal base program for the Imageworks team in building a cape that would display a distinct behavior and attitude, and a nice sim to art direct.

"We began animation R&D in October '04," Hoover elaborates, "which ran through Jan. '05 to coincide with the beginning of principal photography. The stunt crew in Sydney worked on different flying and cable rigs and approaches to how to fly him on stage while we simultaneously figured out poses and how he might fly. The end result of that R&D was several test shots where we took backgrounds from our archives and created CG shots of him flying, landing and talking. So in order to do that we picked up where Spider-Man had left off in terms of making a digital character. We reviewed that process and talked about how to improve it and how to capture the textures better using the USC Light Stage. So we did things a little differently: we added more cameras and set them up differently. We then reworked our shader pipeline in how we use that information to render our character. And then, of course, built a costume and scanned Brandon's body and built all that for test shots."

An unexpected phase two of R&D was put into action during principal photography, when Brandon Routh turned up in much better shape from preparing for the movie. Subtle alterations to the final costume were then made. 

"However, there was a phase two of R&D during principal photography that wasn't anticipated, when Routh got into much better shape in preparing for the movie. So we scanned him again but didn't reshoot the capture because he was in Sydney and we were in Marina del Rey; we reapplied the same textures to his new scan. And there had been subtle alterations to the costume that we had to rebuild for the final costume."

As with previous vfx-intensive live-action movies of this sort, Sony performed MoCap for basic movements only. "But we did a very extensive facial MoCap where all the cameras were pointed in a small environment, maybe 12 feet in diameter," Hoover continues. "We did an extensive expression library as well as phonemes for vowels and consonants so we could deliver dialog too. One of the test shots was our CG guy and Brandon delivering dialog side by side as proof of concept. Again, the MoCap was just used to build the blend shapes and the muscle system to drive his face. We never used it for performance. All the performance was done keyframe for the entire film."
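The blend shapes Hoover mentions work as a linear combination: each captured expression is stored as per-vertex offsets from the neutral scan, and the rig mixes them with slider weights that the keyframe animators drive. Below is a hypothetical minimal sketch of that mechanism; the names and shapes are illustrative, not Imageworks' facial rig.

```python
import numpy as np

def apply_blendshapes(neutral, deltas, weights):
    """neutral: (V, 3) rest-pose vertices (the neutral scan);
    deltas: expression name -> (V, 3) per-vertex offsets from neutral
    (built, per the article, from the MoCap expression library);
    weights: expression name -> slider value set by the animator."""
    mesh = np.asarray(neutral, dtype=np.float64).copy()
    for name, w in weights.items():
        mesh += w * np.asarray(deltas[name])  # linear mix of offsets
    return mesh

# A one-vertex "face": a full smile moves the vertex +1 in y.
neutral = np.array([[0.0, 0.0, 0.0]])
deltas = {"smile": np.array([[0.0, 1.0, 0.0]])}
half_smile = apply_blendshapes(neutral, deltas, {"smile": 0.5})
```

Because the mix is purely additive, keyframing the sliders over time produces a performance without ever replaying captured motion, consistent with Hoover's point that MoCap built the shapes but keyframes delivered the acting.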

Not surprisingly, the CG cape was one of Singer's biggest concerns. Sony started in R&D using in-house cloth tools that it had developed over the years, but was not satisfied with the results. So Jones obtained a beta version of the Syflex cloth simulator and ultimately came up with a series of tools from which to start. "That allowed us to get a really nice sim from that software, and then, over the course of trying different shots and speeds, we came up with a vocabulary and look Bryan liked. There was a limit. We looked at a lot of reference like large flags blowing in hurricane winds to simulate 100 or 200 mph winds. But it's really frantic and the material starts disintegrating. And there are shots during the Shuttle Disaster where Superman's going 12,000 mph. So, at that point, we did what looked cool."

Andy Jones gave the Sony animators an offset control for wind buffeting as a noise function. 

"One of Andy's first R&D animations was to deal with pushing off something that's airborne and selling the weight that you're holding on to. That was always something that we experimented with. And also selling, throughout the movie, just how strong he is. Obviously, to some degree, that's what the Shuttle Disaster is about: giving the audience an idea of what he's capable of."

"We did extensive videography from multiple angles, so we could identify precisely where his joints are in his body and how they move and the range of motion in his body. That was our starting point. The MoCap gave full motion around his face and how his skin moved and wrinkles formed; how he formed certain expressions with his face. We built a character where the default pose was Brandon. But then we put in controls that allowed us to shrink or expand his muscles. And we gave the animators both tools. Sometimes we went too far. Bryan would complain that he looked too stocky and we'd have to deplete him a bit. But just based on the lighting or the lighting angle, sometimes the muscles didn't show up, and we'd exaggerate in those cases."

As always, Sony's animation was Maya based, including the proprietary muscle system, and they rendered with RenderMan. Sony strategically structured its shaders for the Light Stage rendering of Routh's face and hands. The team also applied the same parameters for his costume as for his hands and face in order to render everything together. That took quite a bit of time to work out. Lighters still had a lot of controls. In addition to working in different conditions for various poses and speeds for their own shots, Sony supplied conditions for when Superman is wet and vulnerable to attack during a brutal fight for some of the other vendors, including Framestore CFC, which worked on the climax.

As for Jones, who came over to Sony from Digital Domain to supervise animation, there were only a few adjustments that had to be made to the studio's pipeline with respect to the flying sequences. "For the animators, I wanted them to have an offset control so they could add the wind buffeting as a noise function on top, to be layered in with other animation of arms and legs. So if you wanted to do more specific posing, it wouldn't affect the other animation. This was very specific for this rig."
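The key property Jones describes is that buffeting is layered additively on top of the keyframed pose, so the animators' posing survives untouched. Here is a hypothetical sketch of that additive layering using simple 1D value noise; the actual rig's noise function and parameters are not documented in the article, so every name here is an assumption.

```python
import math
import random

def value_noise(t, seed=0):
    """Smooth 1D value noise: deterministic random values at integer
    times, with smoothstep-eased interpolation between them."""
    def lattice(i):
        # Deterministic per-integer random value in [-1, 1]
        # (integer seed combination avoids hashing caveats).
        rnd = random.Random(seed * 1_000_003 + i)
        return rnd.uniform(-1.0, 1.0)
    i = math.floor(t)
    f = t - i
    u = f * f * (3.0 - 2.0 * f)  # smoothstep easing
    return lattice(i) * (1.0 - u) + lattice(i + 1) * u

def buffeted(keyframe_value, t, amplitude=2.0, frequency=6.0):
    """Layer wind buffeting onto the keyframed channel as a purely
    additive offset, leaving the animator's pose untouched."""
    return keyframe_value + amplitude * value_noise(t * frequency)
```

Because the offset is a separate additive term, zeroing the amplitude recovers the keyframed animation exactly, which is the behavior the quote calls out: specific posing never fights the buffeting layer.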

Digital effects supervisor Alberto Menache created the facial rig, while a new muscle technique was devised by character rigging supervisor Arthur Gregory and completed by CG modeling supervisor Edward Taylor. This was mainly to get the muscles under the suit moving properly and to give the suit a more dynamic feel.

Jones, who came over to Sony from Digital Domain to supervise animation, made only a few adjustments to the studio's pipeline for the flying sequences. 

"We finally ended up using Syflex and it really paid off," Jones adds. "Our CG hair/cloth lead, Takashi Kuribayashi, set up the cape animation and spearheaded that, using Syflex instead of several tools, creating certain poses that the cape would reach at [specified frames] to direct the cape and drive the movement with a natural feel."
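Directing a sim by "poses the cape would reach at specified frames" amounts to blending the raw simulation toward artist-set goal shapes whose influence peaks at the target frame and fades around it. The sketch below is a hypothetical illustration of that idea, not the Syflex API or Imageworks' tools.

```python
import numpy as np

def directed_cape(sim_points, goal_poses, t, goal_frames, falloff=10.0):
    """Blend raw cloth-sim points toward artist goal poses.

    sim_points: (V, 3) simulated cape vertices at frame t.
    goal_poses: list of (V, 3) target shapes the cape should reach.
    goal_frames: frame at which each goal pose has full influence.
    falloff: frames over which a goal's influence fades to zero.
    """
    out = np.asarray(sim_points, dtype=np.float64).copy()
    for goal, frame in zip(goal_poses, goal_frames):
        # Influence is 1 at the goal frame, fading linearly to 0.
        w = max(0.0, 1.0 - abs(t - frame) / falloff)
        out = (1.0 - w) * out + w * np.asarray(goal, dtype=np.float64)
    return out
```

Away from any goal frame the simulation runs untouched, which preserves the "natural feel" the quote describes while still hitting the directed shapes on cue.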

Believability for the flying is one thing, but what Sony achieves through Singer's direction is an array of emotional states when the superhero takes flight. In fact, when Superman floats up into the sky with Lois Lane (Kate Bosworth), their seductive tour of Metropolis borders on the sublime.

But it's the Shuttle Disaster that will wow everyone, when a launch from the back of a jet goes awry with Lois onboard. "Along with flying in general, it was difficult to find Superman's fulcrum," Jones suggests. "Where's his center of gravity for a guy who defies gravity? How is he able to grab hold of a plane and where is his strength coming from? We derived that it's coming from his center of mass near his mid-section. But it would change a little bit. With that in mind, we tried to get the force of how he's pushing back on the plane. It was always difficult. Selling the idea of how he's pushing the plane was a challenge."

"The plane itself was difficult. The previs we got from [Pixel Liberation Front] was very good. They had done that sequence early on and it gave us a good idea of what Bryan wanted. With Bryan, it was always about giving it scale, giving it weight. The previs was done quickly, so a lot of the movement in the plane was too fast. The way it was bouncing around, it looked more like a toy, so we were always battling to give it more scale and more weight. That slowed the whole thing down, but we wanted to make it look more dynamic."

For the action-packed Shuttle Disaster sequence, Imageworks modelers created a 777 airplane and a space shuttle model from scratch, a highly complicated endeavor considering that design specifications are not available to the public. Because both the plane and the shuttle exist in the real world, the Imageworks team was dedicated to creating exact digital replicas for Superman to save not one, but two damaged aircraft from disaster.

"The plane and shuttle were all CG," Hoover offers. "We did research online. All the wings were constructed with actual structural support inside, and all the sheet metal was like the real plane, so that when Superman went through it and it broke apart, it would break along the natural seams. We also had some great footage of Boeing stress tests on the wing, where they caved. We used that as reference. It's amazing: they bend the wing until it almost goes 90 degrees before it breaks."

The vfx team worked hard at building a very versatile character, as anatomically correct as possible to the way Brandon Routh moves. The directive for the team was to match the actor. 

Jones insists they tried to cheat as little as possible, so a lot of the camera angles are accurate. "There are a couple of shots where he's a little closer to the camera than you think he is, so you don't lose him. If you look at the comic book stuff where he's lifting an oil tanker, he looks like an ant. We tried to make it more believable. Something Bryan had come up with was the idea that, as every feat gets larger and larger, you feel like it's at the limit of his ability. So that's why it was so important to set up the plane feat so believably."

"It was always difficult to find the right pose. When he's underneath the shuttle, we had a comic book essence combined with a believability that he could and would make this pose. Sometimes we'd start with a very comic book pose and Bryan would make us take it back a bit and make it more like Brandon."

Meanwhile, during the destruction of Metropolis, Superman flies around doing other superhero feats. "One shot I like to discuss in particular is a CG double that turns into a greenscreen, so we had a handoff," Jones recalls. "Usually it can be pretty difficult. One shot, when he's going through the tunnel and you pull right into a closeup of him blowing out the fire, really surprised us. We animated the flying and landing, and then where the transition takes place we're closer to the camera than people actually think."

"The end sequence when he flies through the clouds, that particular shot we were supposed to use the greenscreen when he comes up close to the camera, but for a variety of reasons we couldn't get the greenscreen piece to flow the way Bryan wanted, so we ended up using an all-CG piece and we really had to focus on facial animation. Bryan wanted to give him a proud feeling of floating above the earth at the end. Virginie Michel D'Annoville, one of the character animators, worked on that."

Hoover reflects on the animated flying accomplishment: "We worked very hard at building a very versatile character that was as anatomically correct as possible to the way Brandon moves. I directed everyone to go down the path of matching Brandon, not making the perfect character. I've been in that situation before and it's very difficult to match shots."

Bill Desowitz is editor of VFXWorld.
