'I, Robot' and the Future of Digital Effects

Alain Bielik meets the visual effects supervisors of two effects studios to uncover the truth behind Alex Proyas' robot revolution.

Digital Domain completed more than 500 shots of the robots, including 300 of Sonny (above left). Photo Credit: Digital Domain. All I, Robot images ™ and © 2004 Twentieth Century Fox. All rights reserved.

Whenever one thinks about robots in the movies, the images that come to mind are flashes from Metropolis, The Day the Earth Stood Still, Forbidden Planet, Star Wars, Terminator or RoboCop. Dozens of other movies have featured robots, but the design of these machines never left a lasting impression in the viewer's mind: The Black Hole, Lost in Space, Red Planet, A.I. Artificial Intelligence... The task of adapting The Robots, Isaac Asimov's classic series of short stories, was thus a daunting one, as robots would this time be the focus of the whole film. Set in the year 2035, I, Robot follows Detective Spooner's (Will Smith) investigation of the murder of a scientist. His only suspect is a robot named Sonny. Although built-in security programs make it theoretically impossible for robots to harm a human, could the impossible have happened?

Patrick Tatopoulos, production designer and robot creator, was inspired by the transparency of his assistant's iMac while designing Sonny and his fellow robots.

Tapped with the task of designing Sonny and friends was French production designer and robot creator Patrick Tatopoulos, who had previously worked with director Alex Proyas on Dark City: "We knew that they should look sleek, attractive and harmless, and that they had to blend into everybody's home. We couldn't have a Terminator standing at the stove in a kitchen! After trying a lot of different approaches, I nailed the concept when I noticed my assistant's transparent iMac. It just hit me: everywhere, from architecture to consumer products, the current trend in design is transparency. So, I came up with the idea of combining half-transparent shells and a thinly mechanized armature. Alex bought the concept right away. The soft features of the face were inspired by the traditional imagery of angels."

To put robots on screen, overall visual effects supervisor John Nelson had three options: animatronics (the solution of choice for Terminator and A.I.), a man in a suit (RoboCop, Bicentennial Man) or CG animation (Red Planet, Tomb Raider). The thin bodies of Tatopoulos' robots excluded the use of a man in a suit, while the range of movements that the characters had to be able to achieve was far beyond the realm of what animatronics could do. Nelson thus decided that the robots would be created via computer animation, a task awarded to Digital Domain and in-house visual effects supervisor Erik Nash. "We did more than 500 shots in 14 months, including about 460 shots featuring CG robots," recalls Andy Jones, DD's animation supervisor. "On Sonny alone, we had almost 300 shots!"

Sonny follows in the digital footsteps of Gollum. Actor Alan Tudyk's performance is featured in 80% of Sonny's screen time. Photo Credit: Digital Domain.

Walking in Gollum's Footsteps

Plate photography of the robot scenes required a minimum of four passes in order to provide the necessary elements. In a typical Sonny shot, the crew first photographed Will Smith playing the scene with actor Alan Tudyk completely dressed up in green as the robot. Then, Smith repeated his performance without Tudyk. The third pass featured a full-size Sonny puppet built by Patrick Tatopoulos Design that was moved around the set on a dolly as a light and texture reference. The final pass was a clean background plate of the empty set.

"Alan's performance gave us the outline of what Sonny should be doing, which is a similar technique to the one that was used for Gollum in the Lord of the Rings movies," Jones comments. "Initially, the plan was to use his performance as a guide to animate our CG Sonny and integrate the character into the plate in which Will Smith had been shot alone. However, it turned out that the actor's performance was often better in the plates featuring Alan: the intensity was there, the eyelines were correct... It was obvious that the scene worked better whenever Alan had been part of the action." As a result, about 80% of the Sonny shots ended up being Tudyk plates, which implied painting him out in more than 200 shots. "It was an enormous task, especially since the plates had been photographed without motion control. In tracking shots, the parallax on the background wouldn't match, which made it extremely complicated to copy background elements from the clean plate."

Will Smith's scenes were shot with and without actor Tudyk. Next, a full-size Sonny puppet was shot for lighting and texture reference, and finally an empty set was photographed for a clean background plate. Photo Credit: Digital Domain.

The obvious solution to animate Sonny was to simply rotoscope Tudyk's performance, but Jones opted instead for motion capture. Proyas had made it clear that the animators were free to alter Tudyk's body language as long as his facial expressions would be faithfully reproduced. Given that the robots had many walking and running scenes, keyframe animation wasn't even considered. "Most people think that animating a walk cycle is easy, but the opposite is true," Jones remarks. "There's so much detail in the way that we walk. It takes forever to keyframe it and still, you never get it quite right. With motion capture, you're 95% there. It actually takes the pain away from keyframing a walk cycle and gives you time to focus on the more creative aspect of the animation."

Capturing the Action

Motion capture duties were handled by Motion Analysis Studios with Scott Gagain, VP of project development, and Jeff Swanty, head of production, coordinating the effort. Once the live-action set of a scene was mapped out on the MoCap stage, actors performed the robots' action, up to four at a time, covered with 48 tracking markers each. Since live action had been shot without motion control, camera angles were matched by eye while the actors tried to mimic what they had done during principal photography. Performances were captured by 22 cameras and applied to CG skeletons by a series of proprietary software tools. "We had two monitors side by side, one playing back the live-action plate and another one playing the motion capture animation," Jones explains. "It allowed us to time the performances and get the best possible match in terms of framing and action."

Animation supervisor Andy Jones (left) and visual effects supervisor Erik Nash for Digital Domain were challenged by the sequence in which Smith walks into a hangar with 1,600 robots.

Once the MoCap data was converted into Maya files, Digital Domain wrote a program that allowed the animators to use both keyframing and motion capture within a single shot. First developed by Jones on Final Fantasy and The Animatrix, the tool mapped motion capture data onto keyframe controls. With this technique, a shot could start with 20 frames of MoCap of a running robot, then continue with 15 frames of a keyframed jump and resume with 20 frames of a motion-captured walk cycle, all in one fluid continuous move.
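As a rough illustration of the splicing idea, each seam between a MoCap segment and a keyframed segment can be smoothed with a short crossfade. This is a toy sketch under assumed conventions (per-frame joint angles as plain floats, a `splice` helper invented for the example); it is not Digital Domain's actual tool:

```python
# Toy sketch: splicing motion-capture and keyframed animation segments
# into one continuous channel, crossfading at each seam so a MoCap run
# can flow into a keyframed jump without a pop. Hypothetical code, not
# Digital Domain's pipeline.

def blend(a, b, t):
    """Linear interpolation between two joint-angle samples."""
    return a + (b - a) * t

def splice(segments, crossfade=4):
    """Concatenate segments (lists of per-frame joint angles), blending
    the tail of each segment into the head of the next over `crossfade`
    frames."""
    out = list(segments[0])
    for seg in segments[1:]:
        n = min(crossfade, len(out), len(seg))
        for i in range(n):
            t = (i + 1) / (n + 1)
            out[-n + i] = blend(out[-n + i], seg[i], t)
        out.extend(seg[n:])
    return out

# Example: 5 frames of "mocap" flowing into 5 frames of "keyframe" data.
mocap_run = [0.0, 1.0, 2.0, 3.0, 4.0]
keyed_jump = [10.0, 11.0, 12.0, 13.0, 14.0]
curve = splice([mocap_run, keyed_jump], crossfade=2)
print(len(curve))  # 8 frames: the two-frame seam is shared, not duplicated
```

The key design point is that the blend happens on the animation channel itself, so animators can keep editing the keyframe controls after the MoCap data has been mapped onto them.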

Jones' crew met a real challenge with a sequence in which Spooner walks among thousands of robots in a gigantic hangar. "We thought it'd be easy," Jones recalls. "After all, the robots were standing still and we only had to run some animation cycles... However, since the camera was moving, the shots featured a lot of parallax changes, which meant that we couldn't simply project robot textures onto cardboard cutouts. In the tracking shots, they had to be three-dimensional models, a pretty amazing undertaking considering that there were 1,600 of them on screen. Interestingly enough, the script called for 1,000 robots, but it just didn't look like enough once the scene was completed. So we kept adding robots until the shots finally looked right. The hangar itself was also completely digital, except for the portion around Will Smith. In the end, we had to pull every trick in the book to make this sequence work."
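What makes 1,600 full 3D robots feasible is instancing: every robot shares one mesh, and only a lightweight transform and animation offset is stored per instance. A toy sketch of the layout step, with all names invented for illustration rather than taken from Digital Domain's pipeline:

```python
# Hypothetical sketch: laying out 1,600 robot instances on a grid, each
# with a tiny random heading and idle-cycle phase so the crowd doesn't
# look stamped out of a single mold.

import random

def layout_hangar(rows, cols, spacing=2.0, seed=1):
    """Place rows*cols robot instances; each entry is just a transform
    plus a phase offset into the shared idle animation cycle."""
    rng = random.Random(seed)
    instances = []
    for r in range(rows):
        for c in range(cols):
            instances.append({
                "pos": (c * spacing, 0.0, r * spacing),
                "heading": rng.uniform(-2.0, 2.0),  # degrees off-axis
                "cycle_phase": rng.random(),        # offsets the idle cycle
            })
    return instances

crowd = layout_hangar(40, 40)  # 1,600 robots, as in the final shots
print(len(crowd))
```

Seeding the random generator keeps the layout reproducible from render to render, which matters when a shot is iterated dozens of times.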

Weta Digital was called in to create the car chase sequence in which Smith is attacked by robots. They completed the complex scenes in just 10 weeks. Photo credit: Weta Digital.

Attack of the Clones

Although Digital Domain remained the main provider of robot animation, Weta Digital was called in late in post-production to help complete the ambitious effort. "We started late in March 2004 and had only 10 weeks of full production time to deliver almost 300 shots," observes in-house visual effects supervisor Joe Letteri. "It was a tremendous amount of work in a very short period of time. The main sequence that we worked on was the car chase." Spooner is attacked by two robot-driven trucks in a tunnel. When the robots fail to crush him with their vehicles, they call in back-up, and soon Spooner's vehicle is swarming with dozens of machines, up to 160 units at the climax of the chase.

Live-action plates were shot on a greenscreen stage with Smith performing the action in a car mock-up. The exterior of the vehicle was later completely replaced by a digital version and inserted into a CG tunnel. The timing of the animation was critical as the spacing of the light fixtures in the virtual environment had to be synchronized with the interactive light effects that had been used in the live-action photography. Working in Maya, animators created the complex choreography of the two trucks attempting to crush the car, and the movements of up to 50 robots at a time.
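That synchronization constraint can be made concrete with a back-of-envelope calculation: if the on-set interactive lights pulsed at a fixed frame interval, the CG tunnel's fixture spacing has to match the car's speed so each virtual light sweeps past on the same frames as the practical flashes. All numbers below are illustrative, not from the production:

```python
# Hypothetical worked example of the tunnel-light timing constraint.

FPS = 24  # film frame rate

def fixture_spacing(car_speed_ms, pulse_interval_frames):
    """Distance between tunnel light fixtures so that one fixture passes
    the car per on-set light pulse."""
    seconds_per_pulse = pulse_interval_frames / FPS
    return car_speed_ms * seconds_per_pulse

# e.g. a car doing 40 m/s with a practical flash every 12 frames:
print(fixture_spacing(40.0, 12))  # -> 20.0 metres between fixtures
```

Working the numbers this way, the environment is fitted to the photography rather than the other way around, which is why the animation timing was critical.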

Joe Letteri, visual effects supervisor for Weta, also created the digital Chicago skyline.

Creating the Future

Besides robot sequences, Letteri's crew also tackled the creation of the futuristic Chicago skyline. Working from production paintings by Tatopoulos, Weta Digital used live-action plates of the Windy City to form the basis of the shots whenever possible. The first step was to map out a skyline in sketch form and submit it to Proyas. The concept was then blocked in digital form, once again submitted to Proyas, and finally redone in high resolution. "We built about 30 high-resolution buildings for the foreground," Letteri explains. "In the mid-ground, we had medium-resolution buildings that were built from pieces of the main structures. By reorganizing the pieces and moving the buildings around, our cityscape would never look the same."
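The reuse strategy Letteri describes is essentially kit-bashing: mid-ground buildings assembled from shuffled pieces of the hero structures. A toy sketch with invented piece names, not Weta's actual asset library:

```python
# Hypothetical sketch of kit-bashing mid-ground buildings from a small
# library of hero-building pieces.

import random

HERO_PIECES = ["base_A", "base_B", "tower_A", "tower_B", "crown_A", "crown_B"]

def kitbash_building(rng):
    """Assemble one mid-ground building from a base, a tower and a crown."""
    base = rng.choice([p for p in HERO_PIECES if p.startswith("base")])
    tower = rng.choice([p for p in HERO_PIECES if p.startswith("tower")])
    crown = rng.choice([p for p in HERO_PIECES if p.startswith("crown")])
    return (base, tower, crown)

def build_skyline(n, seed=7):
    rng = random.Random(seed)
    return [kitbash_building(rng) for _ in range(n)]

skyline = build_skyline(100)
print(len(skyline))
```

Even a small piece library multiplies into many combinations, and because repeated combinations are repositioned and reoriented in the scene, the skyline reads as varied from shot to shot.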

The virtual Chicago was populated by extras that were either actors shot against greenscreen for tight shots, or CG models animated in Massive for long shots. "We also had six to eight different CG cars and about six truck models," Letteri adds. "There again, by mixing pieces, we got enough vehicles to make a whole fleet." Weta Digital also executed many digital set extensions, most notably on the glass-walled lobby of the US Robotics building, a responsibility shared with Digital Domain. "All the robot action was executed in keyframe animation," Letteri notes. "The movements were too extreme to be motion captured. In several shots, Will Smith himself is a digital double. We used a CG model that was provided by Digital Domain (a very nice model, by the way). For the explosion that concludes the sequence, we had originally planned to use a miniature, but the action turned out to be too fast and too extreme for this approach. The explosion was eventually done in CG with particle systems and a series of tools that we had developed for the destruction of Mount Doom in Return of the King."

Although it's still hard to tell if Sonny will have the lasting impact of a C-3PO or a T-800, Proyas, Tatopoulos and the crews at Digital Domain and Weta Digital have definitely succeeded in creating a unique character. "What I'm most proud of is that Sonny is the third biggest character in the movie and he's completely digital all the time," Jones concludes. "It was quite a challenge, as I, Robot was Digital Domain's first venture into large-scale character animation."

Alain Bielik is the founder and special effects editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications and occasionally to Cinefex.
