Henry Turner investigates the newest developments in motion capture and motion control, which bring the technology on-set.
Alberto Menache of Sony Imageworks is enthusiastic about his work as senior CG supervisor on The Polar Express. "We are really pushing the envelope of what can be captured. We have multiple stages where we are capturing data with almost 80 cameras at a time. It is incredible. On one set we were tracking bodies and faces together, in 360 degrees. This is something that has never been done before."
Set visitors would be startled by the look of the production. "All the actors were walking around in color-coded spandex suits. We had the wardrobe department dealing with the spandex suits and the body markers, and we had the makeup department dealing with face markers. The actors would show up at 5:30 in the morning and go to makeup to be fitted with face markers, and then go to wardrobe, where they got their spandex and body markers. The makeup and wardrobe people were trained to know where the face and body markers had to be. Not only that, they were trained to check between shots that all the markers were still in the right positions."
The shooting ran like a real, live-action movie. "The cameras are 1,000-line digital cameras that have a light ring around the lens. The light ring shines red light onto the markers, which are covered in Scotchbrite, and the camera receives the reflection back. What the software does is discard everything that is not highly reflective, and then it does some image processing on the frames. You really only need two or three cameras to see a marker; the reason you want a lot of cameras is redundancy, to create a fully 3D figure from any angle. This is the state of the art in motion capture systems today."
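The triangulation step described above, recovering a marker's 3D position once at least two calibrated cameras can see it, can be sketched with the standard linear (DLT) method. This is a minimal illustration under simplifying assumptions, not code from any actual motion capture system; it assumes NumPy, ideal noise-free measurements, and known 3x4 projection matrices for each camera, and the function name is made up for this example.

```python
import numpy as np

def triangulate_marker(P1, P2, uv1, uv2):
    """Recover a marker's 3D position from two camera views.

    P1, P2 : 3x4 projection matrices of two calibrated cameras.
    uv1, uv2 : the marker's 2D image coordinates in each view.
    Builds the linear DLT system A @ X = 0 and solves it via SVD;
    the smallest singular vector is the homogeneous 3D point.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]              # null-space vector of A
    return X[:3] / X[3]     # de-homogenize to (x, y, z)

# Two cameras: one at the origin, one shifted one unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])
# A marker at (1, 2, 10) projects to these image points:
print(triangulate_marker(P1, P2, (0.1, 0.2), (0.0, 0.2)))
```

With more cameras, more row pairs are simply stacked into A, which is exactly the redundancy the quote mentions: extra views make the solve robust when some cameras lose sight of a marker.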
"In the future we will be able to track in realtime the actors and the environment and all the props on location, which we can't do now because all the optical systems are too sensitive to lighting. If you light your set with 10K lights, the current systems stop looking at the markers. That's why we do motion capture in a studio in most cases. But in the future, we will be doing it on location. There is some developing technology that will allow us to track without the need for markers. We will be able to collect data throughout the whole shoot without worrying about it; it will be tied to the time code, and we will be able to use it later."
Tim Alexander (left) of ILM covered all different angles and light conditions for the backgrounds for this bluescreen work of Viggo Mortensen running on top of a wall in Hidalgo. © 2004 Touchstone Pictures. All rights reserved. Courtesy of Industrial Light & Magic.
Meanwhile, Industrial Light & Magic's Tim Alexander, visual effects supervisor on Disney's Hidalgo, used his ingenuity to find ways of creating preliminary composites while shooting the film. "We were in Morocco, and I had my PowerBook with me and a digital still camera. I would take stills of locations where we planned to shoot our plates, or at the location where they were shooting that day. I would then take those digital photos back to the hotel room along with video from the video tap. The video assist used miniDV, so I just borrowed the deck and plugged it right into my Mac. I imported the footage into Final Cut Pro, and then composited the shots, sticking them together on the computer using Combustion. Then, when we went to nightlies, I could show the images I created to [director] Joe Johnston, so he could buy off on a location, or see how a composite might be going together."
"In one case, Joe knew that he was going to come back and shoot this bluescreen sequence of Viggo Mortensen running on walls. He wanted us to cover the backgrounds for that sequence while we were in Morocco. So, not really knowing what the light angle was going to be, or exactly what he wanted, we started shooting tons of stills. I thought Joe was probably going to want to do a dolly move; he'd want to be running with Viggo. The problem is that if you do a dolly move or a booming move, there are a lot of perspective shifts going on, which you don't get from a still. So we mocked up different camera moves on the still photos. Once Joe bought the idea that we could move these backgrounds in a 3D way, we could go out and cover all different angles and lighting conditions, so we were prepared when we came back and actually shot the bluescreens."
ILM has created previs systems allowing filmmakers to preplan complex scenes on the spot. "We have technology that can put in the digital character or background, all in realtime during the shooting. The system tracks the camera so it knows what the angle is, and then renders the CG character at the right perspective and angle, so you can see how well everything is lining up. With this, you can test motion capture and the angle of the background. They used it on A.I. for the City of Sin; a lot of that city was computer-generated. We had a low-res model of the city in the computer, and in realtime we could go around and pick camera angles and then move the 3D model as well, to get nice dramatic scenery. Then, based on that camera information, we put in the real CG city. Despite the low-res model, the system gives a much better sense of what the composition of the shot will be like. It's far superior to just looking at bluescreens. I've seen uses of it where they are doing motion capture and they actually previs the motion capture into the plate. This way you can see how that's going, in real time, as you are working on it, so that if the performance isn't just right, you'll know at that moment."
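The realtime overlay described here comes down to one operation repeated every frame: projecting points of the CG model through the tracked camera into the video image. The sketch below shows just that bare math, not ILM's actual system; the intrinsics, pose, and point values are invented for illustration.

```python
import numpy as np

def project(K, R, t, X_world):
    """Project a world-space point into pixel coordinates.

    K : 3x3 camera intrinsics (from lens calibration).
    R, t : camera rotation and translation (from the realtime tracker).
    """
    X_cam = R @ X_world + t          # world space -> camera space
    u, v, w = K @ X_cam              # apply intrinsics
    return np.array([u / w, v / w])  # perspective divide to pixels

# Illustrative values: a 128x96 image, camera sitting at the origin.
K = np.array([[100.0,   0.0, 64.0],
              [  0.0, 100.0, 48.0],
              [  0.0,   0.0,  1.0]])
R, t = np.eye(3), np.zeros(3)
print(project(K, R, t, np.array([1.0, 0.0, 2.0])))  # [114. 48.]
```

Each frame, the tracker updates R and t, every visible vertex of the low-res model is pushed through this projection, and the result is drawn over the video feed; that is how the on-set composite stays lined up as the camera moves.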
A Passion for Details
For the effects work on The Passion of the Christ, visual effects supervisor Ted Rae used motion control techniques in instances where he felt that motion capture was too limited. The sequence where Jesus is scourged in the courtyard involved digital effects. "At times, the wounds that are revealed were makeup applied to Jim Caviezel's body and filmed on set. After the film was edited, we covered up the wounds with digitally painted pieces or with real skin elements of a stand-in that were tracked and warped to match Jim's movements. We then roto-wiped those skin elements off, to reveal the wounds that were already there."
To add these elements, it was necessary to recreate the scourging scene in the studio. Rae chose to use motion control to track the camera movement. Caleb Deschanel's shooting style keeps the camera moving almost constantly; for visual effects, that style can be problematic unless things are well planned and lots of information is gathered on set. "When Jesus was moving and the camera was moving, it created a lot of axes to start tracking in post. As far as I know, there is no tracking software that can do it unless you motion capture, and I don't know of any motion capture system that can track a small portion of a human body with the accuracy that we needed. So there were two problems we had to solve. The first was documenting the camera movement, and the second was deriving the information necessary to recreate how Jesus was moving. I knew that by using motion control, we could accurately recreate the camera movement, down to a thousandth of an inch. So we took that data and recreated the courtyard set-up back here on stage. Mind you, this was not the set-up for the entire courtyard, but only the area involving Jesus' body. The makeup was then reapplied to a stand-in's back. Then, while watching playback of the motion control camera's video tap, the stand-in pantomimed the movements Jim had done on set. We blended separate takes together if necessary, creating photographic elements running in realtime with the same camera speed, shutter angle, lighting, color temperature and makeup. Only then did we manipulate those images digitally during compositing."
To give the stand-in a guide for his movements, an interesting superimposition system was set up, which amounted to a sort of realtime previsualization. "In addition to Caleb's photography, we also shot Jim's body with three commonly synced digital cameras. When these images were played back in the studio, the stand-in could see the playback, mixed with the feed from the video tap on the camera shooting him. The stand-in could see his own body on top of Jim's; he basically saw himself matted on top of Jim. We looped the playback, because we found that if he could go through the motion and do it repeatedly, by the time we got to the seventh or eighth take he was very, very close to matching Jim's movements."
Cinematographer Allen Daviau combined previs and Cablecam for this shot in which Dracula's brides fly in Van Helsing. © Universal Pictures 2004.
A Veteran Speaks
The great cinematographer Allen Daviau, whose latest work will be seen in Van Helsing, maintains that while previs is a useful tool, it does not eliminate spontaneity on location. "On something like Van Helsing, you naturally have very detailed planning. We had animatics that showed what was to happen in a scene, such as when Dracula's brides come flying into a village. We were working with some very complex shooting systems, like the Cablecam, which takes a lot of rigging to set up. With Cablecam you can literally make the camera fly; it is a wonderful device, but a big deal to set up. By having the previs, everybody can see what needs to be done. But you always make alterations during production, because you discover something better in the course of shooting."
Henry Turner is a writer and award-winning filmmaker, whose Lovecraft-inspired horror feature, Wilbur Whateley, won top awards at the Chicago International Film Festival. His writing on film has appeared in the Los Angeles Times, L'Écran Fantastique, Variety and many other publications. A longtime film festival executive, he has programmed for the Slamdance Film Festival, and currently heads FilmTraffick L.A.