ANIMATION WORLD MAGAZINE - ISSUE 4.11 - FEBRUARY 2000

The Future Of Motion-Capture Animation:
Building The Perfect Digital Human

by Laura Schiff

[QuickTime movie: Webbie talks the talk. © Giant Studios.]

"I'm an avid surfer and I love to dance," the Elite model told WABC-TV Eyewitness News last December, adding that she also enjoys sports and working out at the gym. Who is this athletic beauty? None other than Webbie Tookay, the world's first virtual model, making her digitally animated debut via satellite feed. Webbie is the brainchild of Illusion2K, the "virtual management" division of Elite Models Inc., and Giant Studios, a motion-capture and performance animation studio in Atlanta. Webbie's television "interview" served as a marketing tool to demonstrate the capabilities of Biomechanics Inc.'s cutting-edge 3D motion-capture system, called Motion Reality. Though the low-resolution version of Webbie that we saw was crude and rudimentary, Giant insists that photo-realistic "digital humans" are right around the corner. They should know: they're the exclusive worldwide reps for the Motion Reality system.

The Next Step
Motion Reality is the leading motion-capture animation software to go beyond the limitations of visual markers, evaluating and using as much information as possible about the motion source it's capturing, be it a person, an animal or an animatronic rig. "The system is optically based, and we do use reflective markers [to track motion]," says Matt Madden, Giant Studios' Director of R&D (research and development). "That's certainly a big part of the capture process and important to calculating motion, but there's only so much information you can get from a set of markers. What do you do when those markers aren't visible? For example, when you make a fist, you can't see the bottom part of your fingers, but you certainly have a real good idea of where those fingers are. That kind of intuitive information is what we put into the software. It becomes smarter, essentially."
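Giant has not published how Motion Reality fills in markers the cameras lose, but the idea Madden describes can be sketched in a few lines. The Python snippet below is purely illustrative — the function name, marker names and numbers are all assumptions — and it shows only the underlying principle: an occluded point can still be placed because its bone length is fixed and its parent joint is still being tracked.

```python
# Illustrative sketch only -- not Giant Studios' code. If a fingertip marker
# vanishes when the hand closes into a fist, the software can still place it:
# the bone length is known from calibration, the knuckle is still tracked,
# and the finger's direction changes smoothly from frame to frame.
import numpy as np

def estimate_occluded_marker(parent_pos, last_direction, bone_length):
    """Place a hidden marker at a fixed bone length from its parent joint,
    along the direction the bone was last seen pointing."""
    direction = last_direction / np.linalg.norm(last_direction)
    return parent_pos + direction * bone_length

# Hypothetical values: the knuckle is visible, the fingertip is not.
knuckle = np.array([0.42, 1.10, 0.35])    # world position in metres
last_dir = np.array([0.10, -0.60, 0.20])  # bone direction from the last frame
finger_length = 0.045                     # calibrated bone length in metres

print(estimate_occluded_marker(knuckle, last_dir, finger_length))
```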

To accomplish this, the Motion Reality software uses a detailed algorithmic model to define the kinetic properties of the motion source. Explains Madden, "We get a clear understanding of what the person or source is comprised of, right down to a person's exact bone length, for example. We have a specific scaling process so that the software can figure out the specific dimensions of the subject that it's capturing. It has to know all those lengths and ranges of motion and even other things like connective tissue and ways to stabilize this skeleton. It evaluates all of [a source's] movement properties, as well as the different forces involved in generating those movements... and then our software tracks and creates skeletal transformations. This gives us more information for enhancing or modifying the motion" through 3D animation.
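The exact scaling process is proprietary, but "figuring out the specific dimensions of the subject" can be illustrated with an assumed example: estimate a bone's length from a calibration take by measuring the distance between the markers at its two ends over many frames. The marker names and numbers below are hypothetical, and a real solver would fit a whole skeleton at once rather than one bone at a time.

```python
# Assumed illustration of subject scaling: recover a performer's forearm
# length from a short calibration capture. Real systems fit a full skeleton;
# the median distance here just shows the principle.
import numpy as np

def estimate_bone_length(frames, marker_a, marker_b):
    """frames maps a marker name to an (n_frames, 3) array of positions."""
    distances = np.linalg.norm(frames[marker_a] - frames[marker_b], axis=1)
    return float(np.median(distances))   # robust to the odd mistracked frame

# Toy calibration data: elbow and wrist markers over three frames.
frames = {
    "elbow": np.array([[0.0, 1.2, 0.0], [0.1, 1.2, 0.0], [0.2, 1.2, 0.1]]),
    "wrist": np.array([[0.0, 0.9, 0.0], [0.1, 0.9, 0.1], [0.2, 0.9, 0.1]]),
}
print(estimate_bone_length(frames, "elbow", "wrist"))   # roughly 0.30 m
```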

[QuickTime movie: preparation for a motion-capture session. © Giant Studios.]

Another major advantage of the Motion Reality system is that it operates in real-time. "There are other optical systems that operate in real-time," says Madden, "but they are very limited in the type of motion and the number of characters they can capture. I've only seen them capture one character. With Motion Reality, we are doing three characters in real-time and are working on capturing four for a project due this spring. We can also do on-set captures, which means motion-capture on a live studio set, so you can capture and direct both the live actors and CG characters interacting in real-time. The animator or director is getting real-time feedback on how the end product looks, and they can evaluate that on the fly and say, 'Okay, that's not quite what I'm looking for,' then modify it for the operator or engineer. So you're creating and defining and developing a new style for that particular character in real-time. I am sure that no one else has this capability right now."
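None of the Motion Reality pipeline is public, so the snippet below is only an assumed outline of the feedback cycle Madden describes: capture a frame, solve the skeleton, retarget it onto the CG characters, show the director the result and fold their notes back in before the next frame. Every name in it — the stub functions, the character list, the "style" parameter — is invented for illustration.

```python
# Assumed outline of a real-time, on-set capture loop -- not the actual
# Motion Reality pipeline. The stubs stand in for camera I/O, skeleton
# solving and retargeting; the point is that direction changes take effect
# on the very next frame.
import time

def capture_frame():
    """Stand-in for reading one frame of labelled marker data."""
    return {"markers": {}}

def solve_skeleton(frame):
    """Stand-in for fitting the calibrated skeleton to the frame's markers."""
    return {"joints": {}}

def retarget(skeleton, character, style):
    """Stand-in for mapping the solved skeleton onto a CG character,
    shaped by style parameters the director can adjust live."""
    return {"character": character, "pose": skeleton, "style": dict(style)}

characters = ["webbie", "extra_1", "extra_2"]   # three characters at once
style = {"exaggeration": 1.0}

for _ in range(3):                              # a few frames of the loop
    frame = capture_frame()
    skeleton = solve_skeleton(frame)
    poses = [retarget(skeleton, c, style) for c in characters]
    # A real system would now draw `poses` in the director's viewport.
    style["exaggeration"] = 1.2                 # a note from the director...
    time.sleep(1 / 60)                          # ...applies next frame (~60 fps assumed)
```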



Note: Readers may contact any Animation World Magazine contributor by sending an e-mail to editor@awn.com.