'Avatar' and the Future of Digital Entertainment Creation

Autodesk's Marc Petit offers an inside perspective on the significance of Avatar.


Avatar breaks the barrier between live action and digital moviemaking, changing the way VFX movies are made and experienced. All images courtesy of Twentieth Century Fox.

The production of Avatar highlights some fundamental evolutions in moviemaking. Although a lot has already been written about the movie, its production is an important milestone for the industry. It means a lot to us at Autodesk in particular, as it showcases production requirements that support the directions we have been taking in developing our products. Avatar pioneers virtual cinematography, features believable characters and delivers a fully immersive stereo experience. More than 15 of the most prestigious VFX and post-production companies around the world collaborated closely to make this movie -- and also enlisted a world-class games studio to produce a title that extends the story into the interactive world. Together, they pioneered virtual moviemaking in the process.

Marc Petit.

Previs, Storytelling and Virtual Cinematography

Avatar marks the first time a director, James Cameron, has been able to direct computer-generated and live actors in real time, in digital environments. The virtual cinematography setup developed by the team at Lightstorm is probably the most advanced we've seen so far. It allowed extensive capture of the actors' performances, including motion, facial animation and eye direction. The actors and the director saw the results live as MotionBuilder captured and mapped the actors' performances to their virtual counterparts and rendered them in real time. As soon as each performance was approved, the shots could be sent over to Weta for final animation, rendering, lighting and compositing. An added advantage of this digital workflow is that shots could be altered after the fact, giving the team more flexibility to test creative decisions for lights, cameras or even editorial.

With virtual moviemaking, after the performance is captured, the director can change or refine camera moves and angles. The performance is played back on a monitor/viewfinder that the director operates just like a physical camera, and those camera moves are captured by MotionBuilder as well. The result is amazing creative flexibility -- actors' performances and camera work can be decoupled!
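The decoupling described above can be sketched in a few lines of plain Python. This is a hypothetical data model for illustration only, not MotionBuilder's actual API: a captured performance is a list of per-frame poses, and each "camera take" simply pairs that unchanged performance with a different camera move.

```python
# Hypothetical illustration of decoupled performance and camera passes
# (names and data layout are invented; this is not MotionBuilder's API).

# An approved, locked performance: per-frame poses of the actor's skeleton.
performance = [
    {"frame": f, "root_pos": (0.0, 0.0, f * 0.1)}  # actor walks forward
    for f in range(1, 5)
]

def camera_take(performance, camera_path):
    """Pair the same approved performance with a new camera move."""
    return [
        {"frame": pose["frame"], "subject": pose["root_pos"], "camera": cam}
        for pose, cam in zip(performance, camera_path)
    ]

# Two different camera passes over one unchanged performance:
wide = camera_take(performance, [(0, 2, -10)] * 4)                      # locked-off wide shot
dolly = camera_take(performance, [(0, 2, -10 + f) for f in range(4)])   # push-in
```

Because the performance data never changes, the director can keep recording new camera takes (the `dolly` pass above) long after the actors have left the stage.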

Avatar, of course, relied heavily on previs. All of the sets and characters were built in Maya and available in MotionBuilder, so a lot of preparation work including planning and lighting could be achieved virtually by the cinematographer, Mauro Fiore, long before any set was constructed on stage in New Zealand. In fact, 18 months of shots were captured before the team went to shoot on stage.

When filmmakers work virtually, captured shots can be assembled and played back together to form a "digital prototype" of the sequence (or even of the movie). This prototype gives crucial visual context to inform important creative decisions, either editorial or cinematographic. Many ideas can be tried without requiring an expensive shooting crew; and more iterations mean better results. The decision-making moment can also be extended further downstream, which is great for creative freedom but might put a heavier burden on the post-production teams as it favors last-minute changes. 

These important improvements benefit actors, cinematographers and directors, who receive better visual feedback to perfect their craft; and given the pace of improvement in CPUs and GPUs, we can expect this real-time feedback to arrive more quickly and at a much higher level of fidelity. Already, the latest releases of our creative software applications show significant improvements in the speed and realism of interactive 3D viewports.

Believable Characters

In Avatar, Zoe Saldana plays the heroine Neytiri. She introduces us to the fantastic world of Pandora, warms us to it, and is key in getting us to side with Pandora's Na'vi natives. Her romance with Jake Sully (Sam Worthington) is at the heart of the story. Neytiri is a great traditional movie character, and Saldana delivers an emotion-filled performance, on par with the best of traditional film, that transports us into the story. This achievement represents another breakthrough from the wizards at Weta.

Back in 2002, Andy Serkis' performance gave life to Gollum in The Lord of the Rings. Although motion capture had been around for some time, the breakthrough was that Serkis and Weta transferred both his motions and his emotions to the computer-generated character. With Avatar, Joe Letteri and his team pushed the boundaries and gave us a believable love story between relatable virtual characters. Technology contributed here too, as a new muscle system was implemented in Maya to better exploit the data coming from the capture, resulting in a unique level of realism for facial animation.

Avatar also crossed the Uncanny Valley while advancing "pure cinema."

After Davy Jones in Pirates of the Caribbean: Dead Man's Chest from ILM in 2006 and The Curious Case of Benjamin Button last year from Digital Domain, augmented performance is proving very successful in bridging the uncanny valley for digital doubles and human-like virtual characters. I'm looking forward to seeing an actor recognized for his or her performance on a virtual character like Gollum or Neytiri.

Immersive Experience

Among its many "firsts," Avatar is arguably the first mainstream live-action movie to be made fully in stereoscopic 3-D (S3-D). Until now, most S3-D movies were stereo versions of CG animated movies. Creating a live-action S3-D movie is more difficult and more expensive, and there was doubt as to whether moviegoers would enjoy the immersive experience of S3-D for a 2.5-hour movie. Avatar has proven that live-action S3-D movies are viable and profitable. Many of the companies involved in Avatar use Maya as a core tool in their pipelines, and it has become a leading S3-D production tool thanks to a very flexible stereo camera rig co-developed with DreamWorks.
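The geometry behind a stereo camera rig can be illustrated with a short sketch. This shows a common off-axis (parallel) formulation -- not the specific Maya/DreamWorks rig -- and the function name and units are assumptions made for the example: each eye is offset by half the interaxial distance, and the film back is shifted so both frustums converge at a chosen distance, which places objects at that distance on the screen plane.

```python
def stereo_pair(center_x, interaxial, focal_mm, convergence_dist):
    """Off-axis stereo rig sketch (hypothetical helper, illustrative only).

    Offsets left/right cameras by half the interaxial distance and
    computes the horizontal film-back shift that makes the two view
    frustums converge at convergence_dist (same length units as
    interaxial; focal_mm sets the units of the returned film offset).
    """
    half = interaxial / 2.0
    # Horizontal film-back shift: similar triangles between the eye
    # offset at the convergence plane and the shift at the film plane.
    film_shift = half * focal_mm / convergence_dist
    left = {"x": center_x - half, "film_offset": +film_shift}
    right = {"x": center_x + half, "film_offset": -film_shift}
    return left, right

# Roughly human interocular spacing (6.5 cm) converging 7 m away:
left_eye, right_eye = stereo_pair(0.0, 6.5, 35.0, 700.0)
```

Keeping the camera axes parallel and converging via film-back shift (rather than toeing the cameras in) avoids the vertical parallax that makes toe-in rigs uncomfortable to watch; this is why off-axis rigs are the usual choice in production.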

Convergence of Films and Games

Avatar the Game has been designed as an extension of the movie, providing another way to discover the Na'vi and their Pandora world. Players can fight alongside the RDA Corp. or choose to join the Na'vi, battling to protect Pandora.

The game and the movie have much in common. Ubisoft and Lightstorm worked together with an unusually close level of collaboration and trust. Because both teams used similar tools, it was easy for them to share assets. Beyond that, the movie and the game also share the same creative vision, and James Cameron remained involved to ensure this. Very early on, the games team had access to concept art and early footage from the movie. Ubisoft proposed designs for creatures and vehicles that were approved by James Cameron and sometimes even integrated into the movie.

Ubisoft created an interactive immersive stereoscopic experience for Pandora, thus demonstrating the relevance and the value of S3-D for gaming, but we have yet to see at what pace S3-D screens are adopted by consumers.

Collaborative and Concurrent Workflows

Weta Digital was the primary provider of visual effects and worked alongside 15 other studios that contributed to the movie: ILM (battle), Framestore (Hell's Gate shots), Prime Focus (bio lab, op center), Buf Compagnie (tunnel, earth shots), Hybride (link room) and many others. These companies brought the entire set of Autodesk's digital entertainment creation tools to bear to help bring the story to life.

Cameron's ability to make key creative decisions in the environment alters the role of directors, animators and visual effects houses.

The virtual cinematography process lends itself very well to concurrent workflows and a tighter communication loop between production and post-production -- a workflow very similar to the offline editing/online finishing model found in video production. Lightstorm and Weta traded both FBX files and reference movies to move the MotionBuilder shots over to Maya for finalization.
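A handoff like the one described above is usually tracked with some form of shot manifest. The sketch below is purely hypothetical -- the shot names, file names and manifest schema are invented for illustration -- but it captures the idea: each approved take travels as an FBX file (the scene and animation data) plus a reference movie (what the director approved), so the finishing team can check its work against the approved intent.

```python
import json

# Hypothetical shot-handoff manifest (all names and fields are invented
# for illustration; this is not Lightstorm's or Weta's actual pipeline).
shots = [
    {"shot": "SEQ010_0410", "fbx": "SEQ010_0410_take03.fbx",
     "reference": "SEQ010_0410_take03.mov", "status": "approved"},
    {"shot": "SEQ010_0420", "fbx": "SEQ010_0420_take01.fbx",
     "reference": "SEQ010_0420_take01.mov", "status": "in_review"},
]

# Only approved takes are packaged for the finishing facility.
manifest = json.dumps(
    {"sequence": "SEQ010",
     "shots": [s for s in shots if s["status"] == "approved"]},
    indent=2,
)
```

Pairing the machine-readable animation data with a human-viewable reference movie in one package is what makes the offline/online split workable: the data drives the final render, the movie settles any dispute about what was approved.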

Toward Virtual Production

It's been a long journey since the pseudopod in The Abyss in 1989 marked a major milestone for computer-generated visual effects and the start of technology's onslaught on the movie industry. The production of Avatar gives us another picture of how technology is liberating creativity. Directors can direct computer-generated and live actors in real time, in digital environments. The performances of actors can be augmented and/or transported to virtual characters. The availability of high-quality, high-fidelity prototypes for movies can help creative teams share their vision and have a holistic view of their work.

We on the development side have a lot more work ahead of us to get our software to the point where it truly empowers non-technical creatives to conceive and realize their visions. It is up to us to simplify and democratize these techniques for every filmmaker, especially for the coming generation of directors who are just as comfortable behind the computer as they are behind the camera.

Marc Petit is SVP, Autodesk Media and Entertainment.
