Rob Powers Talks VAD and More 'Avatar'

Read how the Virtual Art Department became one of the industry's game changers as a result of Avatar.

Avatar VAD (Rainforest Gorge): virtual environment modeled and textured in LightWave 3D and displayed in a realtime OpenGL pipeline using MotionBuilder. All images © Twentieth Century Fox.

With Avatar's Blu-ray and DVD release tomorrow from Twentieth Century Fox Home Ent., it's an opportune time to catch up with Rob Powers. He created and supervised the Virtual Art Department (VAD) for Avatar, and was part of the primary team that pioneered the new director-centric approach to shooting within a virtual, non-linear workspace. Powers and the VAD team used NewTek LightWave 3D as the main tool for developing the environmental assets. Powers, who recently joined NewTek as director of Entertainment and Media Development, discussed the creation and significance of the VAD.

Bill Desowitz: Tell me about the crucial role of the Virtual Art Department on Avatar.

Rob Powers:

In creating the Virtual Art Department for this film, I saw that this technology, which Jim [Cameron] was able to push in a way that really leverages creativity, was also instrumental in the design. It was so immensely useful to the art department's production designers and art directors. For example, I was able to do a really quick mock-up of the inside of Hometree, then go in there with an art director and walk around with a virtual camera, find an angle that he liked and grab it as an image through a real-world camera lens. That's the real catch here: it was basically taking film production into consideration in the design process. And we did hundreds of painting variations. So it actually fed the design of the Hometree, and we eventually ended up with a set that Jim was able to walk through. And art direction and production design were totally a part of it, because we could take Jim on virtual location scouts to shoot in the workspace where these environments were created. Anything could be selected, anything could be moved or replaced. I worked really hard in the Virtual Art Department to organize things so that we could come up with a system where, if Jim didn't like a certain plant in an environment, he could swap it out with any plant in our library that he chose.

BD: What was something you uniquely created in VAD?

Avatar VAD (Hometree Ground Level): virtual environment modeled and textured in LightWave 3D and displayed in a realtime OpenGL pipeline using MotionBuilder.

RP:

There were some places where they'd be doing stunt capture, or Jim would be doing some kind of capture, and there wouldn't be an environment created yet. So we were faced with a capture file without an environment, and I came up with a system called Character Emanation, or "snowshoes," where the characters' motions would actually generate their environment.

BD: How was this achieved?

RP:

Using a particle system that drew from our asset library as the characters moved through the space, they would generate the environment themselves. There was one scene where Zoe [Saldana's] footsteps created the branch from the tree as a result of her movements. So Zoe's performance actually built the environment. I don't think this has ever happened in the history of film that I know of. So once her body movement created the branch, I baked it out with the branch system as a complete asset and it went all the way to Weta.
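The core idea Powers describes, spawning environment assets from a library along a performer's captured motion path, can be sketched in a few lines. Everything below (the function name, the spacing threshold, the asset names) is a hypothetical illustration of the concept, not the actual Avatar pipeline or its tools:

```python
import math
import random

def emanate_environment(motion_path, asset_library, min_spacing=1.5):
    """Spawn assets along a captured motion path ("Character Emanation" sketch).

    motion_path: list of (x, y, z) positions read from a capture file.
    asset_library: names of environment assets to choose from.
    min_spacing: minimum distance between consecutive spawned assets.
    Returns a list of (asset_name, position) placements that could then be
    baked out as a static environment asset.
    """
    placements = []
    last_spawn = None
    for pos in motion_path:
        # Spawn an asset whenever the performer has moved far enough
        # from the previous spawn point.
        if last_spawn is None or math.dist(pos, last_spawn) >= min_spacing:
            placements.append((random.choice(asset_library), pos))
            last_spawn = pos
    return placements

# A straight 4.5-unit walk sampled every 0.5 units spawns an asset
# roughly every min_spacing units along the path.
path = [(i * 0.5, 0.0, 0.0) for i in range(10)]
spawned = emanate_environment(path, ["branch", "fern", "vine"])
```

Once the performance has generated the placements, "baking it out" amounts to freezing that list into a static scene asset, which is what Powers says was handed on to Weta.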

BD: You worked on Tintin for Spielberg after Avatar. What was that experience like?

Avatar VAD (Hallelujah Mountains): virtual environment modeled and textured in LightWave 3D and displayed in a realtime OpenGL pipeline using MotionBuilder.

RP:

It was very different. First of all, most of the sets on Tintin were architectural-based, so they would be realtime representations of a room, which is a lot easier than organic jungles and plants and the transparency channels they required. But more importantly, I think, every director implements and uses the technology in a different way because they work differently. That was probably the major difference. Steven took really well to the virtual camera and really enjoyed that. But his style is quite a bit different from Jim's. Jim is extremely meticulous, hands-on in every detail of production design -- everything. But a larger part of Tintin was done for Peter Jackson at Weta. It's Peter's style and his way of doing things, and he's different as well. The first week when we were shooting here in Los Angeles, Bob Zemeckis came over to see it. As far as I know, he never came over to Jim's set when I was there. But he was shooting right next door doing A Christmas Carol. It was very interesting to hear them talk on set. Talk about differences in workflow.

BD: How did your workflow work on Avatar?

RP: We would start with at least one matte painting of an environment from one angle. It was enough of a challenge to turn that into a 360-degree environment. That was usually the workflow. But sometimes we figured it was easier to create it directly in this environment. There was one particular environment (Jungle/Biolume) that was born in the Virtual Art Department: it never existed as a painting. And I think that's fascinating, because it won the VES Award for best environment. I basically laid it out as a first pass and sent it into the volume; Jim walked through it, did his production design and changed a few things. But because those anemenoids were there when they went to shoot, it was something the actors could respond to. It wasn't in the script, and I think Sam Worthington saw it and made suggestions [about touching them], and Jim loved it and it's in the movie. That environment represented two things for me: it served as a legitimate design space from inception, and it showed how giving the actor and director something to play off of made the scene better, and that is what is so amazing about Avatar. It was a living, immersive moment. I think this is a big shift for the positive. It's about making that final experience fantastic. I was on the set when Zoe did two of her major performances, where she emoted so much on stage that you could hear a pin drop. And then when I saw the movie, it was the same performance and it affected me the same way.

Avatar VAD (Burial): virtual environment modeled and textured in LightWave 3D and displayed in a realtime OpenGL pipeline using MotionBuilder.

BD: But how is this now going to change the industry?

RP:

It is scalable. I think what people should know is that there was an awful lot of work that occurred on Avatar before the greenlight and off the stage. I had a very small system in my room that was accessible at all times to anybody who needed to go into that virtual space without using the whole stage. So the technology is scalable from a large motion capture stage, a large volume, down to much smaller systems. It's directly relevant. I think it's going to become more relevant to television productions, independent films and smaller films as we move forward, because once they realize how much time and money they save when they go through the process in this virtual workspace, you get things worked out beforehand. That's what Rick Carter was able to do with all the live-action sets that he built in New Zealand. Every live-action set had a counterpart in the Virtual Art Department. He was able to walk through and look at those and make determinations and adjustments before a set was ever considered to be built. So it's really a place for people to vet their thinking and their design process, and to do it in a production-relevant way, which means with real-world cameras in realtime. And the way the interface works, you don't have to worry about computer software programs -- you can come in and do a virtual walkthrough with a controller in your hand and move it. So the faster you get into this environment, the fewer problems you have. And that's all this really is.

BD: So what's it like being at NewTek?

RP:

NewTek is fascinating because I've never worked in development before. The thing that struck me is that the company has always made cutting-edge products that are affordable, all the way through the TriCaster, which takes the concept of the whole studio production van and puts it in a little box in HD; the same thing with LightWave. So enabling the artists through technology is what attracted me to the company. And when I looked at what I was able to contribute on Avatar, it was essentially the system that was put in place for Jim to enable the artists. And it was the technology that was put in place that allowed something that wasn't possible before. So it's the same thing that I feel about the technology at NewTek, which is why I'm so excited: the realtime stuff and the interactive rendering technology and some of the interchange stuff in an industry-standard workflow. There were very specific reasons that we chose to model most of the assets in the environments in LightWave. When we started in 2005, the FBX plug-in support for LightWave was much stronger than in many other applications. It's a great workflow…and I needed a workflow that got me from A to B with no questions.

Bill Desowitz is senior editor of AWN & VFXWorld.
