Lucasfilm CTO Cliff Plumer Talks Technology

Barbara Robertson speaks with Lucasfilm CTO Cliff Plumer to find out more about the technology of the future from someone at the forefront of innovation.

Cliff Plumer. All images courtesy of Lucasfilm.

Given the opening of the Letterman Digital Arts Center in San Francisco's Presidio, the new home of Lucasfilm, Industrial Light & Magic and LucasArts, along with subsequent announcements such as HP becoming Lucasfilm's preferred technology provider and ZBrush being added to the ILM pipeline, VFXWorld invited Barbara Robertson to dig a little deeper with Lucasfilm CTO Cliff Plumer.

Barbara Robertson: We've been hearing a lot about new software that you call Zeno. What does it do?

Cliff Plumer: At the core of Zeno is a scene file; that's our proprietary file. We've built all our proprietary tools on top of that platform and integrated commercial tools within it.

In the past, the core of ILM's pipeline was based on Softimage, a commercial 3D application, and that created obstacles because once you have to translate between apps, it increases the complexity of the pipeline. Things didn't translate easily and we'd be stuck in the middle trying to get software programs to support each other. It took a lot of technical administration to manage files and move them between multiple applications.

Now, Zeno is the hub; it handles the file conversion. That's all transparent to the artists.

If someone is working on a model in Maya, they can click on a button that says copy, and then go into Zeno and paste it. It's literally like a cut-and-paste function. Once in Zeno, we have full control over all the files. Some people might think that doesn't sound innovative, but in CG, that's huge. It makes everything more efficient. It's at the foundation of what we changed in the CG pipeline.
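The hub-and-adapter idea Plumer describes can be sketched in a few lines. This is purely illustrative: the class and method names below are hypothetical, since Zeno's actual file format and APIs are proprietary.

```python
# Minimal sketch of a hub-and-adapter pipeline: one central scene format,
# with a per-application adapter doing the conversion into it. Instead of
# writing a converter for every pair of applications, each app needs only
# one adapter to and from the hub.

class SceneAsset:
    """An asset in the hub's native representation."""
    def __init__(self, name, geometry):
        self.name = name
        self.geometry = geometry

class MayaAdapter:
    """Converts an external app's data into the hub format.
    (A real adapter would walk Maya's node graph; here a node is
    just a (name, points) tuple.)"""
    def to_hub(self, maya_node):
        name, points = maya_node
        return SceneAsset(name, list(points))

class Hub:
    """The central scene file: every tool reads and writes this."""
    def __init__(self):
        self.assets = {}
    def paste(self, asset):
        self.assets[asset.name] = asset

# "Copy" in Maya, "paste" in the hub.
hub = Hub()
hub.paste(MayaAdapter().to_hub(("robot", [(0, 0, 0), (1, 0, 0)])))
print(sorted(hub.assets))  # ['robot']
```

With N applications, the hub needs N adapters rather than the N×(N-1) pairwise converters an assembly-line pipeline implies.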

BR: What else changed in the pipeline?

CP: The other key with Zeno is that the tools are modules and we keep developing modules. With a large 3D application, you load all the controls at once, and that's a lot of overhead. With the Zeno architecture, the artists use only the tool needed for a task, whether they're doing lighting, creature rigging or animation. These are all modules in Zeno; it adds a lot more efficiency. And, by having all those things integrated in Zeno, there is more collaboration among different types of artists.

BR: How does Zeno allow more collaboration?

CP: We have what we call asset management and production tracking tools integrated within Zeno. The pipe used to be an assembly line. X would hand the project to Y down the pipeline. If something changed, they had to update the pipeline through every single process manually. We used to have problems with renders: if someone was not using the current version, it would break.

Now, artists can make changes and they are automatically updated throughout the pipeline. If I'm an artist using a number of assets, I don't need a production coordinator to tell me the status of an asset. When an artist works on an asset, it's automatically updated and everyone is using the current version. Instead of having an army of PAs with clipboards, everything is online and automated. We can have artists collaborating on the same file in realtime.
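The automatic-update behavior just described boils down to resolving assets by name at load time rather than pinning a copy. A minimal sketch, with an invented `AssetStore` class standing in for ILM's actual asset-management system:

```python
# Sketch of automatic versioning: downstream tools always ask for
# "latest", so a publish by one artist is picked up by everyone else
# the next time they load -- nobody can be stuck on a stale copy.

class AssetStore:
    def __init__(self):
        self.versions = {}               # asset name -> list of payloads

    def publish(self, name, payload):
        self.versions.setdefault(name, []).append(payload)
        return len(self.versions[name])  # 1-based version number

    def latest(self, name):
        # Resolved at load time, not baked into the scene.
        return self.versions[name][-1]

store = AssetStore()
store.publish("creature_rig", "v1 rig")
store.publish("creature_rig", "v1 rig + fixed shoulder")
print(store.latest("creature_rig"))  # v1 rig + fixed shoulder
```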

The feature wars ended a long time ago. Now, it's about workflow; about getting tools into artists' hands that are integrated under one user interface.

BR: Was this made possible because of faster hardware?

CP: Well, we're not dependent on particular hardware, but we're designing what we want to do to take advantage of fast hardware.

In the past, we were limited by the bandwidth at the old ILM campus. That network was, in general, 100Mb to the desk and 1Gb in the backbone. We've scaled that up by a factor of 10. Every artist has one gigabit to the desk and 10Gb in the backbone infrastructure.

BR: What difference does that make in the way artists work?

CP: Because we have more bandwidth, we can view high-res images efficiently instead of relying on compression techniques. We want to move to a place where we can do more internal videoconferencing so that instead of artists having to meet in the theater, they can view images across the network from different locations. Instead of calling Dennis [Muren] and saying, "Let's meet in the theater," they can make a phone call, sit at their own desks, say, "Take a look at this," and both view images at the same time.

The other big thing in terms of flexibility is that we can move artists around more easily. If we have artists moving from one show to another, they can just pick up their things and go down to a new assignment, log on and their environment is set up. They can log on from any workstation in the facility and start working. They don't have to move their workstation with them.

When the concept of moving to the Presidio came up, we determined it would have taken two weeks; we would have had to close down. So, we came up with a plan to move the facility without any downtime. We've had a 10Gb pipe between ILM in San Rafael and the Presidio and have been moving the back-end infrastructure so there would be no downtime. When artists leave for the weekend, they pack their personal belongings and when they show up on Monday morning, they're working.

If the Letterman Digital Arts Center doesn't inspire creativity, nothing will.

BR: Does everyone get new machines at the Presidio?

CP: They're transitioning. Our three key vendors for the desktop are HP for workstations, AMD for processors and NVIDIA for graphics. Everyone is moving toward 64-bit workstations. Most have dual heads [monitors] today. They will all have dual processors. We're working closely with AMD, our processor vendor, on dual-core processors. We'll get to the stage over the next year where artists will have a dual-core, dual-processor machine (two processor cores on each chip); like four CPUs. Plus, two graphics cards. We're looking at a workstation equivalent on the desktop that's more powerful than what we used to have with a fully loaded SGI Onyx, something ILM once paid hundreds of thousands of dollars for.

Our render farm has about 3,000 processors and we have a proprietary tool that lets unused desktops become part of the render pool at night, so we can scale up to over 4,000 processors.
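The overnight scaling Plumer mentions is simple arithmetic: a fixed farm plus harvested idle desktops. A back-of-envelope sketch; the 3,000-processor farm figure is from the interview, while the desktop counts below are made up for illustration:

```python
# Render pool capacity: a dedicated farm, plus idle artist desktops
# that join the pool after hours. Desktop numbers are hypothetical.

def pool_size(farm_cpus, idle_desktops, cpus_per_desktop, after_hours):
    size = farm_cpus
    if after_hours:
        size += idle_desktops * cpus_per_desktop
    return size

print(pool_size(3000, 550, 2, after_hours=False))  # 3000
print(pool_size(3000, 550, 2, after_hours=True))   # 4100
```

With roughly 550 dual-processor desktops idle overnight, the pool climbs past the "over 4,000 processors" figure quoted above.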

BR: What are some of the ways that will affect production?

CP: It gives artists working with supervisors more interactivity with high-res images. Artists can get to a good first take faster. In the past, we'd look at the production schedule and it would take a lot of time getting to that first take because we didn't have enough bandwidth, and the tools couldn't handle the complexity of the scenes, so they would be broken up into bits: a section of a scene or a section of an asset. Now, we can load all that material into a scene and work interactively, get to the first take faster, and then, working with a supervisor, spend more time tweaking to get quality. That happens just by being able to see everything at once.

We're using the GPU on the NVIDIA cards for preview rendering. Artists don't always have to wait for overnight renders to see a result when they can take advantage of hardware rendering.

Our interactive lighting tool Lux also takes advantage of hardware rendering. Lux is a big step forward in lighting. Our first tools focused on lighting an asset; our next generation was designed to light a shot, all the assets in a shot. Lux is designed to light a sequence, an entire scene. When artists can light a whole scene, they can make sequence lighting decisions rather than individual lighting decisions. TDs can work on the whole sequence rather than break it down into individual assets or individual shots.

BR: Now that all artists on the pipeline can access all the tools, are you seeing TDs using modules other than Lux?

CP: This also gets back to the creative process. Instead of trying to solve every problem with lighting and compositing tricks, the TDs have simple painting and touch-up tools in Zeno. They can paint a highlight with a brush.

Even the previs innovations of the Star Wars prequels will soon be a thing of the past.

BR: What will be the impact of putting LucasArts and ILM on one pipeline?

CP: Now that we're all under one roof, ILM can take advantage of their game engine, and LucasArts artists have access to things we take for granted. Look at something like crowd simulations. They have been big in effects for the last few years, but have been used in games for a long time. We can take advantage of their AI engines, their game engines and integrate them into the visual pipeline.

The big win for ILM, though, is in previsualization. A visual effects supervisor can sit with the director and have a synthetic scene move around in realtime. The director can block in a scene and do a camera move with a virtual camera. It feeds the whole post process.

BR: But that isn't new, is it?

CP: It hasn't been intuitive. Previs in the past has been a scaled-down post-production operation. The director comes in, we make a change and show it to him. What we're saying is, "Let's make this like photography; do it in realtime." This is something we've been developing in conjunction with LucasArts to hand the previs to the director. It's almost like a game.

BR: Who has used it?

CP: It hasn't been used on a film that's been released, but it's in use. It's also still under development. Ask me again in a couple of months.

BR: So does this mean vfx artists will no longer do previs?

CP: The game engine part is designed to work in realtime. The director can plan how to shoot a live-action scene or block a CG scene. Contained in the application are libraries of lenses and so forth. But, we can also record the camera moves, create basic animations and block in camera angles. And instead of handing rendered animatics to the CG pipeline, we have actual files: camera files, scene layout files, actual assets that can feed into the pipeline. It gives the crew input into what the director is thinking.

BR: Was this something George Lucas used for Star Wars?

CP: No. It was driven by Star Wars. It was something George has felt strongly about. But, the tools werent ready for him.

BR: Can you see anyone other than directors and vfx supervisors using it?

CP: DPs might use it as well to place lights and see how a scene could be lit or shot. It isn't solely for directors.

BR: What impact will access to ILMs tools have on LucasArts?

CP: Well, we've literally taken their game engine and integrated it into Zeno. So, they have a bidirectional pipe that works in realtime. They can make editing changes in their engine or within Zeno tools. They can work on an asset in Zeno and export it to the engine, and if they need to edit it, they can edit it within the game engine and it automatically updates in Zeno.
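The bidirectional update described here is essentially two-way change propagation with a loop guard. A toy sketch, with entirely invented names; the real Zeno/engine bridge is proprietary:

```python
# Two sides (game engine and hub) that mirror each other's edits.
# An edit on either side is pushed to the peer; the equality check
# stops the propagation from bouncing back forever.

class Side:
    def __init__(self, name):
        self.name = name
        self.assets = {}
        self.peer = None

    def edit(self, asset, value):
        self.assets[asset] = value
        if self.peer is not None and self.peer.assets.get(asset) != value:
            self.peer.edit(asset, value)  # propagate; guard stops loops

zeno, engine = Side("zeno"), Side("engine")
zeno.peer, engine.peer = engine, zeno

engine.edit("door", "scaled 2x")  # edit made in the game engine...
print(zeno.assets["door"])        # ...is visible in Zeno: scaled 2x
```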

BR: Do you expect the game engine to be used on post-production?

CP: No. Only during early stages in previs. At this stage, the game engine can't hold as complex a scene as we need for film effects. To get the complexity we need, we'd be compromising the realtime performance.

Many of the digital assets used on the Star Wars films were later utilized by LucasArts, which is increasingly becoming the norm with games based on films.

BR: What about sharing assets?

CP: In the case of Star Wars, we did that a bit. I can imagine where we might do that in the future. We might even have different versions of a digital double or synthetic character that's [otherwise] too high-res to work in a game, or a background character that could be used as a hero character in a game. It might even affect how the film is shot. You might do things on set that could then be used as part of the game as well as create textures for the film. If you were going to repurpose those textures for the game, you would do things on set to make sure you have the right photos.

BR: Are there any other new modules in Zeno?

CP: Well, the Zenviro module has a huge application in visual effects. With it, artists can create environments using texturing tricks rather than fully rendered 3D geometry and also do level of detail for geometry. They can go from pieces of 3D geometry with a lot of detail to less detail in realtime based on where the camera is.
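Camera-based level of detail, as described for Zenviro, amounts to choosing a mesh by distance. A minimal sketch; the thresholds and mesh names are made up, and a production system would blend between levels rather than hard-switch:

```python
# Pick a geometry level of detail from the camera distance: detailed
# mesh up close, progressively coarser stand-ins farther away.

def select_lod(distance, lods):
    """lods: list of (max_distance, mesh_name), sorted by distance."""
    for max_dist, mesh in lods:
        if distance <= max_dist:
            return mesh
    return lods[-1][1]  # beyond every threshold: coarsest mesh

CITY_LODS = [(50.0, "city_hi"), (200.0, "city_mid"), (1e9, "city_lo")]

print(select_lod(10.0, CITY_LODS))    # city_hi
print(select_lod(120.0, CITY_LODS))   # city_mid
print(select_lod(5000.0, CITY_LODS))  # city_lo
```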

Also, we've begun doing image-based motion capture. I think the first use of the technology was on Minority Report and we've used it for some digital-double type of work. But now we're at the point that instead of relying on going onto a motion capture stage, a director can capture motion during first-unit photography. We're using HD cameras, but the trick is in the photography.

BR: Would you sell Zeno?

CP: I've been asked. People who have left ILM and gone on to other studios have inquired, but…

BR: But youre not selling the family jewels?

CP: You could say that.

Barbara Robertson is an award-winning journalist specializing in visual effects and computer animation, and a travel writer. Her latest travel story appears in The Thong Also Rises, an anthology published by Travelers Tales.
