
Introducing Plume for Firebending

Read how ILM raised the bar for GPU acceleration on The Last Airbender.


With Plume, all of the simulation and rendering was done with the latest NVIDIA GPU technology. All images courtesy of Paramount Pictures.

For M. Night Shyamalan's big-screen adaptation of The Last Airbender, Industrial Light & Magic was tasked with the crucial job of simulating the signature bending of air, water, earth and fire. According to Pablo Helman, ILM's visual effects supervisor, "We had to basically design and figure out what bending is… And it was a big discovery that we took together as a journey… But I have to say that Night was able to work with the process really well and was able to make decisions, based on what he would see in animation, about what it was going to be."

However, Shyamalan was adamant that the elements look as naturalistic as possible. Water, as always, proved difficult because of its complicated nature, and zero-g water tests from NASA proved instrumental in figuring out how to convey the flow of water that the director requested. Air was a design challenge because of its abstract nature, so the team decided to draw on the available environment for a wispy look. Earth was not so difficult because it was only used in a couple of scenes and could be achieved with Fracture (Star Trek, Indiana Jones and the Kingdom of the Crystal Skull).

But fire was the most challenging, not least because the director disliked the look of real and CG fire alike. So ILM had to come up with something new to generate fully directable, photorealistic fire. Plume is unique in that it serves not only as a fully volumetric simulation engine but also as a renderer.

Taking advantage of the latest NVIDIA GPUs (the 5800s), Plume gave the VFX team an eight- to ten-fold speed increase in generating simulations and renders for hero, middle-ground and background fire effects. This meant that artists doing complex simulations that had previously taken eight hours (overnight) to compute a single iteration could now get six to eight iterations a day, fully rendered, providing great artistic flexibility and substantially shortening the time it takes to complete shots. (You can check out the SIGGRAPH Talk about Plume: July 25, 3:45 pm, Rm 515 AB, Los Angeles Convention Center.)

Plume gave ILM fully three-dimensional fire without having to save the simulation data to disk.

"We had a hybrid fire solution from Potter and a pretty good one (2D slices layered in depth), but it wasn't built for the kind of camera moves that we had," explains Craig Hammack, ILM's associate visual effects supervisor. "So we redeveloped the fire to be a full three-dimensional solution. In Potter, we got crunchy, crisp fire licks. Through Plume, we got more of a billowing look to the fire, so it could basically flow at the camera and we could orbit the camera around it. Because we ramp in and out of high-speed camera looks as we come around for some of these effects, it was important to have a three-dimensional look."

Plume was anchored in Zeno simply for the sources (as primitives or as particles). "Typically, we would run a very low particle simulation, which then drives the fluid and weaves particles as fuel or temperature or velocities," Hammack continues.
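That handoff can be pictured as splatting particle-carried quantities into the voxel grid before each solve. A minimal sketch with hypothetical names (Zeno's and Plume's actual interfaces aren't public):

```python
import numpy as np

def seed_sources(grid, positions, amounts, dx):
    """Deposit particle-carried quantities (fuel, temperature, ...) into a voxel grid.

    positions: (p, 3) particle positions in world units
    amounts:   (p,)   quantity carried by each particle
    """
    n = grid.shape[0]
    idx = np.floor(positions / dx).astype(int)       # world position -> cell index
    inside = np.all((idx >= 0) & (idx < n), axis=1)  # drop particles off the grid
    idx = idx[inside]
    # Accumulate, since several particles may land in the same cell.
    np.add.at(grid, (idx[:, 0], idx[:, 1], idx[:, 2]), amounts[inside])
    return grid
```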

Once these sources are placed into Plume, all of the simulation and rendering is done on the GPU. "We chose to build in the volume renderer, the Ray Marcher, so we could tie it more closely with the sim work," Hammack adds. "We wouldn't have to judge simulation data. Almost every run of simulation gets rendered at the same time and gets judged as full renders, which cuts down on the supervisor interpretation phase of it, so they don't have to extrapolate what it's going to look like. Another thing we discovered was that the sim on the GPU runs at interactive speeds, but saving all the grid cells and data creates a bottleneck. We quickly put in options where the simulation data is never saved; it's kept in the GPU so you don't have to deal with the I/O of getting that data written and read from the file formats. That's another reason why we built the renderer in."
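The ray marcher Hammack mentions isn't documented publicly, but the core of any emissive volume renderer is marching each camera ray through the density grid and compositing front to back. A simplified, hypothetical sketch for a single ray:

```python
import numpy as np

def march_ray(density, origin, direction, step=0.5, max_steps=512, absorption=1.0):
    """Accumulate emission along one ray through a density grid, front to back."""
    n = density.shape[0]
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    radiance, transmittance = 0.0, 1.0
    for _ in range(max_steps):
        i, j, k = np.floor(pos).astype(int)
        if 0 <= i < n and 0 <= j < n and 0 <= k < n:
            rho = density[i, j, k]
            alpha = 1.0 - np.exp(-absorption * rho * step)  # opacity of this segment
            radiance += transmittance * alpha * rho         # treat density as emission
            transmittance *= 1.0 - alpha
            if transmittance < 1e-3:                        # ray is effectively opaque
                break
        pos += d * step
    return radiance
```

Because the grid is already resident in GPU memory after the sim step, rendering this way incurs no disk I/O, which is exactly the bottleneck Hammack describes designing around.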

The software proved so flexible, in fact, that it was also used to generate air clouds, smoke and airbending effects.

"We did experimentations of doing a contrail look where it pulls vapor out of the air and distortion looks where it warps light as he's bending the air," Hammack suggests. "Night shied away from all that as being too fantastic and too difficult for an audience to understand. But pulling it from his own environment has its own challenges because he's not always in dusty or snowy environments. You have play like he's pulling dust out of cracks of things. It still leaves you with multiple looks with the air to deal with.

Plume was not yet ready to tackle water simulation, so ILM instead invested in technology for the directability of water and applied animation on top of the simulation.

"The only addition for the use of fracturing technology was we chose to use Plume again for all the dust work. And that was a big win for us when we had interaction between the elements because the fire could then affect the dust as true simulation -- and the same with air and fire.

"For water, we invested a little bit of technology in the directability of it. With multiple scales of water and very long, good looks, we had to refine those tools and development in the look. For a water ball coming out of the ocean and floating right in front of you, there's so much complexity that goes into the light interaction or surface or interior detail or tightness of the refraction or if it's motion blurred. It's a bit of a puzzle to sort out all the different water looks. In some cases, we went with RenderMan and in other cases mental ray. Whenever it made sense, we tried to go through an animation stage first for all of the elements as a visualization, timing stage that we could get in front of Night and get some buy offs. It's always like an exploration of failure before you get to the right point for each one. For the water, in some cases, it was a tentacle shape that the animators could drag around that we used for timing."

And how close are they to using Plume for water?

"

GPU acceleration is not quite ready for water but we're working on it," Hammack offers. "With fire, you don't really have to worry about mass and momentum. It's very gaseous and you can simplify your simulations. With water, you have to worry about the surface and mass and momentum. And with the limited memory of the GPU, you have to think about how you can represent all that data and fit it all onto the GPU card to get enough resolution to make it worth our while. You can quickly dump data back and forth between the GPU and the CPU without having to write it, but for something as complex as dealing with water surfaces and the effects that come off of it, you're going to want some stages in there of saved data."
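Hammack's memory concern is easy to quantify: a dense grid's footprint grows with the cube of its resolution. A back-of-the-envelope check (the channel count is an assumption, not Plume's actual layout, and it presumes the "5800s" above are 4 GB Quadro FX 5800-class cards):

```python
def grid_gib(n, channels=5, bytes_per_value=4):
    """Memory for a dense n^3 float grid, e.g. velocity (3) + temperature + fuel."""
    return n ** 3 * channels * bytes_per_value / 2 ** 30

for n in (128, 256, 512):
    print(f"{n}^3 grid: {grid_gib(n):.2f} GiB")
# 128^3: 0.04 GiB, 256^3: 0.31 GiB, 512^3: 2.50 GiB -- the last already a
# large share of an assumed 4 GB card, before adding a water surface,
# pressure-solve scratch space or renderer buffers.
```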

Bill Desowitz is senior editor of AWN & VFXWorld.
