Harry Potter and the Half-Blood Prince contains spectacular new fire and water sim from ILM, which Bill Desowitz uncovers along with how the eerie Inferi were animated.
As part of director David Yates' mandate for a more naturalistic Harry Potter, Half-Blood Prince contains some extraordinarily good fire and water sim work by Industrial Light & Magic, which also animated the Inferi creatures inside the seaside cave guarding Voldemort's Horcrux, the device that stores a portion of his soul and allows him to remain immortal.
In fact, for the first time in the franchise, ILM was tasked with creating the vfx for an entire sequence: the crucial one in which Harry (Daniel Radcliffe) and Dumbledore (Michael Gambon) travel to the seaside cave to retrieve the Horcrux in order to defeat the Dark Lord. Inside the cave, Harry is forced to make Dumbledore drink a foul, mind-altering liquid that hides the Horcrux. Though weakened by the liquid, Dumbledore defends them from a horde of Inferi by conjuring a fiery tornado, and returns them back to Hogwarts' Astronomy Tower.
"In the end, the most difficult and interesting work for us was the water and fire simulation," Alexander recalls. "We had three shots external to the cave where Harry and Dumbledore are looking back toward the entrance and that was a fully CG ocean. They actually shot a helicopter plate of it, but the water wasn't wild enough for David Yates. So they asked us to give them a much stormier ocean. And once we go into the cave, it's dominated by a giant lake and the island that Harry and Dumbledore are going to is in the middle of that lake. All of the water is computer generated, even the water that Harry scoops out of the bowl that holds the Horcrux.
"We used our proprietary water engine for the other work, but we had never dealt with such a small scale as the water in the bowl. We found that a lot of the settings within the engine had to be changed. The engine behaved very differently, and it was an interesting discovery to see the differences between handling a cup of water vs. the Maelstrom or the Poseidon storm."
But it's the fire that required new technology because of its scale and slithering quality. "We needed to turn it into a giant tornado, so we had to come up with a full CG solution for greater control," Alexander adds. "We started the process with two methods: brute force and another path led by [TD] Chris Horvath. After a couple of months of development, we realized that Chris' [finesse] approach was going to give us the detail and the look we needed for a realistic result. The development process took about eight months, and we had a four-person fire crew working on it, with Chris leading the group.
"The fire is based on two simulations: an initial one with a very low particle count that acts as the fuel, as if you're shooting gas into the air and then lighting it. The low count means the simulations can be turned around very quickly, with multiple iterations within an hour. Willi Geiger set those simulations up, and they basically look like a tornado spinning around.
"And once we have that particle simulation, it goes into Chris Horvath's tool, which is a hardware-accelerated approach using the GPU. We had a small GPU farm of 30 NVIDIA cards, and the next phase takes the fuel and burns it, and you get super high detail out of it. The simulation effectively says, 'I'm fire, I'm burning, I have this much buoyancy, this much heat, this much smoke.' These parameters are somewhat intuitive to set. We get the extra detail by running in two dimensions rather than three, so it runs a lot faster, and you can do very high-resolution simulations. We basically slice the volume into 2D slices and composite them together, so we're putting our time and money into screen space and not depth. We could turn this around in an afternoon. It's turned out to be pretty versatile, and other shows are starting to pick it up."
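The two-stage pipeline Alexander describes, a coarse fuel simulation followed by a high-resolution combustion pass on 2D slices, can be illustrated in rough outline. The sketch below is purely hypothetical (the function names, grid resolution and rate constants are invented for illustration, and it runs on the CPU rather than the GPU), but it shows the same idea: rasterize fuel particles into a screen-space slice, then burn the fuel into heat with simple buoyancy and cooling.

```python
import numpy as np

def splat_fuel(particles, res):
    """Rasterize (x, y, amount) fuel particles into one 2D slice grid.
    Coordinates are normalized to [0, 1); row 0 is the top of the slice."""
    grid = np.zeros((res, res))
    for x, y, amount in particles:
        i, j = int(y * res), int(x * res)
        if 0 <= i < res and 0 <= j < res:
            grid[i, j] += amount
    return grid

def burn_slice(fuel, heat, burn_rate=0.5, buoyancy=1, cooling=0.05):
    """One combustion step on a single 2D slice:
    convert a fraction of the fuel into heat, shift the heat field
    upward to mimic buoyancy, then apply radiative cooling."""
    reacted = fuel * burn_rate            # fuel consumed this step
    fuel = fuel - reacted
    heat = heat + reacted
    heat = np.roll(heat, -buoyancy, axis=0)  # hot gas rises
    heat[-buoyancy:] = 0.0                   # nothing wraps in from below
    heat *= 1.0 - cooling
    return fuel, heat

# Demo: one fuel particle in the middle of an 8x8 slice, burned for 3 steps.
fuel = splat_fuel([(0.5, 0.5, 1.0)], res=8)
heat = np.zeros_like(fuel)
for _ in range(3):
    fuel, heat = burn_slice(fuel, heat)
```

In the production version, per Alexander, many such slices would be simulated at high resolution on the GPU and composited together, trading depth resolution for screen-space detail.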
Horvath and Geiger will present a technical paper on the "Directable, High-Resolution Simulation of Fire on the GPU" at SIGGRAPH 2009 in New Orleans, Aug. 4 in Hall E 3.
Not surprisingly, the Inferi were very tricky. According to Tim Burke, the overall visual effects supervisor, "David was pretty keen on giving them an unnerving presence but didn't want in any way to create a zombie type of character. He wanted a human character you could empathize with, one that even evoked sadness. At the end of the day, these are dead souls. We came up with a design that went beyond what was possible with an emaciated human: very skinny, with all the bones and ribcage showing, and gray skin from spending all their time underwater. We developed this concept with ILM, who built different variations of heads in ZBrush so we could populate this world with hundreds of these characters. We also started looking at how they moved; after some movement studies, and realizing we didn't have any reference material, we turned it over to ILM's animators, who came up with fairly normal human movement."
Chu acknowledges that they were given great direction on skin texture, and that Aaron McBride, the visual effects art director, was instrumental in coming up with different ideas to make the Inferi look as human as possible, so that viewers could feel an emotional connection to them as victims.
"When they first appear, David wanted it to seem that the Inferi were not attacking Harry but welcoming him," Chu explains. "We explored spider-like movement and slow walking, and decided on a slow encroachment onto the island, since that was in keeping with [the slumbering state] they were in. We used Maya for animation and developed some new custom simulation for the water that reacts to the Inferi above and below the surface.
"We built about a dozen or so Inferi with different facial shapes and heads and we were able to create variations. We used our in-house tool, Zeno, to do the facial work, with lots of creature development work for the skin, muscle and hair."
When the Inferi emerge from underwater, Chu admits, it's going to be one of those "OMG, what the hell? I thought it was bad above water" moments.
Bill Desowitz is senior editor of AWN and VFXWorld.