
'Harry Potter and the Goblet of Fire': Part 2 — Wizard Competitions, Death Eaters and Voldemort

In the final part of his in-depth report on the fourth Harry Potter blockbuster, Alain Bielik discusses several of the eye-opening vfx performed by various companies.

The Goblet's fire-like effects were created in CG by Rising Sun Pictures, while the dragon sequence was produced by ILM. All images courtesy of Warner Bros. Pictures. © 2005 Warner Bros. Ent. Inc. Harry Potter publishing rights © J.K.R.

Once the Goblet of Fire springs into action, things start to turn bad for Harry Potter, as he is selected to participate in the Triwizard Tournament. The Goblet's fire-like effects were created in CG by Rising Sun Pictures, which delivered more than 50 shots. "Using Maya Fluids, we first built up a library of different flames that looked cool," visual effects supervisor Tony Clark says. "We then defined a basic library of 3D elements and applied the different looks to a nominated master shot, the one where the Weasley twins are thrown out of the Age Line. This shot contained around 40 different fire elements. Since those were created in 3D voxel space, they consumed a large amount of memory. As we grew the size of the elements, we found that the level of detail diminished dramatically and we would lose the nice little ragged bits around the edges of the flames." Digital effects supervisor Will Gammon had to pour in all sorts of 2D techniques to build up the level of detail, combining multiple elements, noise passes, turbulence and such. The scenes were rendered either in Maya for the fire elements or in 3Delight for the Age Line elements.
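
As a rough sketch only, and not Rising Sun's actual setup, the 2D build-up Clark describes (layering a soft fluid render with noise and turbulence passes to restore ragged edge detail) might look something like this in Python with numpy; the pass names and blend math are invented for illustration:

    import numpy as np

    def ragged_flame(base, noise, turbulence, edge_gain=0.6):
        """Layer a soft CG flame render with noise and turbulence passes
        to restore the ragged edge detail lost at large voxel sizes.
        All inputs are float images in [0, 1] of the same shape."""
        # The bright core should stay intact; break-up is wanted in the
        # soft falloff region, which peaks at mid-intensity.
        edge = 4.0 * base * (1.0 - base)
        # Modulate the edge region with turbulence-weighted noise.
        breakup = noise * (0.5 + 0.5 * turbulence)
        detailed = base + edge_gain * edge * (breakup - 0.5)
        return np.clip(detailed, 0.0, 1.0)

    # toy passes standing in for rendered fire elements
    rng = np.random.default_rng(0)
    flame = ragged_flame(rng.random((64, 64)), rng.random((64, 64)),
                         rng.random((64, 64)))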

First Task: Tame the Dragon

Four contenders are chosen by the Goblet of Fire, including Harry, and the action moves on to the first of their three tasks: retrieve a golden egg from a nest guarded by a fierce dragon. Comprising 140 shots, the sequence was awarded to Industrial Light & Magic (ILM), with Cinesite creating 16 shots featuring different scale models of the arena. "For the dragon, we started with a cyberscan of a quarter-scale maquette provided by the production art department," recalls visual effects supervisor Tim Alexander. "This formed the basis of our CG model. The textures were still photographs of the maquette projected onto the model and associated with additional maps. At one point, we found out that we couldn't get the exact right shape on the tail using displacement maps alone. So, we went back and sculpted extra details on the model. Altogether, the paint job lasted about 12 weeks."

On set, the egg was filmed as a practical glass prop, but Animal Logic animated 3D glass bubbles and lit them with a realistic glass shader, providing an ethereal look.

The fire emitted by the creature was created via CG animation with practical elements integrated into it. "We spent months of R&D on the fire, using volumetric rendering and particle animation to create it, but, still, there were shots in which something wasn't quite right about it. We knew something was missing even though we couldn't put our finger on it. It could be too furry, too particly… So, we ended up shooting practical fire elements and mixing them in with the animation." Throughout the sequence, Harry Potter is either a bluescreen element of Daniel Radcliffe or a CG double, the latter sometimes appearing full screen. As for the background, it was a combination of a CG environment (the lower part of the arena) and live-action plates (the spectators in the stands).
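
The mix of practical and CG fire isn't detailed further, but the standard way to lay a photographed flame element over a rendered one without darkening either is a screen blend; the short sketch below, with toy data in place of real elements, only illustrates that operation and is not ILM's compositing setup:

    import numpy as np

    def screen(cg_fire, practical_fire):
        """Screen-blend a practical fire element over a CG fire render.
        Bright detail from either input survives and nothing gets darker,
        which suits additive phenomena like flames. Inputs are linear
        float images in [0, 1]."""
        return 1.0 - (1.0 - cg_fire) * (1.0 - practical_fire)

    rng = np.random.default_rng(1)
    cg = rng.random((4, 4, 3))        # stand-in for the CG fire render
    plate = rng.random((4, 4, 3))     # stand-in for the shot fire element
    mixed = screen(cg, plate)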

And the winner of this first task is (surprise!) Harry Potter. The young contender takes the egg to Hogwarts and tries to figure out its meaning, as the artifact is actually a clue leading to the second task. On set, the egg was filmed as a practical glass prop, but director Mike Newell wanted it to look more ethereal than could be achieved in camera. To this end, overall visual effects supervisor Jim Mitchell commissioned Animal Logic and in-house supervisor Kirsty Millar. Daniel Marum, 3D lead, came up with a basic approach for the effect: "I animated 3D glass bubbles and lit them with a realistic glass shader. During the R&D stage, we were testing physically-based light simulations to generate images of volumetric caustics. The drawback was that the render times were astronomical, because they were a physically-based simulation of light traveling through a number of complex volumes. To solve this, I wrote a tool in MEL that built strips of geometry from the surface of the ribbons. The strips were rigged to be easily animated, and then shaded to look like rays of light." Most of the animation was carried out in Maya, rendered in mental ray and composited in Inferno or Shake.
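
Marum's MEL tool isn't reproduced anywhere, but the idea he describes (building thin strips of geometry off a ribbon surface so they can be rigged and shaded as rays of light, rather than paying for volumetric caustics) can be sketched in plain Python; the sampling, strip width and ray length below are all assumptions:

    import numpy as np

    def build_ray_strips(points, normals, ray_length=5.0, width=0.05):
        """For each sample point on a ribbon surface, build a thin quad
        strip running along the surface normal. Shaded with a glowing
        material, such strips read as rays of light at a fraction of the
        cost of a volumetric caustic render.
        Returns an array of shape (n, 4, 3): four corners per strip."""
        points = np.asarray(points, dtype=float)
        normals = np.asarray(normals, dtype=float)
        normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
        # pick an arbitrary side vector perpendicular to each normal
        up = np.tile([0.0, 1.0, 0.0], (len(points), 1))
        side = np.cross(normals, up)
        degenerate = np.linalg.norm(side, axis=1) < 1e-6
        side[degenerate] = [1.0, 0.0, 0.0]
        side = side / np.linalg.norm(side, axis=1, keepdims=True)
        near0 = points - 0.5 * width * side
        near1 = points + 0.5 * width * side
        far0 = near0 + ray_length * normals
        far1 = near1 + ray_length * normals
        return np.stack([near0, near1, far1, far0], axis=1)

    # toy ribbon: a line of sample points with normals fanning outward
    pts = np.array([[x, 0.0, 0.0] for x in np.linspace(0, 1, 8)])
    nrm = np.array([[0.0, 1.0, x] for x in np.linspace(-0.3, 0.3, 8)])
    strips = build_ray_strips(pts, nrm)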

Framestore CFC produced more than 200 shots, including the arrival of the Beauxbatons Academy members at Hogwarts in a giant carriage pulled by flying horses. Framestore re-used the wings of its hippogriff CG model to build the Pegasus models.

Second Task: Swimming with Mermaids

Harry finds out that his next task consists of retrieving a friend from the bottom of the Hogwarts lake. The huge underwater sequence was realized at Framestore CFC under the supervision of Tim Webber. The London-based company produced more than 200 shots, including the grand arrival of the Beauxbatons Academy members at Hogwarts in a giant carriage pulled by flying horses. "Since the sequence comprised only six or seven shots, we didn't have the time for any R&D," Webber notes. "We re-used part of our hippogriff CG model, the wings in particular, to build the Pegasus models." Framestore created all its shots using Maya for modeling and animation, RenderMan and proprietary tools for rendering, and Shake and Inferno for compositing.

The underwater sequence presented multiple challenges. Framestore first had to create a completely believable CG environment made of jagged rocks, weird plants and murky water (this is not a tourist destination) and populate it with two different types of creatures: the mermaids and the grindylows. "The underwater environment was made as murky and scary as possible, except for long shots in which we had to see more," Webber explains. "We had an awful lot of layers: particles, pieces floating around, rays of light created with volumetric renders… One of the major challenges was to generate the plants and the leaves. In some shots, we had several thousand leaves floating around. In order to generate these shots, we developed new code, derived from our hair system, to control the movement of the leaves with currents and turbulence. It enabled us to create the animation automatically, while still keeping control of how things moved. For shots in which Harry or the mermaids swim among the plants, we generated the interactions by using collision objects."
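
Framestore's leaf code is proprietary and derived from its hair system, so the sketch below only illustrates the general mechanism the quote describes: thousands of leaves advected through a current plus a turbulence term, with a simple spherical collision object standing in for a swimmer. The field functions and constants are invented:

    import numpy as np

    def advect_leaves(positions, t, dt=0.04,
                      current=(0.2, 0.0, 0.05), turb_amp=0.15,
                      collider_center=None, collider_radius=1.0):
        """Move a cloud of leaves one step through a steady current plus
        a cheap pseudo-turbulence term, pushing any leaf inside a
        spherical collision object back to its surface."""
        p = np.asarray(positions, dtype=float)
        # steady current, identical for every leaf
        vel = np.tile(np.asarray(current, dtype=float), (len(p), 1))
        # pseudo-turbulence: layered sines of position and time stand in
        # for a proper noise field, giving each leaf its own wander
        vel += turb_amp * np.stack([
            np.sin(1.7 * p[:, 1] + 0.9 * t),
            np.sin(2.3 * p[:, 2] + 1.3 * t),
            np.sin(1.1 * p[:, 0] + 0.7 * t)], axis=1)
        p = p + dt * vel
        # collision object: project leaves inside the sphere onto it
        if collider_center is not None:
            offset = p - np.asarray(collider_center, dtype=float)
            dist = np.maximum(np.linalg.norm(offset, axis=1, keepdims=True), 1e-9)
            inside = dist < collider_radius
            surface = np.asarray(collider_center) + offset / dist * collider_radius
            p = np.where(inside, surface, p)
        return p

    # a few thousand leaves drifting past a swimmer proxy at the origin
    leaves = np.random.default_rng(3).uniform(-5, 5, size=(5000, 3))
    for frame in range(24):
        leaves = advect_leaves(leaves, t=frame / 24.0,
                               collider_center=(0.0, 0.0, 0.0))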

The sequence features underwater footage of Radcliffe shot in a greenscreen tank, while the more extreme movements were animated with a CG double. The mermaids and the grindylows were digital characters created by CG supervisor David Lomax's 3D department. Both creatures presented a similar gelatinous, translucent look, which was worked out in specific shaders. "This aspect of the creatures was tricky to pull off, as something slimy no longer looks slimy underwater," Webber remarks. "The mermaids' bodies were hand animated, but the hair and the fins were animated procedurally. We added a lot of control tools to make sure the hair didn't flatten whenever a mermaid was swimming fast. It included generating extra turbulence to keep it looking snake-like and sinuous. We met a great challenge with the grindylows, as these octopus-like creatures come in swarms. For shots in which hundreds of them chase Harry, we developed a piece of software called Choreographer that allowed us to use hand animation while manipulating huge numbers of models. It removes the reliance on crowd simulation software for large quantities of creatures and keeps absolute control of everything."
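
Choreographer itself is in-house software, so the following is only a guess at the pattern it implies: a handful of hand-animated swim cycles reused across hundreds of instanced grindylows, each with its own placement, playback speed and time offset, so the swarm stays keyframe-driven rather than simulated. All names and structures below are assumptions:

    import random
    from dataclasses import dataclass

    @dataclass
    class SwimCycle:
        name: str
        length: int          # cycle length in frames

    @dataclass
    class GrindylowInstance:
        cycle: SwimCycle
        offset: tuple        # world-space position offset
        time_shift: int      # per-instance phase into the cycle
        speed: float         # playback speed multiplier

    def cast_swarm(cycles, count, seed=0):
        """Assign each creature one of a handful of hand-animated cycles,
        a position, a time offset and a speed, so a swarm of hundreds is
        still entirely keyframe-driven."""
        rng = random.Random(seed)
        swarm = []
        for _ in range(count):
            swarm.append(GrindylowInstance(
                cycle=rng.choice(cycles),
                offset=(rng.uniform(-10, 10), rng.uniform(-5, 0), rng.uniform(-10, 10)),
                time_shift=rng.randrange(0, 48),
                speed=rng.uniform(0.8, 1.2)))
        return swarm

    def cycle_frame(inst, frame):
        """Which frame of the source animation this instance shows now."""
        return int(inst.speed * frame + inst.time_shift) % inst.cycle.length

    cycles = [SwimCycle("chase_fast", 36), SwimCycle("chase_weave", 48)]
    swarm = cast_swarm(cycles, count=300)
    sample = [(g.cycle.name, cycle_frame(g, frame=100)) for g in swarm[:3]]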

Framestore created a CG environment made of jagged rocks, weird plants and murky water. The sequence features underwater footage of Radcliffe shot in a greenscreen tank, while the more extreme movements were animated with a CG double.

Third Task: Walking the Maze

Harry makes it out of the lake alive and begins to focus on his third and final task: traverse a dangerous maze and find the Triwizard Cup. The maze sequence was assigned to The Moving Picture Co. (MPC), with visual effects supervisor Ben Shepherd overseeing some 216 shots. The live-action plates were shot on a maze set with mechanical plants rigged by special effects coordinator John Richardson. "We ended up replacing all the moving practical hedges we were given, mostly to have more control," CG supervisor Nicolas Aithadi explains. "In the end, we were receiving shots with just the kids on bluescreen. We had to recreate everything behind and in front of them." The many challenges of the sequence included creating hundreds of thousands of CG leaves in a manageable way. For every shot, the artists would first hand-animate the hedge corridors, then each hedge's trunk and then the branches, a task supervised by Ferran Domenech. The last touch was given by Cantilever, a piece of software that took care of the secondary dynamic animations, down to the sprigs; it was initially developed for the Whomping Willow sequence in Harry Potter and the Chamber of Secrets.
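
Cantilever is MPC's own tool, so the sketch below only shows the kind of secondary dynamic animation it handles: each sprig follows its parent branch through a damped spring, adding overshoot and settle on top of the hand animation. The 1D setup and the spring constants are purely illustrative:

    def secondary_motion(parent_positions, stiffness=30.0, damping=4.0, dt=1.0 / 24.0):
        """Given the animated positions of a parent branch tip over time
        (one value per frame), return the positions of a child sprig that
        follows it through a damped spring, adding automatic follow-through."""
        x = parent_positions[0]      # the sprig starts where the parent is
        v = 0.0
        out = []
        for target in parent_positions:
            accel = stiffness * (target - x) - damping * v
            v += accel * dt          # semi-implicit Euler integration
            x += v * dt
            out.append(x)
        return out

    # the parent branch snaps 1 unit sideways at frame 10; the sprig lags,
    # overshoots and settles over the following frames
    parent = [0.0] * 10 + [1.0] * 30
    sprig = secondary_motion(parent)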

The mermaids and the grindylows are digital characters created by Framestore. Both creatures presented a similar gelatinous look, which was worked out in specific shaders. The challenge was to make something look slimy underwater.

Some of the widest shots were created with particle systems to represent the hedges in the distance, using impressionistic techniques to depict the leaves. In the most complicated shots, the camera traveled from inside the maze to high up in the sky, overlooking the maze in its full glory. "We had to start the shot with full-on modeled hedges and, in the course of the animation, blend them with different levels of detail to end up in the far distance with matte paintings. We had the high-resolution, hand-animation-friendly hedges for the extreme foreground. Then, we created a system using eight layers of procedurally animated hedges with eight layers of opacity mapping to create depth. For the mid-ground, we would use displacement maps on low-resolution hedges and, finally, for the extreme background, we used a matte painting or particles." MPC relied on Maya, CySlice and ZBrush for modeling, SyFlex for cloth simulation, RenderMan for rendering and Shake for compositing.
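
The blending itself isn't spelled out beyond the four representations, but a distance-driven cross-fade like the one sketched below captures the structure Aithadi describes; the thresholds and fade width are invented values:

    def band_weight(distance, near, far, fade=10.0):
        """1 inside [near, far], with a linear falloff of width `fade` outside."""
        if distance < near:
            return max(0.0, 1.0 - (near - distance) / fade)
        if distance > far:
            return max(0.0, 1.0 - (distance - far) / fade)
        return 1.0

    def hedge_lod_weights(distance):
        """Blend weights for the four hedge representations described in
        the article, cross-faded by camera distance so a move from inside
        the maze up to a full aerial view never pops between levels."""
        bands = {
            "hero hand-animated hedges": (0.0, 15.0),
            "eight opacity-mapped procedural layers": (15.0, 60.0),
            "displaced low-resolution hedges": (60.0, 200.0),
            "matte painting / particles": (200.0, float("inf")),
        }
        raw = {name: band_weight(distance, near, far)
               for name, (near, far) in bands.items()}
        total = sum(raw.values()) or 1.0
        return {name: w / total for name, w in raw.items()}

    print(hedge_lod_weights(5.0))     # all hero geometry
    print(hedge_lod_weights(58.0))    # procedural layers fading into displaced hedges
    print(hedge_lod_weights(500.0))   # matte painting only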

Voldemort sports an impossibly flat nose, an effect realized by MPC in some 90 shots. Tracking markers, the camera and Fiennes' basic head movements were match-moved or roto-animated with a 3D head model.

A Face-Off with Voldemort

Once the Triwizard Tournament is over, Harry faces an even more perilous challenge: meeting Voldemort (Ralph Fiennes) in person. The climactic sequence takes place in a graveyard and features visual effects realized by MPC and Double Negative. The two facilities worked together to develop a consistent look for the numerous set extensions. MPC produced a hero sky panorama, which Double Negative used as an element in its own shots. During the sequence, Harry is held captive by a living statue created by Double Negative. Using Maya, 3D artist Pawel Grochola had to solve the problem of animating a solid bronze statue without deforming its body to the point where it lost its rigid metallic quality and took on the properties of flexible rubber. The solution was to define points along the body to hinge the limbs, avoiding as much movement in the cloth as possible.
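
Grochola's rig is only described in outline; its principle, though (rotating whole limb segments rigidly about hinge pivots instead of deforming the mesh), can be sketched with Rodrigues' rotation formula. The joint layout below is a made-up example:

    import numpy as np

    def rotate_about_hinge(points, pivot, axis, angle_deg):
        """Rigidly rotate a set of vertices about a hinge pivot.
        Because every vertex of the segment gets the same rotation,
        the bronze keeps its hard-surface silhouette instead of
        stretching like rubber."""
        axis = np.asarray(axis, dtype=float)
        k = axis / np.linalg.norm(axis)
        angle = np.radians(angle_deg)
        p = np.asarray(points, dtype=float) - pivot
        # Rodrigues' rotation formula applied to every vertex at once
        rotated = (p * np.cos(angle)
                   + np.cross(k, p) * np.sin(angle)
                   + k * (p @ k)[:, None] * (1 - np.cos(angle)))
        return rotated + pivot

    # a crude "forearm" segment hinged at the elbow
    forearm = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [1.0, 0.0, 0.0]])
    elbow = np.array([0.0, 0.0, 0.0])
    bent = rotate_about_hinge(forearm, elbow, axis=[0, 0, 1], angle_deg=35.0)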

The action then focuses on Voldemort regaining his human form in a cauldron. The surface of the eerie liquid was created by Double Negative using complex fluid simulations augmented with practical smoke elements. MPC took over for the shot in which Voldemort grows from and out of the cauldron. "We divided the shot into separate elements: the cauldron, the cauldron smoke, the cloak smoke, the cloak, the fire, the slime, the drips and Voldemort," Aithadi relates. "Voldemort alone was quite complex, as the effect included skin, wet skin, bones, organs and muscles. We used a cyberscan of Fiennes as a base to model the baby Voldemort. Once we had the character in the start shape and end shape, we rigged them both using a scalable rig, as we not only had to animate the movement, but the scale change as well. To do this, we extracted shapes throughout the animation and re-sculpted them. Finally, we recreated an animation based on clean geometry. We also used a proprietary plug-in for Maya to animate the muscles rolling under the skin." The cloak was created with the SyFlex cloth simulator, while the fire was generated as a Maya fluid. "Every single piece that was rendered had different layers controlling every aspect of the shading, or fancier things like the temperature of the fire or the density of the smoke. The wide shot of the transformation contained around 13 elements, and each of them had between five and 20 layers." It was an overwhelming amount of data to deal with for compositing supervisor Charlie Henley and his 2D artists.
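
The scalable rig isn't documented beyond the quote, but its core requirement (driving both the morph from the start shape to the end shape and the overall scale change from one animation parameter) can be sketched as follows; the shapes and the easing curve are placeholders:

    import numpy as np

    def grow_voldemort(start_shape, end_shape, t, start_scale=0.3, end_scale=1.0):
        """Return the character mesh at normalised time t in [0, 1]:
        vertex positions morph from the start shape to the end shape
        while the whole figure scales up, so movement and growth are
        driven by the same parameter."""
        ease = t * t * (3.0 - 2.0 * t)                 # smoothstep easing
        shape = (1.0 - ease) * start_shape + ease * end_shape
        scale = (1.0 - ease) * start_scale + ease * end_scale
        return scale * shape

    # toy "meshes": the same vertex count in two different poses
    start = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.2, 1.5, 0.0]])
    end = np.array([[0.0, 0.0, 0.0], [0.0, 1.2, 0.0], [0.5, 1.9, 0.1]])
    frames = [grow_voldemort(start, end, t) for t in np.linspace(0.0, 1.0, 10)]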

From this point on, Voldemort is played by Fiennes sporting an impossibly flat nose, an effect realized by MPC in some 90 shots. Tracking markers, the camera and Fiennes' basic head movements were match-moved or roto-animated with a 3D head model. "We spent a month developing a skin shader and ended up using as many as 15 different textures for the nose, from epidermis and dermis to veins to sweaty specular…" Aithadi continues. "We also used animated displacement to generate wrinkles and fat movement under the skin. Even though we were replacing only a small portion of Fiennes' face (from below the eyes to above the mouth), we rendered the entire face. We wanted to make sure that we had enough surface to blend with the real skin." Voldemort's skin was rendered out using sub-surface scattering.
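
The final blend back onto the photography isn't described beyond the region used, but the basic comp operation (rendering the whole face, then keeping only a soft-edged area around the nose) can be sketched like this; the mask construction and the coordinates are assumptions:

    import numpy as np

    def blend_cg_nose(plate, cg_face, center, radius, softness=20.0):
        """Composite a full CG face render over the live-action plate,
        but only inside a soft circular mask around the nose, so the
        digital skin has plenty of surrounding surface to blend into
        the real skin."""
        h, w = plate.shape[:2]
        ys, xs = np.mgrid[0:h, 0:w]
        dist = np.sqrt((xs - center[0]) ** 2 + (ys - center[1]) ** 2)
        # 1 inside the radius, fading to 0 over `softness` pixels
        mask = np.clip(1.0 - (dist - radius) / softness, 0.0, 1.0)[..., None]
        return mask * cg_face + (1.0 - mask) * plate

    # toy images standing in for the plate and the rendered face
    plate = np.zeros((240, 320, 3))
    cg = np.ones((240, 320, 3))
    out = blend_cg_nose(plate, cg, center=(160, 130), radius=40)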

Double Negative visualized the arrival of the Death Eaters and also the return of the Dark Mark, which takes the shape of a snake.

The sequence comes to an end with the return of the Dark Mark and the arrival of the Death Eaters, Voldemort's creepy minions. Double Negative visualized the Dark Mark as a huge cloudy form in the sky. Taking the shape of a snake, the cloud was completed as a combination of geometry, fluid simulation and particle solutions, as it had to lunge toward the camera to reveal the Death Eaters. "Using our proprietary fluid software, DNB, we designed the smoke to twist and vortex towards the ground and, on impact, pull in a reverse direction to assume the mass of the Death Eater," visual effects producer Dom Sidoli observes. "Blending techniques were then designed in 3D and 2D to make the final transition between the CG renders of the Death Eater character, the fluid simulations and, in some cases, the actor element."
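
DNB is Double Negative's proprietary solver, so the sketch below is only a cartoon of the motion Sidoli describes: particles spiral down around a vertical axis and, once they hit the ground, their pull reverses so they gather into the Death Eater's volume. All constants are invented:

    import numpy as np

    def step_smoke(p, grounded, dt=0.04, swirl=2.0, pull=1.5, target=(0.0, 1.5, 0.0)):
        """Advance smoke particles one step: before impact they vortex
        down around the y-axis; once a particle touches the ground its
        motion reverses and it is drawn toward the character's body
        position instead."""
        x, y, z = p[:, 0], p[:, 1], p[:, 2]
        # tangential swirl around the vertical axis plus a downward pull
        vel = np.stack([-swirl * z, -pull * np.ones_like(y), swirl * x], axis=1)
        # grounded particles reverse: move toward the character volume
        to_target = np.asarray(target) - p
        vel = np.where(grounded[:, None], 2.0 * to_target, vel)
        p = p + dt * vel
        grounded = grounded | (p[:, 1] <= 0.0)
        return p, grounded

    rng = np.random.default_rng(4)
    particles = rng.uniform([-1, 4, -1], [1, 6, 1], size=(2000, 3))
    grounded = np.zeros(len(particles), dtype=bool)
    for _ in range(100):
        particles, grounded = step_smoke(particles, grounded)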

It's a Kind of Magic

For all the artists, working on a Harry Potter movie is always a special assignment, especially with the vfx bar raised every time. "I think there's an audience that's been growing up with the Potter films," concludes Animal Logic's Millar. "And the visuals just keep getting more sophisticated in order to match audience expectations with each new episode. To me, aside from all the hugely visual effects-intensive sequences in the film, it's all the subtle effects, like the Egg, that really bring the story to life so successfully. It's those added little touches that make the Potter films so magical, if you'll pardon the pun."

Alain Bielik is the founder and special effects editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications and occasionally to Cinéfex. He recently organized a major effects exhibition at the Musée International de la Miniature in Lyon, France.
