'Harry Potter and the Goblet of Fire': Part 1 — Hogwarts, Magic Spells and Miscellaneous Wizardry

In the first part of his in-depth report on the fourth Harry Potter blockbuster, Alain Bielik discusses several of the eye-opening vfx performed by various companies.

Harry Potter and friends magically travel to the Quidditch World Cup courtesy of Double Negative. All images courtesy of Warner Bros. Pictures. © 2005 Warner Bros. Ent. Inc. — Harry Potter publishing rights © J.K.R.

Since his very first year at Hogwarts School of Witchcraft and Wizardry, Harry Potter has vanquished a giant troll, arch-nemesis Voldemort, Aragog the spider, the mighty Basilisk, a deadly werewolf and the terrifying Dementors. He has also dealt with that pesky little elf, Dobby. Yet, in Warner Bros.’ Harry Potter and the Goblet of Fire, Harry faces his most dangerous challenge to date: representing his school in the Triwizard Tournament, a competition in which the players participate in a series of hazardous contests…

Since Mike Newell (Four Weddings and a Funeral) is a director with little experience in vfx, the task of guiding him through the arcane world of plug-ins and shaders fell to overall visual effects supervisor Jim Mitchell, a seasoned veteran of the saga. The effects load was spread among nine vendors in four different countries, which could have created serious organizational problems. In order to review the work, Mitchell and visual effects producer Theresa Corrao used a variety of remote review tools provided by the vendors. The London-based production visual effects unit could thus keep track of the hundreds of shots being completed in locations as distant as Australia and California.

Concepts for the frenetic portkey effect included wormholes and inverted tornadoes. Double Negative decided the environment would vortex into a giant tornado and leave the characters hanging in mid-air, as their new destination forms around them.

Vortex Transportation

Early in the movie, Harry Potter magically travels with the Weasleys to the Quidditch World Cup by using a “portkey.” Lead vendor Double Negative, which produced more than 500 shots, tackled the sequence, led by visual effects supervisor Mark Michaels and CG supervisor Richard Clarke. Early concepts for the frenetic portkey effect included wormholes and inverted tornadoes until the artists arrived at the idea that the environment would vortex into a giant tornado, leaving the characters hanging in mid-air as their new destination takes form around them.

“We first set about isolating one ‘hero’ shot on which the surrounding shots would be based,” explains Double Negative’s visual effects producer Dom Sidoli. “In this master shot, the camera rotated 360º around the characters while the environment transformed around them. A giant panoramic matte painting was created using our proprietary stitching software, Stig. This gave us a 360º cylindrical texture for the distant fields and sky detail. Field and grass textures were painted for the close-up hill with CG grass rendered to cover the hill. The hill and surrounding landscape were then converted to particle systems, or softbodies, which were simulated to create the vortex effect. This was then turned into a geometry cache, so we could add finer detail and deformations to achieve the twisting tornado-like effect. Further layers of debris, leaves and clouds were rendered as particles to add volume to the environment. Finally, the composited shot was re-timed using Kronos, which created a more visually abstract result.” Depending on the shot, the characters were either the real actors shot on wires in front of a greenscreen, or digital doubles built from cyberscans with clothes animated via a cloth simulator.
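For readers curious about the mechanics, here is a minimal, purely illustrative Python sketch of the kind of vortex advection described above: landscape points orbiting a vertical axis while drifting inward and rising. Double Negative's softbody and particle tools are proprietary, so every function name and constant below is an assumption, not their pipeline.

```python
import math
import random

def vortex_step(points, centre=(0.0, 0.0), swirl=1.2, lift=0.4, dt=0.04):
    """Advect landscape points around a vertical vortex axis.

    Each point is (x, y, z) with y up. Points orbit the axis through
    `centre` in the xz-plane, drift slightly inward and rise, roughly
    the twisting, tornado-like motion described in the article.
    """
    advected = []
    for x, y, z in points:
        dx, dz = x - centre[0], z - centre[1]
        radius = math.hypot(dx, dz)
        angle = math.atan2(dz, dx)
        # Swirl faster near the axis, pull inward, and lift upward.
        angle += swirl * dt / (radius + 0.1)
        radius *= 1.0 - 0.05 * dt
        y += lift * dt * (1.0 + random.uniform(-0.1, 0.1))  # a little turbulence
        advected.append((centre[0] + radius * math.cos(angle),
                         y,
                         centre[1] + radius * math.sin(angle)))
    return advected

# Seed a ring of "hillside" points and run a few steps of the vortex.
points = [(math.cos(a) * 5.0, 0.0, math.sin(a) * 5.0)
          for a in (i * 2 * math.pi / 32 for i in range(32))]
for _ in range(100):
    points = vortex_step(points)
```

In a production setting this kind of motion would, as Sidoli notes, be baked to a geometry cache so that finer deformation and debris layers could be added on top.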

The Quidditch World Cup

From the moment the characters arrive at the World Cup venue, the sequence features visual effects provided by Industrial Light & Magic (ILM), which created 250 shots for the movie. Since each of the three previous installments of the series had featured a Quidditch match, it was decided early on that a fourth one was not necessary. This time around, the sequence would focus on the spectators before the start of a game. ILM’s brief was to create a giant stadium filled with 80,000 digital characters. ILM first built a CG stadium based on artwork provided by the production. Then, the CG department, led by Doug Smythe and Robert Weaver, set out to populate it with a convincing crowd. The spectators were created from a single model with 20 different built-in outfits. For each digital character, the crowd system randomly chose a selection of the various parts, and at render time, the pieces of the outfit that were not selected for the shot were turned off. For textures, ILM used still photographs of extras in costume that were mapped onto the models.
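As a rough illustration of that outfit-randomization scheme, the hedged Python sketch below builds one base model carrying every clothing variant and simply records, per crowd agent, which variants stay on and which would be switched off at render time. The slot names, variant counts and per-agent seeding are assumptions for the example, not ILM's crowd system.

```python
import random

# Every outfit slot carries 20 built-in variants on the single base model.
OUTFIT_PARTS = {
    "hat":   [f"hat_{i}" for i in range(20)],
    "scarf": [f"scarf_{i}" for i in range(20)],
    "coat":  [f"coat_{i}" for i in range(20)],
}

def assign_outfit(agent_id, seed=0):
    """Pick one variant per clothing slot for a given spectator."""
    rng = random.Random(seed * 1_000_003 + agent_id)  # deterministic per agent
    chosen = {slot: rng.choice(variants) for slot, variants in OUTFIT_PARTS.items()}
    # Everything not chosen would be flagged invisible at render time.
    hidden = [part for slot, variants in OUTFIT_PARTS.items()
              for part in variants if part != chosen[slot]]
    return chosen, hidden

# Dress the whole 80,000-seat stadium.
outfits = [assign_outfit(agent)[0] for agent in range(80_000)]
```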

Meanwhile, another unit developed the approach that would eventually bring this crowd to life. “We knew we needed a system that was going to be pretty much automated,” recounts in-house visual effects supervisor Tim Alexander. “We wanted the spectators to be able to do certain things, like turn their heads when a player flew by. In order to do that, we motion captured four people performing all the possible actions of a spectator before and during a match. The usual way to create a cycle is to capture short, specific actions and then use animation blending to get the transitions. The problem with this approach is that you need good transition points between cycles, which can be tricky on large crowd simulations. On this project, rather than trying to blend multiple pieces of motion capture in and out of each other, we just shot continuous actions. So, in every shot, each person in the stadium is actually one continuous motion capture. This approach saved us a lot of time.”
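The continuous-capture idea can be illustrated with a small, assumption-laden Python sketch: every spectator is handed one long take plus a personal time offset, so no cycle blending is ever needed. The take names and frame counts are placeholders, not ILM's data.

```python
import random

# Four long, continuous captures (total frames each), as described above.
CONTINUOUS_TAKES = {
    "take_A": 4800,
    "take_B": 4800,
    "take_C": 4800,
    "take_D": 4800,
}

def assign_takes(num_agents, seed=1):
    """Give every crowd agent a take and a start offset into it."""
    rng = random.Random(seed)
    assignments = []
    for agent in range(num_agents):
        take = rng.choice(list(CONTINUOUS_TAKES))
        offset = rng.randrange(CONTINUOUS_TAKES[take])
        assignments.append((agent, take, offset))
    return assignments

def frame_for(assignment, shot_frame):
    """Which source frame of the capture plays at a given shot frame."""
    _agent, take, offset = assignment
    return (offset + shot_frame) % CONTINUOUS_TAKES[take]

crowd = assign_takes(80_000)
print(frame_for(crowd[0], shot_frame=120))
```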

Plug-ins were written within Maya to populate the stadium and pick up the cycles that were needed for each shot. RenderMan managed to render out the very big scenes in a remarkably short eight hours. “For those shots, we used what we call rib-archiving,” Alexander adds. “We didn’t load the whole stadium and crowd into memory; we only loaded the parts that were visible from the camera.” The crowd simulation was then combined — using Shake and in-house Comp Time — with bluescreen live-action elements for the foreground and mid-ground spectators.
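The visibility-driven loading Alexander mentions boils down to culling: only the geometry archives whose bounds the camera can see are referenced into the scene, so the full stadium never has to sit in memory. The sketch below shows the principle with a simple cone test against bounding spheres; the file names, the cone approximation of the frustum and the function signatures are illustrative assumptions, not ILM's rib-archiving tools.

```python
import math

def in_view(bound_centre, bound_radius, cam_pos, cam_dir, fov_deg):
    """Return True if a bounding sphere overlaps a simple view cone."""
    to_centre = [c - p for c, p in zip(bound_centre, cam_pos)]
    dist = math.sqrt(sum(v * v for v in to_centre))
    if dist < 1e-6:
        return True
    # Angle between the (normalized) camera direction and the bound's centre.
    dot = sum(a * b for a, b in zip(cam_dir, to_centre)) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    # Widen the cone by the angular size of the bound itself.
    return angle <= fov_deg * 0.5 + math.degrees(math.asin(min(1.0, bound_radius / dist)))

def visible_archives(sections, cam_pos, cam_dir, fov_deg=45.0):
    """sections: {archive_path: (centre, radius)} -> only the paths worth loading."""
    return [path for path, (centre, radius) in sections.items()
            if in_view(centre, radius, cam_pos, cam_dir, fov_deg)]

sections = {"stands_north.rib": ((0.0, 10.0, 80.0), 40.0),
            "stands_south.rib": ((0.0, 10.0, -80.0), 40.0)}
print(visible_archives(sections, cam_pos=(0.0, 5.0, 0.0), cam_dir=(0.0, 0.0, 1.0)))
```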

Double Negative built a CG eye and replaced the practical eye in more than 50 shots of Professor Mad-Eye Moody.

ILM also tackled several isolated shots throughout the movie, including a lengthy shot of an old ship rising up from the bottom of the lake and sailing out to Hogwarts. “This shot was incredibly difficult,” Alexander admits. “We had one person on it for five or six months, and up to 10 artists at the end. The problem was that we needed CG water to interact with CG sails. In other words, it required a fluid simulation engine to run and interact with a cloth simulation engine, which had never been done before! We actually used a new simulation engine based on code developed at Stanford University. It has now been integrated into our proprietary Zeno package.”

Eye-Opener Effects

Back at Hogwarts, Harry Potter participates in the first Dark Arts lesson of Professor Mad-Eye Moody (Brendan Gleeson), a character with an artificial eye. A large portion of the Moody shots was realized in-camera with make-up effects provided by Nick Dudman. However, for shots where the performance or look required enhancement, Double Negative built a CG eye and replaced the practical eye in more than 50 shots. “This required us to work without the use of tracking markers, instead using plate detail for camera tracking with our proprietary software, Photofit,” Sidoli notes. Double Negative then focused on the creation of an unfortunate spider that Moody kills using a powerful spell. The artists started by adopting a pet spider to be used as a reference for look and behavior. “The first level of modeling included all the major details, such as body, limbs, fangs and eyes, and was carried out in Maya,” Sidoli continues. “Then, a second level of detail, such as the small bumps and undulations on the surface of the spider’s exoskeleton, was added in ZBrush. This enabled us to interactively sculpt the high-resolution detail required and still keep the mesh very economical throughout the pipeline.”

The sequence was shot clean without any spider placeholders in frame, but with additional maquette, lighting and tracking reference passes. The real spider was also photographed under similar lighting conditions, which provided vital information as to the translucent qualities of its body. This led to the development of subsurface and translucency shaders on the CG model. The 3D layers were then integrated into the classroom plates using the real spider as a look reference.

Double Negative’s artists adopted a pet spider to use as a reference for look and behavior. The first level of modeling was done in Maya while a second level of detail was added in ZBrush.

Moody is also at the center of another effects scene in which he loses his temper with Draco Malfoy (Tom Felton), transforming him into a ferret that is tossed around in mid-air. The scene was realized by Cinesite, with visual effects supervisor Simon Stanley-Clamp and CG supervisor Ivor Middleton coordinating the project. “The CG ferret had to match the practical ferret featured in the sequence. It was modeled in Maya, lit in KIK and rendered out in RenderMan,” Stanley-Clamp explains. “The transformation is initiated when Malfoy is sent into a spin, an effect realized through a combination of particles and practical elements manipulated in Inferno. We also shot Tom and the ferret on motor-driven turntables against a bluescreen on stage. The plates were then re-sped and repositioned to match the original shoot.”

The scene was part of Cinesite’s 230-shot assignment for the movie, which included a major pull-back on Harry leaving Hogwarts via the bridge and making his way up to the owlery tower. The shot combined motion-control plates of 1/10th-, 1/12th- and 1/24th-scale model elements, a live-action motion-control plate of Harry and another character, a CG mountain environment, CG trees, numerous practical snow elements and a CG owl. Cinesite also produced the many miniature elements for the movie, a massive task supervised by Jose Granell. The resulting plates were then used as background elements, either for exterior Hogwarts environments or for interior sets, such as the grand marble staircase in the ballroom.

This sequence was shot clean without any spider placeholders, but with additional maquette, lighting and tracking reference passes. The real spider was also photographed under similar lighting conditions.

Transformations

Later in the movie, Moody is back at center stage as he transforms himself into another character. The 10 challenging shots were awarded to Buf Compagnie, Paris. “In the initial concept, Moody peeled his skin away to reveal the other character hidden underneath, but this proved to be too gory,” visual effects supervisor Stéphane Ceretti notes. “After a lot of R&D, we finally opted for a more traditional 3D morphing effect, at least for the close-ups, in which we manipulated the image to show the skin loosening up all over the face. Whenever the distortion pushed the plate information too far, we rendered CG skin to help maintain the resolution. The facial animation was first keyframed and then augmented with CG dynamics, such as gravity. For the main transformation shot, the two actors were photographed on the same set, without bluescreen, but with tracking markers. Since the body shapes were so different, though, we ended up having to rebuild the characters in 3D so as to have better control over the transition. The models were created using photogrammetry techniques. All our shots were accomplished with proprietary software.”
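At its core, the "traditional 3D morphing" Ceretti mentions relies on blending two models that share the same topology. The toy Python sketch below shows that vertex-by-vertex interpolation; the meshes, weights and frame count are placeholders, not Buf's proprietary setup.

```python
def morph(verts_a, verts_b, weight):
    """Linearly blend two matching vertex lists; weight 0 -> A, 1 -> B."""
    if len(verts_a) != len(verts_b):
        raise ValueError("morph targets need identical topology")
    return [tuple(a + weight * (b - a) for a, b in zip(va, vb))
            for va, vb in zip(verts_a, verts_b)]

# Two toy "heads" with matching vertex order (placeholders only).
head_moody = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.5, 1.2, 0.0)]
head_other = [(0.0, 0.1, 0.0), (1.1, 0.0, 0.2), (0.5, 1.5, 0.1)]

# Animate the transition over 24 frames.
frames = [morph(head_moody, head_other, f / 23.0) for f in range(24)]
```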

The Orphanage created 36 shots. Associate visual effects supervisor Kevin Baillie explains that the vfx house was responsible for Dumbledore’s memory threads and the pensieve bowl.

Buf also tackled a scene in which the face of Sirius Black (Gary Oldman) appears to Harry in a fireplace, the logs magically forming his features as he speaks to Harry. Initially, the production had thought that the effect could be realized by combining a bluescreen plate of Oldman with a real fire. However, Mitchell favored a fully CG approach that allowed for a real camera move. “We knew we had to build CG logs and, somehow, use Gary’s performance to drive their animation,” Ceretti comments. “This was no easy task, as the margin between a magical effect and a clumsy animation was rather thin… To start with, we filmed Gary performing the scene in front of five synchronized cameras positioned in a 180° arc around him. We then tracked his facial movements with our proprietary motion capture software and applied that data to our CG logs. The textures were a careful blend of wood elements and Gary’s face. The animation was then augmented with ashes and particles falling from the moving logs. Finally, we rendered out the shots in mental ray.”

A Bowl of Memories

Meanwhile, showcasing a new trick from his bag of magic spells, Professor Dumbledore (Michael Gambon) uses a “pensieve” to re-live an old memory and search for a clue he had missed the first time. The two sequences were realized by The Orphanage, which created 36 shots under Jonathan Rothbart’s supervision. “There were really two breeds of shots that we worked on: one showcased the memory threads as Dumbledore extracts them from his temple, and the other focused on the pensieve bowl itself,” explains associate visual effects supervisor Kevin Baillie. “The memory threads were achieved by creating a tight grouping of extruded tubes around a central ‘control curve.’ The secondary thread motion was controlled by noise functions and a few sets of rules constraining the extents of their ‘poofiness.’ That grouping was rendered in mental ray for Maya, and supporting passes, such as a subtle reflective goo surrounding the thread grouping, were created in 3ds Max and rendered using SplutterFish’s Brazil r/s.”
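Baillie's description of tubes grouped tightly around a central control curve, with noise constraining their "poofiness," can be sketched in Python as follows. Everything here, from the curve to the sine-based wobble and its parameters, is an assumption chosen for illustration rather than The Orphanage's actual setup.

```python
import math
import random

def control_curve(t):
    """Centre curve of the memory thread, parameterised 0..1."""
    return (0.3 * math.sin(3.0 * t), t * 10.0, 0.3 * math.cos(2.0 * t))

def thread_points(num_tubes=7, samples=50, radius=0.08, wobble=0.03, seed=2):
    """Return per-tube point lists grouped tightly around the control curve."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(num_tubes)]
    tubes = []
    for phase in phases:
        pts = []
        for i in range(samples):
            t = i / (samples - 1)
            cx, cy, cz = control_curve(t)
            # Base offset around the curve plus a noisy "poofiness" term.
            ang = phase + 4.0 * math.pi * t
            r = radius + wobble * math.sin(17.0 * t + phase)
            pts.append((cx + r * math.cos(ang), cy, cz + r * math.sin(ang)))
        tubes.append(pts)
    return tubes

threads = thread_points()
```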

The pensieve bowl itself consisted of a huge number of elements created and rendered in 3ds Max using Brazil r/s. “For the liquid surface motion, we simply used the built-in Reactor spring mesh solver that ships with 3ds Max. It was one of the instances where the simplest, most elegant solution ended up looking the best. The threads inside the bowl were animated in Maya and transferred over to Brazil for rendering. The final look was achieved, not in 3D, but in 2D, by combining about 20 different passes in After Effects. This meant treating them with a variety of glows, masks, distortions and adding light rays using Trapcode’s excellent Shine plug-in.”
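The spring-mesh approach to the liquid surface can be illustrated with a tiny mass-spring grid: each height sample is pulled toward the average of its neighbours and damped, so a disturbance ripples outward across the bowl. The constants and grid size below are illustrative guesses, not the internals of the Reactor solver.

```python
GRID = 20          # vertices per side of the surface grid
STIFFNESS = 40.0   # spring constant pulling each vertex toward its neighbours
DAMPING = 0.92
DT = 1.0 / 24.0    # one frame at 24 fps

heights = [[0.0] * GRID for _ in range(GRID)]
velocities = [[0.0] * GRID for _ in range(GRID)]
heights[GRID // 2][GRID // 2] = 1.0   # initial disturbance (a memory dropped in)

def step(heights, velocities):
    """Advance the surface one frame: neighbour springs pull each vertex
    toward the local average height, then velocities are damped."""
    for i in range(GRID):
        for j in range(GRID):
            neighbours = [heights[i + di][j + dj]
                          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= i + di < GRID and 0 <= j + dj < GRID]
            target = sum(neighbours) / len(neighbours)
            accel = STIFFNESS * (target - heights[i][j])
            velocities[i][j] = (velocities[i][j] + accel * DT) * DAMPING
    for i in range(GRID):
        for j in range(GRID):
            heights[i][j] += velocities[i][j] * DT

for frame in range(48):
    step(heights, velocities)
```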

Alain Bielik is the founder and special effects editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications and occasionally to Cinéfex. He recently organized a major effects exhibition at the Musée International de la Miniature in Lyon, France.
