Alain Bielik finds out the horrifying secrets behind the monster in the new Korean film, The Host.
Spaceships are pleasant to do, invisible effects are challenging, and weather vfx are fine, but when it comes to sheer fun, nothing comes close to the excitement of creating a good old movie monster. There is arguably no better showcase for a visual effects studio than a giant monster thrashing around in a city! In The Host, the city is Seoul, South Korea, and the monster is a 40-foot-long creature that emerges from the river. The unique creature was conceived by director Bong Joon-ho and visualized by designer Jang Hee-chul. Based on detailed artwork, a maquette was then created at WETA Workshop in New Zealand and scanned at Gentle Giant Studios in Los Angeles.
When Visual Effects Get Global
To create the 125 vfx shots of the movie, the director turned to The Orphanage in San Francisco. The company set up a team led by veteran vfx supervisor Kevin Rafferty, executive producer Marc Sadeghi, CG supervisor Shadi Almassizadeh and vfx producer Arin Finger. "While The Orphanage had done a few CG characters in the past, there wasn't really an established creature pipeline and workflow," Rafferty admits. "We used The Host as a vehicle to leverage our existing Brazil-based pipeline and workflow into one that was more robust. The largest addition to our workflow was establishing a creature department and its supporting software, and feathering that into our existing pipeline."
One Creature, Two Teams
With a tight creature build window, the approach was divide and conquer. One team, led by Stephane Cros, did early rigging development for the body and face, which required its own unique controls. A separate model team, led by Brook Kievit and Sasha Pouchkarev, split the scan data up using Paraform, and built the model using Autodesk Maya 6.5 and Nevercenter Silo 2.0. ZBrush was employed to add detail at the modeling and texture mapping stage. "By dividing up the team, we were able to get temp rigs to the animation team much earlier to begin walk cycle testing," creature supervisor Corey Rosen notes. "How the creature moved helped the director envision how the model should look. So, crucial questions got answered in the proxy model stage, meaning less time would be wasted later revising a further developed high-resolution model."
In the modeling stage, the crew adopted a polygon box modeling technique to ensure that the Host would have the right amount of detail and that the geometry's edge flow would allow for extreme deformation. "This turned out to be a key decision and proved very beneficial, due to the extreme poses and actions the Host had to perform," Rosen continues. "For facial performance animation, we adopted a live cluster-based facial animation system. This approach saved a huge amount of time and was extendable to a number of special-case rig situations." A key strategy was the use of a multi-skeleton rigging system. "It allowed us the flexibility to develop new replacement (or customized-for-shot) rigs on an as-needed basis, and the interfaces necessary for plugging them into the actively flowing creature pipeline. The Host's rig, quite simply, would have been too heavy, slow and cumbersome for any artist's machine to handle if it was monolithic. By developing and implementing a multi-rig system, we were able to add layers of complexity throughout the development of a shot."
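The Orphanage's actual rigging code isn't public, but the multi-rig idea described above can be sketched in miniature: several interchangeable rigs satisfy one common interface, so a shot can run on a lightweight proxy while blocking and swap in a heavier variant late in production. All class and field names here are hypothetical, invented for illustration.

```python
# Illustrative sketch of a multi-rig system. Every rig variant exposes the
# same interface, so downstream shot code never cares which one is bound.
# Names and numbers are made up; this is not The Orphanage's pipeline.

class Rig:
    """Common interface every rig variant must satisfy."""
    name = "base"
    def pose(self, frame: int) -> dict:
        raise NotImplementedError

class ProxyRig(Rig):
    """Lightweight skeleton for blocking and walk-cycle tests."""
    name = "proxy"
    def pose(self, frame):
        return {"joints": 40, "frame": frame}

class HeroRig(Rig):
    """Full-detail rig with the muscle and facial layers enabled."""
    name = "hero"
    def pose(self, frame):
        return {"joints": 600, "muscles": True, "frame": frame}

class CreatureShot:
    """Binds to whichever rig variant the current stage of work calls for."""
    def __init__(self, rig: Rig):
        self.rig = rig
    def swap_rig(self, rig: Rig):
        # In a real pipeline, animation curves would be retargeted here.
        self.rig = rig
    def evaluate(self, frame: int) -> dict:
        return self.rig.pose(frame)

shot = CreatureShot(ProxyRig())
print(shot.evaluate(101))   # cheap evaluation while blocking
shot.swap_rig(HeroRig())    # layer in complexity as the shot matures
print(shot.evaluate(101))
```

The point of the pattern is that animators never pay for complexity they aren't using yet, which matches the article's "add layers of complexity throughout the development of a shot."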
To answer the specific needs of the project, The Orphanage developed a proprietary muscle rigging pipeline named oMuscle. The tool was based on a Maya plug-in made by Comet Digital. The crew also wrote a suite of customized creature TD muscle tools to make the process of simulating muscle dynamics fast and efficient. 3ds Max 8's pelt-mapping interface was employed to cut down on the time required to UV the massive creature. Maps were created in Adobe Photoshop and implemented in 3ds Max. All the elements were then combined in After Effects under the supervision of compositing supervisors Steve Jaworski and Alex Prichard.
Adapting to a Unique Design
When it came to animating the creature, the search for appropriate reference footage became crucial. "We gathered information from both real life and past films," Rafferty recalls. "We looked at the T-Rex in the Jurassic Park movies for weight and mass reference. We looked at Draco from Dragonheart to see how rain looked on a large reptilian skin. And we looked at Predator and Blade II to see how other films dealt with a multi-faceted mouth, or maw. We even went to the fish market and bought a trout and a bass. We brought them back to the studio, poured lighter fluid on them and set them on fire! We filmed this for reference on how a fish skin would look if it was set on fire."
One of the main concerns of animation supervisor Webster Colcord was making the highly unusual design come off believably in terms of the physics of locomotion. "Luckily, one of our animators, Bruce Dahl, has a tremendous library of reference footage that we pulled from," Colcord explains. "We tried to make the Host a little clumsy on land in the first few appearances, but as the movie progresses, the creature becomes more comfortable on land. For the swimming scenes, we modeled the motion after a crocodile: the arms were not involved in propulsion. When she uses her tail to grab things or swing, we looked at some incredible footage of a huge snake climbing a tree. In some of those tail-grabbing shots, we had to use a tail-only rig wherein the tail geometry would deform along a spline, but most of the time, we tried to do all of the required actions of the tail with one rig."
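The tail-only rig Colcord describes, where geometry deforms along a spline, is usually driven by a spline-IK setup inside Maya. As a toy illustration of the underlying idea, the sketch below distributes tail joints along a Catmull-Rom curve defined by a few animatable control points; the control positions and joint counts are invented for the example.

```python
# Toy sketch of a "tail-only" spline rig: tail joints are placed along a
# curve defined by a handful of control points. A production rig would use
# Maya's spline-IK solver; this only demonstrates the core geometry.

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment at parameter t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * ((2 * b) + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def joints_along_spline(controls, n_joints):
    """Place n_joints evenly (in parameter space) along the control curve."""
    # Duplicate the end controls so the curve passes through them.
    pts = [controls[0]] + list(controls) + [controls[-1]]
    segments = len(controls) - 1
    out = []
    for i in range(n_joints):
        u = i / (n_joints - 1) * segments
        seg = min(int(u), segments - 1)
        out.append(catmull_rom(pts[seg], pts[seg + 1],
                               pts[seg + 2], pts[seg + 3], u - seg))
    return out

# Four control points curling the tail upward, as if wrapping a target:
controls = [(0, 0, 0), (4, 1, 0), (6, 4, 0), (5, 7, 0)]
for joint in joints_along_spline(controls, 7):
    print(tuple(round(c, 2) for c in joint))
```

Animating just the few control points (rather than every tail joint) is what makes such a rig practical for tail-grabbing shots.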
Colcord's first major assignment was to develop "the ubiquitous run cycle for our creature." "For running, I looked at the famous legless performer, Johnny Eck, in the movie Freaks, as well as other legless people. When you look at those real-life examples, it's all about the transfer of weight from one shoulder to another. The difference was that our creature has a tremendously long and heavy tail. So, our reasoning was that this tail would only lift off the ground when she reached full run speed. The nearest reference we could find for the weight and shape of her body was very large sea lions. So, we kept that in mind while animating weight passes: bounce and jiggle in the creature's belly and on down through the tail."
Because of the tight schedule, the team was not able to put together either an expression-driven or cloth-simulation-driven system to automate the animation of the flippers. And the creature design featured a lot of flippers and other dangly bits. As a result, animators spent a lot of time creating and tweaking hand-keyed flipper animation. The CG team later added a layer of muscle jiggle simulation to the masses in the arms and belly.
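A jiggle layer of the kind added to the arms and belly is commonly modeled as a damped spring: each simulated point lags behind its keyframed target, overshooting and settling to add secondary bounce on top of the hand-keyed motion. The sketch below is a minimal, one-dimensional version with made-up parameters, not The Orphanage's actual simulator.

```python
# Illustrative damped-spring "jiggle" layer: the simulated position chases
# the keyed animation target, producing overshoot and settle. Stiffness
# and damping values are invented for the example.

def jiggle(targets, stiffness=0.2, damping=0.75):
    """Given per-frame target positions (the keyed animation), return the
    spring-lagged positions that supply the bounce/jiggle layer."""
    pos = targets[0]
    vel = 0.0
    out = []
    for target in targets:
        vel += (target - pos) * stiffness   # spring pulls mass toward target
        vel *= damping                      # damping bleeds off energy
        pos += vel
        out.append(pos)
    return out

# A belly point keyed to drop suddenly at frame 5; the jiggle layer
# overshoots past -2.0 and then settles:
keys = [0.0] * 5 + [-2.0] * 20
for frame, p in enumerate(jiggle(keys)):
    print(frame, round(p, 3))
```

Because the layer is purely additive on top of the animators' keys, it can be tuned or disabled per shot without touching the performance.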
Creating Physical Interactions
To better tackle the many interactions between the creature and the environment, matchmove supervisor Tim Dobbert developed an innovative tool that sped up the whole animation process. "It was a system that could stack up image planes with alpha channels for rotoscoped elements and render right out of Maya, quickly and easily," Colcord explains. "This became a totally essential tool which allowed us to bypass slap comps for directorial approval. Once we had a temp or final rotoscoped element, we could easily get the creature deep into the plate to start to make judgments about integration."
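At the heart of any such plate-stacking tool is the standard premultiplied "over" operator, which layers a rotoscoped element (with its alpha) on top of whatever sits behind it. As a hedged sketch of that core operation only, the code below composites single RGBA pixels; the function names and sample values are invented, and Dobbert's actual tool worked inside Maya rather than in standalone Python.

```python
# Sketch of the compositing core of a plate-stacking tool: the Porter-Duff
# "over" operator applied layer by layer. Pixels are (r, g, b, a) tuples
# in premultiplied 0-1 form. Values below are arbitrary examples.

def over(fg, bg):
    """Porter-Duff 'over' for one premultiplied RGBA pixel."""
    fr, fg_, fb, fa = fg
    br, bg_, bb, ba = bg
    inv = 1.0 - fa
    return (fr + br * inv, fg_ + bg_ * inv, fb + bb * inv, fa + ba * inv)

def flatten(layers, background):
    """Composite layers in order (bottom layer first) over the background."""
    out = background
    for layer in layers:
        out = over(layer, out)
    return out

plate    = (0.2, 0.3, 0.4, 1.0)   # live-action background pixel
creature = (0.1, 0.4, 0.1, 1.0)   # opaque CG creature render
actor    = (0.25, 0.2, 0.15, 0.5) # premultiplied, 50%-alpha roto element
print(flatten([creature, actor], plate))
```

Stacking roto'd plate elements this way is what lets an animator judge whether the creature reads as sitting "deep in the plate" without waiting for a compositor's slap comp.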
Any time the creature or her shadow had to interact with the environment, the modeling department built proxy geometry that could be used by the animators and TDs to achieve visually believable contact. "We also needed to develop a water system that could be isolated and integrated into the live-action water," Rafferty adds. "To achieve this, we employed a combination of two 3ds Max plug-ins. We used Reactor to displace the water surface and generate the creature's wake, and we used PFlow for the wake foam. We had HDRI information for every vfx shot, which provided us with an accurate reflection environment when it came to rendering the water. There was one shot where we did need to generate the entire water surface. We knew this in advance, so we took extensive 360-degree panorama images of the environment as it was lit for the shot."
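The shots themselves used Reactor's dynamics to displace the surface, but the general idea of wake displacement can be illustrated with a toy height field: a damped ring of ripples expanding outward from the point where the creature disturbs the water. The formula and every constant below are invented for the illustration and bear no relation to Reactor's solver.

```python
# Toy analogue of water-surface displacement for a wake: a damped sinusoidal
# ring expanding from the disturbance point, sampled on a grid. Purely
# illustrative; the real shots used 3ds Max's Reactor dynamics.
import math

def wake_height(x, y, cx, cy, t, speed=2.0, wavelength=1.5, decay=0.4):
    """Height offset at (x, y) from a ripple that started at (cx, cy) at t=0."""
    r = math.hypot(x - cx, y - cy)
    if r > speed * t:
        return 0.0   # the wavefront hasn't reached this point yet
    phase = (r - speed * t) * (2 * math.pi / wavelength)
    # Ripples fade with distance from the disturbance.
    return math.exp(-decay * r) * math.cos(phase)

# Sample a small grid 1.5 seconds after the creature breaks the surface:
for y in range(5):
    print([round(wake_height(x, y, 2, 2, 1.5), 2) for x in range(5)])
```

In a production setup this height field would drive vertex displacement on the water mesh, with the foam particles (the article's PFlow layer) emitted along the moving wavefront.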
For interaction between the creature and the actors, Rafferty and his team employed different methodologies. Digital doubles of the main actors were used in medium to wide shots. For medium to close-up shots, the crew filmed the actors or their stunt doubles. Rigs and harnesses were employed for shots where the creature was grabbing a character. The rigs were designed so that either the CG creature would fully cover the rig, or some rig/wire removal would be done on the plate, or both. For a handful of close-up shots in which the creature regurgitates a character, the crew shot the action with a full-size practical head puppet built by John Cox's Creature Workshop in Australia.
The most complex actor/creature interaction occurs when the Host lowers one character with her tail and regurgitates another one with her mouth. "At the outset of production, we had made the conscious decision not to use blue screen or motion control," Rafferty remarks. "This would give director Bong and cinematographer Kim much more freedom for camera placement and movement." The shot posed many challenges for the artists. First of all, plate restoration was a big one: there were so many rigs and cables in the plate supporting the actors. The crew did shoot a clean plate, but it was not accurate to the action plate's motion. The roto/paint crew utilized these clean plates where they could, but often worked with the matchmove artist and projected a single frame (frame-averaged to remove grain) into the CG environment. That element would then be used for rig removal and paint, adding back the grain once finished.
Before the artist animated the creature, he first needed to use proxy characters and match the animation to the live-action actors' motion. His next challenge was to create a forceful, believable performance for the creature that obeyed the limitations of that matched animation. Once the creature was lit and the first composites were coming through, the team quickly saw another challenge: the tail was now releasing around the actor, but the clothing wasn't pinching and bunching up. The compositor very carefully mesh-warped and morphed his way through this problem. This was the last shot finaled on the show!
Matching Practical Effects
In another shot that required on-set physical interaction, the creature drops down from the top of frame and lands on a truck. "In a Hollywood production, the truck would have been gutted and collapsed by pulling internal cables," Colcord notes. "But the Korean crew dropped two enormous weights from a crane onto the truck. We first had to paint them out. Then, we timed the front half of the creature to land on the truck with the first weight, and her tail landed in the bed of the truck on the impact of the second weight. She was scaled at her real-world size in that shot and locked onto match-moved truck geometry as she landed. Unfortunately, after the truck stopped shaking, there wasn't a lot we could do with her in animation: any little movement would have meant that the truck should react, and we would have had to do a fully CG truck, which wasn't in the budget."
For The Orphanage, the project definitely marked the beginning of a new era. "It's amazing to see what The Host has helped our studio become, technically and creatively," Rosen concludes. "I am excited to grow the creature pipeline developed for this movie on our summer and fall slate of effects work and fully CG feature productions later this year."
Alain Bielik is the founder and editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications and occasionally to Cinéfex. Last year, he organized a major special effects exhibition at the Musée International de la Miniature in Lyon, France.