Using Unreal Engine-based virtual production tools, VFX supervisor Kevin Baillie helps the Oscar-winning director pre-shoot every live-action scene needing digital characters, including mice, chickens, and a cat.
Innovative, Oscar-winning director Robert Zemeckis' approach to his latest film, The Witches, based on the Roald Dahl novel of the same name, called for extensive animation and VFX work to bring the story to life. While the 1990 version, starring Anjelica Huston, primarily made use of practical effects to tell the story, Zemeckis wanted to leverage all the digital technology developed since then to create a much more photoreal film. This would require extensive character and creature animation, set extensions, compositing, face replacement work, and more to create a magical world filled with frightening witches and children transformed into mice.
VFX supervisor Kevin Baillie, who previously collaborated with the director on highly inventive features such as A Christmas Carol and Welcome to Marwen, chose to bring the entire job to Method Studios, which had recently acquired VFX powerhouse Atomic Fiction (co-founded by Baillie) and possessed the needed production capacity. Method Studios' talent and studio locations around the world, with the eclectic skillsets necessary to handle the massive assignment, were perfectly positioned to provide the type of flexibility Zemeckis demands of the VFX process.
Set in 1967 at an Alabama resort that happens to be hosting a witches’ convention, The Witches stars Octavia Spencer as the grandmother of a young boy (Jahzir Bruno) - Charlie in the book, simply Hero Boy here - who, along with friends Bruno and Daisy, is transformed by the malevolent Grand High Witch (Anne Hathaway) into the body of a mouse.
In addition to developing and animating mice bodies hosting the children's personalities, the massive VFX job also involved creating the frightening reveal of the Grand High Witch's true countenance with decidedly non-human features, her all-CG cat companion, and a great deal more creature and virtual set work. The film's visual effects ultimately required coordinated efforts from Method facilities in Vancouver; Montréal; San Francisco; Melbourne; Pune, India; New York; and Los Angeles.
A Virtual Production Workflow
For Zemeckis, it was important to direct the CG characters’ performances with the kind of flexibility and interactivity normally associated with directing flesh-and-blood actors. It simply wouldn't do to shoot and design plates where directorial decisions about pacing, coverage, camera movement, and so much else would be locked in, with visual effects that had to conform to those decisions.
Baillie brought Method in at the start so that Zemeckis, production designer Gary Freeman, cinematographer Don Burgess, ASC and others could be involved before any physical sets were constructed, live-action scenes shot, or CG characters voiced. "Visual effects didn't just get started after production," Baillie explains. "They were part of prep."
Every set with any virtual characters in it was essentially built three times. First, Method created the sets virtually based on input from Zemeckis, Freeman, and other department heads. This data was then ported into Unreal Engine so that Zemeckis could make decisions and adjustments to performances and camera movement; Freeman could then adjust his set designs based on Zemeckis' choices. The second version was the physical construction of the actual sets. The third was the final VFX Method subsequently produced.
"Visual effects is part of the shoot," Baillie elaborates, “and it's part of post-production. And by VFX informing the process early, it benefits the film as a whole."
The character animation started out with initial character designs, built primarily in Maya at Method's LA studio, with some additional work handled in Vancouver. "These were very simple versions of the mice," Baillie says. "They were sort of video game quality versions with some rigging and animation cycles - instructions like 'run', 'turn left', 'turn right', 'stand up' - all sorts of canned actions that you can use as building blocks for animation."
They also took sets from Freeman's designs and built virtual versions in Maya; those were imported into Unreal Engine where first-pass lighting was created, then synchronized to a server Method shared with British VFX studio NVIZ. NVIZ would then take Method's virtual sets, populate them with animation from Method, NVIZ, and the production’s own in-house team and, with that “witch’s brew” of ingredients, facilitate the director's virtual camera layout sessions.
In these sessions, where Zemeckis himself operated a virtual camera with real-world lens equivalents, he could pre-direct every one of the film’s scenes that had digital characters. That process would in turn inform both Freeman's subsequent building of the physical sets and Method's character animation and virtual set work that would follow in post.
During principal photography, the production used these same virtual set environments in Unreal Engine, combined with Ncam for live camera tracking, to visualize set extensions and character animation “in-camera,” in real-time as they were shooting. Baillie recalls, "The ability to show our entire crew the architecture of a full room rather than simply a blank bluescreen, or mice characters running through a set rather than trying to illustrate their paths with a laser pointer, was invaluable for creating believable shot designs."
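The core of that in-camera visualization is mirroring the physical camera's tracked pose and lens onto the CG camera every frame, so virtual set extensions line up with the live plate. The sketch below illustrates the idea only; the field names and `update_virtual_camera` helper are hypothetical, and Ncam's actual data protocol and Unreal Engine's camera API differ.

```python
from dataclasses import dataclass

@dataclass
class TrackedCamera:
    """One frame of camera-tracking data (illustrative fields only;
    a real tracking system such as Ncam delivers richer lens data)."""
    position: tuple        # world-space (x, y, z)
    rotation: tuple        # (pitch, yaw, roll) in degrees
    focal_length_mm: float

def update_virtual_camera(virtual_cam: dict, tracked: TrackedCamera) -> dict:
    """Mirror the physical camera's pose and lens onto the CG camera
    so rendered set extensions register with the live-action plate."""
    virtual_cam["position"] = tracked.position
    virtual_cam["rotation"] = tracked.rotation
    virtual_cam["focal_length_mm"] = tracked.focal_length_mm
    return virtual_cam

# Per-frame loop (sketch): track -> update virtual camera -> render CG
# -> composite over the keyed bluescreen plate for the video village.
virtual_cam = {}
frame = TrackedCamera(position=(1.2, 0.0, 1.6),
                      rotation=(0.0, 90.0, 0.0),
                      focal_length_mm=35.0)
update_virtual_camera(virtual_cam, frame)
```

In a real setup this loop runs at the camera's frame rate, which is what makes the "full room rather than a blank bluescreen" view possible on set.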
All this effort, Baillie explains, meant that by the time work began on all the final CG characters and environments, they'd already gone through a significant process of trial and error.
Zemeckis required interactive decision making throughout the entire VFX creation process, so Method built flexibility into its rigs, skeletons, and other assets to ensure that was possible. But thanks to the virtual production process, Baillie explains, "everyone was starting further along than they ever would have otherwise."
Building the Better Mouse
The three mice needed to look realistic while also feeling to the audience like they ultimately possessed the personalities of the three children. Naturally, much of this was accomplished through voicing and the script. But it also required effective character animation. For the character development phase, Jye Skinn, Animation Supervisor at Method Vancouver, spent two months with Zemeckis and Baillie at the director's Santa Barbara base of operations. There they would present the director with multiple iterations of every type of mouse movement available, adjust based on the director's notes, do a blocking pass and a rough animation pass, and present the new version.
"There is a big difference between animating a creature - say a tiger or a dog - and this type of character animation,” Skinn observes. "We all know what a tiger or a dog looks like. You can download reference material and loosely copy that. But when someone's acting emotion and feeling, it's a whole different thing. Animators are always thinking about subtext and finding different ways for the character to emote without being too on-the-nose about it. It's ultimately a lot more about acting than technology."
Zemeckis, who as director was the final arbiter of what mouse behavior worked or didn't, was very eager to have interactivity with the process that isn't always part of the animation production. He wanted to be able to "direct" performance from the early stages of character creation through to the edit. Baillie knew that Method could work iteratively in a way that the director would feel comfortable with even though a lot had to happen under the hood, so to speak, to facilitate that kind of work method.
"Even if a change in the performance seems small," Skinn says, "it's not small! We might just need to make one of the mice run a little faster in the animation. It sounds like a little change, but it's a big change."
Method Vancouver took the lead in developing the mice -- their look, movement, the rigs, and all the technical backend. "We really worked with Bob on creating a holistic character with characterizations that would help make them feel more fleshed out," says Sean Konrad, VFX Supervisor at Method Vancouver. "The Montréal office also took on some of that work focusing a bit more on action-oriented sequences" as opposed to the closer, more emotional character animation.
"There are two scenes," Konrad elaborates, "where a lot of what we need to show is Daisy's frustration with Bruno. There's a moment where he gets stuck in a grate and the other two [mice] run across and pull him out. She needed to look back frustrated when he's stuck in there. Sort of like, 'Here we go again!' Or there are moments with Hero Boy - the team's name for the mouse version of Luke - and he's a little cheeky, but for the most part, he's just good. He wants to be helpful at all times and that needed to be reflected in his body language."
Naturally, the faces are the most significant indicators of a character's emotions and the design of the mice faces (what features animators can control and how) was a major step in the process. Using proprietary facial animation tools within Autodesk Maya, Konrad shares, "we made some adjustments to the type of facial rig work that has been used to control various human and animal facial movements in a number of films so that they were specifically effective for our needs with the mice and witches."
Facial rigs in general, he elaborates, "are essentially ways of describing facial movements. A smile, a sneer, pursed lips - all that - so you break all that out into sliders so that you can create a full range of motion that ultimately becomes a face. Typically, a face rig has about 200 shapes in it. For mice, they ended up with more like 400 or 500."
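The sliders Konrad describes map to blendshapes: each named shape stores a per-vertex offset from the neutral face, and the rig mixes weighted offsets into a final pose. Here is a minimal sketch of that idea; the shape names and tiny four-vertex mesh are hypothetical stand-ins for the hundreds of shapes in a production rig.

```python
import numpy as np

def evaluate_face(neutral, shape_deltas, slider_weights):
    """Blend a neutral mesh with weighted shape offsets.

    neutral: (N, 3) array of rest-pose vertex positions
    shape_deltas: dict of shape name -> (N, 3) offset from neutral
    slider_weights: dict of shape name -> slider value (typically 0..1)
    """
    face = neutral.copy()
    for name, weight in slider_weights.items():
        face += weight * shape_deltas[name]
    return face

# Hypothetical example: two of the 400-500 shapes a mouse rig might hold
neutral = np.zeros((4, 3))
deltas = {
    "smile": np.array([[0.0, 0.1, 0.0]] * 4),   # corners pulled up/back
    "sneer": np.array([[0.05, 0.0, 0.0]] * 4),  # lip raised to one side
}
pose = evaluate_face(neutral, deltas, {"smile": 0.8, "sneer": 0.2})
```

Because every expression is just a weight vector over the shape library, animators can dial sliders independently, which is exactly why doubling the shape count for the mice roughly doubles the expressive range available.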
Much of this work was accomplished by a team at Method's San Francisco location, which focused on both the mouse faces and the Grand High Witch's true, gruesome face. This group of Method artists, formerly of Atomic Fiction, had extensive prior experience working with Baillie and Zemeckis.
"We create the facial rigs to determine what they can do and where they can go," says Traci Hori, Animation Facial Development Lead. "We also figure out what shapes are needed to make the expressions that are required, primarily within Maya and the modeler Pixologic ZBrush,” she adds. "Obviously, the topology of a mouse face is a lot different from a human face. To begin with, their nose is much longer. And so, the type of expressions that they would make have to be created differently from what we would do for a human face."
A smile, for example, "which for us generally moves up and out, on a mouse would move more in a straight back way. So, we would try that. But sometimes what would happen on a mouse is not going to accomplish what Bob is looking for. The more realistic smile really didn't read well. We looked at what other movies did with the same problem. I watched The Rescuers Down Under and a lot of the CG movies out there, because obviously they had to tackle this same issue in three-dimensional space. With a lot of back-and-forth, we got to the point where we needed to be."
Other considerations, Hori explains, include questions such as, "If I were to move my eyebrows as a mouse, where would those eyebrows actually be? How would the skin fold? There is a lot of work figuring out all those types of things."
To make sure the animators could experiment with different combinations of approaches, Hori and her team added extra controls that wouldn't normally be part of a facial rig. "The cheek and nose areas are so much broader than on humans," she notes. "If we put motion [controls] just around the mouth or around the eyes, then you'd have this huge mass on the nose that just wasn't moving at all. So, we built the rig with extra controls and extra shapes, so that you could move the upper part and lower part of the nose separately to give you a kind of flow to keep it alive. The animators could choose to use it or not, but we put it there so they would have the option."
The additional mouse work, including modelling, fur, and grooming, was primarily spread out between Method's Montréal and Vancouver locations. This was the stage where characteristics of real mice were incorporated into the character designs where appropriate and rejected where not. "A mouse's heart rate is very high, 120 beats per minute," Skinn points out. "It's always twitching and moving its whiskers and we used that information sparingly because it looks weird to have a character look that jittery."
On the other hand, they did make use of some detailed research. "Mouse whiskers," Konrad learned, "have like a sinusoidal wave so that when one twitches up here, then it creates a sort of bouncy curve there. There are about 25 whiskers on each side of the face and when the front ones react to something, you'll have a wave that goes all the way to the back. It's an intrinsically mousy thing that we did make use of."
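That front-to-back whisker wave can be modeled as a traveling, damped sinusoid: whiskers farther back react later and with smaller amplitude. The sketch below captures only the shape of the motion Konrad describes; the `speed`, `freq`, and `decay` constants are illustrative guesses, not production values.

```python
import math

def whisker_deflection(whisker_index, t, speed=8.0, freq=6.0, decay=0.15):
    """Deflection angle (radians) of one whisker at time t after a twitch.

    A twitch at the front whisker (index 0) propagates toward the back
    of the row (~25 whiskers per side) as a damped sinusoidal wave.
    """
    delay = whisker_index / speed            # wave takes time to travel back
    if t < delay:
        return 0.0                           # wave hasn't reached this whisker yet
    local_t = t - delay
    amplitude = math.exp(-decay * whisker_index)   # motion fades toward the back
    ring_down = math.exp(-2.0 * local_t)           # each whisker settles over time
    return amplitude * math.sin(2 * math.pi * freq * local_t) * ring_down

# Sample one side of the face a quarter second after the front whisker twitches
pose = [whisker_deflection(i, 0.25) for i in range(25)]
```

Sampling the whole row like this at each frame gives the cascading "bouncy curve" effect as a procedural layer, which animators can then dial down so it reads as mousy rather than jittery.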
You've Got a Hideous Smile!
While the witches at the convention appear quite normal at first, it is soon revealed that they are in fact quite frightening: their mouths open in a very non-human fashion, sporting terrifying grins that literally stretch from ear to ear. Dahl's book uses some very frightening language to describe this, which provided a starting point for the VFX teams.
First, they developed the facial rig for the animation which would eventually be composited onto the live-action footage of Hathaway. This proved challenging, both in terms of making it believable technically, but also in searching for an effect that's appropriate for a children's movie.
"When she would get angry, she would rip the side of her mouth into this giant smile with monster teeth," recalls Christian Emond, DFX Supervisor at Method Montréal. As early passes of the animation made their way to Santa Barbara, Emond notes, "people felt it looked too scary. It started out as a two-dimensional drawing and that was OK. Everybody in Montréal was pushing to make the 3D version as scary as possible - really photorealistic, with a mouth that tears open and details like saliva - and it looked very much like real life. Once we had a photorealistic 3D version, we got word, 'Can you have this at 50%? Maybe at 25%?’"
As with the mice facial rigs, the San Francisco team handled the Grand High Witch's facial rig for these moments. "It isn't just developing the rig," Hori points out. "It's also about figuring out what shapes are needed to make the expressions that are required. And in the case of the Grand High Witch, we needed to make sure that, of course, the shapes ultimately came together to look like Anne Hathaway," since all the CG facial work would need to blend seamlessly with Hathaway's face.
This too was an iterative process. As Zemeckis, Baillie, and others requested subtle changes in the grin's characteristics, Hori would rebuild the facial rig, an elaborate process; every small change could affect a whole series of other movements and, in turn, require rebuilding the shapes necessary to ensure the grin still looked like it belonged to the witch doing the grinning.
A Cat, a Chicken, and More
While the other Method locations focused on the above tasks, Method Melbourne, led by VFX Supervisor Glen Melenhorst, took on additional shots, including some creature animation of the Grand High Witch's cat and a chicken, among others, as well as some set extensions.
"A problem with things like cats is they're pretty much loose skin on a bag of bones," Melenhorst says. "They change shape very fluidly. They're not like a T-Rex with really defined muscles. So, a lot of work went into the skinning, the flesh, the sliding skin, and the groom. We searched a whole lot of YouTube movies of cats and created little video grabs of cats jumping - cut them together with little bits of video that have a whole bunch of idiosyncratic movement that we wanted to capture. That included the way a cat's head turns, the way it licks its paw, and the way it turns from a sit to a stand to a walk."
The CG cat could be modelled on real cat behavior, but there was a limit to how much YouTube videos and other observation could go into the chicken body inhabited by a girl's soul. "The chicken had to look distressed and sad," Melenhorst says, "and that made it necessary for us to add some small movements to the chicken's face, particularly around the eyes, that real chickens don't have."
Bringing it all Together in a Pandemic
Baillie explains that the project was in full swing when the pandemic hit. "These days," he says, "when working with multiple VFX companies, or even a single company with teams spread out geographically, we have to be good at communicating ideas to people all over the world.”
"When COVID-19 hit and we needed teams of hundreds of artists to go from working in a handful of studios to everybody working in their living room, we were confronting an issue we were already working on solving, but had to speed up,” he adds. “We're used to working with eight locations, now let's make it 500! But fortunately, Method had been laying groundwork for this kind of communication among artists for some time."
Upfront Work Pays Dividends
Baillie stresses that by having Method handle every facet of the VFX production, from early asset creation to final delivery, he was not only able to get the level of quality he needed for everything appearing on screen, but he could ensure the whole production process provided the flexibility the director required.
"Even though it went through a third party in the middle," he says of NVIZ, "Method already had a lot of data and they'd settled on naming conventions and the rig structures and everything they needed to get started. What you really want from virtual production is a system where from the start of virtual production to the finishing of the final shot, it's not about 'build it, throw it away' and then 'build it, throw it away.’ It's more a process of starting out with this seed and letting the seed grow. Because Method was there to plant that original seed, they were the perfect company to bring everything to fruition."
The Witches is currently streaming on HBO Max. Earlier this month the film was nominated for three VES (Visual Effects Society) Awards.