Production Visual Effects Supervisor Grady Cofer and Co-Animation Supervisor Hal Hickel discuss how the most recent season of Lucasfilm and Disney+’s hit series, nominated for an Outstanding Visual Effects Emmy, took advantage of new digital tools and processes to better plan and capture more realistic acting performances in-camera, and to integrate those performances with animated CG characters and sequences.
It’s not hard to believe that The Mandalorian, which launched the Star Wars universe on Disney+ and sparked the ever-growing virtual production craze, is being considered for a fourth season or a possible feature film. The show has been a fan favorite since launching in November 2019. While it is unclear which direction Din ‘Mando’ Djarin and his apprentice Grogu will take in any new projects, there is much to be discovered about them and their story in the eight episodes of Season 3. With longtime collaborator Richard Bluff working on The Book of Boba Fett and Ahsoka, creator Jon Favreau partnered with Production Visual Effects Supervisor Grady Cofer for the first time and reunited with Co-Animation Supervisor Hal Hickel; their stellar work recently received an Emmy Award nomination for Outstanding Special Visual Effects in a Season or a Movie.
According to Cofer, “Compared to the first season, the virtual production technology has gotten faster and is able to support more complex imagery; however, the most fun tool involves character animation. You could roll camera, say ‘Action,’ start directing the scene, and then you have these triggers where this character walks or that droid rolls in the background, all playing on the wall. That’s a new aspect of the technology we wanted to take advantage of. Plus, there’s the speed of making adjustments to these environments, like the ability to relight things live.”
Previs of the camouflaged trinitaur in Episode 307 helped the production team get proper reactions from the cast during the shoot. “We took that animation cache, brought it into the volume, hit go and all of a sudden that creature starts showing up on the horizon and we’re heading towards it,” recalls Cofer. “The whole thing is lumbering. Jon Favreau didn’t want it to be attacking, but the skiff accidentally gets in the way. The ceiling in the volume is LED, so you could see everybody on the skiff start ducking as they watched that tail come down over their heads. It gave the performers something to respond to, and all of a sudden the shots presented themselves. It was like they were really out there on the skiff shooting a massive creature.”
Hickel shared animation supervisor duties with Paul Kavanagh on the show. “The trinitaur was similar to the challenges that we had on Pacific Rim, where we figured out strategies for kaiju-scale creatures and robots moving at appropriate speeds so they feel huge,” states Hickel. “When the trinitaur brings its tail down on the skiff, that can come down fast, particularly from the perspective of the humans as it smashes right down on the deck and breaks it in half. Then you’ve got an explosion of debris, which is kinetic and high-speed.”
The rescue of a Mandalorian foundling from a raptor’s nest echoes something out of a Ray Harryhausen movie. “Ray is always an inspiration,” remarks Hickel. “He would have embraced virtual production because the thing people forget, particularly when you have discussions about old-school visual effects versus digital effects, is that those pioneers were pushing the technology of their time. Ray’s Dynamation process was a refinement of rear projection techniques from the past.”

The evolution of virtual production technology continues to be driven by the desire to expand the filmmaking toolset. “You’re able to have assets early on in production, go into VR with all of your department heads together, maybe crawl up into the nest, try out some ideas, see that live and plan out your shots,” notes Cofer. “It’s a huge advantage to be able to explore things visually that way in real-time.” One example of how such new tools are being used is the dinosaur turtle, the first major creature introduced in Episode 301. “All of the logistical questions pop up,” states Cofer. “Do we shoot that in water? Do we need some kind of representation of the creature there for the camera and stunt people to see? Do we need sections of that creature for people to interact with? Those are great ideas, and usually the answer to all of that is, ‘yes.’”
Critical considerations such as proper on-set lighting always required careful planning. “When Bo-Katan attacks the spider-mech thing that has captured Mando in the cave, there’s a beat where she slides on one knee and goes underneath it with the Darksaber,” remarks Hickel. “There had to be a thing there for her to pass under so she would be properly shadowed, and we had to make sure she held the Darksaber at the right angle and that what she was reacting to was at the right height for what we were going to be adding later.”
Phil Tippett was consulted for the sequence, which has a visual aesthetic that resembles his stop-motion feature Mad God. “The descent into Mandalore was intricate and planned out,” explains Cofer. “It’s an incredible vertical drop from above the clouds to the surface of Mandalore, the ruins of Sundari, and the beskar mines. Jon wanted that descent to be nightmarish, so he reached out to Phil Tippett, who has an amazing ability to imbue his designs with a nightmarish quality, like Mad God. He and his team built a miniature set of the sewer system, took handcrafted creepy little totems and made a root system that went down into the sewer and integrated into the mechanics. They lit it, sprayed it with shellac and shot still images of it. Jon loved all of that, so we used it not only as reference but actually scanned a lot of those assets and incorporated them into our sets. I had a couple of reviews with Phil, and he would always have interesting guidance about the tricks he uses to give you that creepiness factor, and we appropriated those ideas. That shot where you see Mando and Grogu walking forward in that sewer and you can’t tell what’s mechanical or organic feels creepy. It’s all Phil Tippett’s influence.”
Season 3 featured expansive aerial battles between combatants propelled by jetpacks. “Paul Kavanagh grabbed lots of video reference, including several movies that featured jetpacks in the past, and put together this huge presentation for Jon Favreau, David Filoni and Rick Famuyiwa,” recalls Cofer. “What we discovered to be key was having good takeoffs and landings, which are not easy things to animate. But when you have a stuntperson on wires, they’re going to give you something interesting, especially when the direction is to make it messy, which is often what Jon likes. When Mando jetpacks down to the landing pad on Kalevala, he hits the ground hard and slides forward. That’s a tough thing to animate.”
Hickel goes on to explain that animating the flying is not the hard part. “Intuitively it makes sense, and there is actually some good reference now of crazy people who will strap little jetpacks to their backs with wings, so they fly [like Superman rather than just vertically],” he observes. “The tough part is when people collide, wrestle in midair or fight with weapons. There is nothing in real life that you can look at and study. We have to use our animator knowledge of, ‘Where’s the thrust coming from? How is the body suspended from the jetpack? Where is the weight? What would happen if these two people met in midair and grappled with each other?’ A lot of it is imagination, but we hope that enough of it is quasi-correct physics to make it feel real enough on screen alongside the regular live-action drama that we want the audience to care about.”
Another key sequence is when IG-12 gets modified so that Grogu can be placed inside and pilot the droid. “That largely fell to the brilliant folks at Legacy Effects,” reveals Hickel. “As the seasons of The Mandalorian have gone on, we do Grogu as a CG character less and less. There are still important times to do it, largely when there’s some action that is impossible to get practically, as opposed to a performance moment. Legacy Effects even had a full-sized, head-to-toe version of IG-12 with Grogu that could walk, with a puppeteer behind it. It was remarkable. If our heroes are running down a hallway and IG-12 with Grogu is jogging along with them, that will be CG, along with other cases where for one reason or another it had to be CG.”
Pairing with IG-12 allows Grogu to take his childish antics to the next level. “Grogu immediately taking to the controller and walking around the room bumping into things is one of my favorite moments in all of The Mandalorian’s seasons,” remarks Cofer. “To me it’s charming. It’s great that we get all these shots in camera. Mando is really doing a comic gag against a puppeted version of IG-12, like when Grogu is stealing food and stuffing his face. Mando is trying to get him to stop and holds him away. It’s like when your child is taller than you and all of a sudden you can’t stop them from getting into trouble. It’s a great moment, and they were able to mock that up practically. The cost is that you’re actually seeing the puppeteer. It’s a big removal for us. We would shoot extra plates. But that’s a small price to pay for the quality of the performance you get in camera.”
R5 was largely practical, but in Episode 308 a CG version was rendered in Unreal Engine for when the droid flies down to help Mando control some of the gates and turn off alarms. “One of the great things about virtual production is that by the time you’re shooting, you already have fleshed-out versions of the assets and environments,” observes Cofer. “For the Imperial Chamber, the big chasm underground, we had a beautiful version playing live on the walls in StageCraft as a backdrop whenever Bo and Mando were fighting Moff Gideon. We also had a very good R5 in Unreal Engine. It was something that Jon was interested in: we have great assets and a nice environment, so is there a piece of this that we could finish right out of Unreal Engine? That seemed like a great candidate. When Mando calls R5 for help and R5 starts flying in, those are all Unreal Engine renders. A great advantage of that is when Hal and Paul were guiding that animation, they would open Unreal Engine, play the animation and say, ‘Why don’t we have the camera tilt a little faster, and maybe R5 gets closer to us?’ The artist was doing all that in real-time, moving sliders and rotating. It gets played back and you’re watching the scene look like a finished piece with those new animation notes already applied. It’s a hint of where this industry might be going: this collision of high-end visual effects and real-time computer graphics.” Hickel laughs before adding, “It’s heading to a nightmare world where supervisors stand over the shoulders of animators and ask them to make changes in real-time!”