Complementing the 12-foot-tall purple villain created by Weta Digital for action-packed fight sequences, Digital Domain employed its proprietary Masquerade tool, which uses new machine-learning algorithms, to deliver emotive performances from the CG character.
How many superheroes does it take to defeat the biggest tyrant in the Marvel Cinematic Universe? All of them, it seems. At least, all of them try. But Thanos, the giant CG star of Avengers: Infinity War, is too bad and too big to fail. Directed by Anthony and Joe Russo, the Walt Disney Studios release has become the fastest film in history to capture $1 billion at the box office, achieving that high mark in only 11 days.
Actor Josh Brolin provided Thanos’s voice and the motion capture data for his face and body. Two studios, Digital Domain and Weta Digital, modeled, animated, and rendered the character.
Weta Digital created the 12-foot-tall purple villain for sequences on Titan in which Thanos fights Iron Man, Spider-Man, Dr. Strange, Mantis, and other assembled Avengers. Digital Domain put Thanos in some of the more emotional scenes with his adopted daughter Gamora (Zoe Saldana) at the beginning of the film and in later sequences, as well as in various fighting sequences with Hulk, Loki, and Thor, in the Wakanda battle, and in the dramatic conclusion of the film.
Writes Washington Post critic David Betancourt, “While it is impossible to humanize someone hellbent on destroying half of everything that exists, Infinity War manages to show there is more inside Thanos’s heart than destruction.”
And CNET’s Aloysius Low: “Thanos is wonderfully animated with a wide range of emotions from anger to joy to even sadness…. We can’t help but feel for him, despite his horrifically evil plan of galaxy-wide genocide.”
We spoke with Kelly Port, visual effects supervisor at Digital Domain, who led the team of artists that infused the CG character with Brolin’s performance. To help the animators, a proprietary tool called Masquerade, developed in time for this film, used new machine-learning algorithms to advance the facial performance capture process.
To build the character, modelers at Digital Domain worked from concept art provided by Marvel’s art department. As they developed Thanos, the overall visual effects supervisor Daniel Sudick would share the model with Weta and vice versa.
“We started off with a shared model, and sometimes would share models through updates early on, but once we were in shots, usually we communicated through reference,” Port says. “Weta would send us shots to show how Thanos looked in sequences we weren’t doing and vice versa. Dan Sudick was the gatekeeper for continuity for all the shared characters.”
For their part, Digital Domain’s artists then rigged the model to provide controls for animators, created textures and shaders that determined how the purple skin would react to lighting conditions, and set up muscle simulations and cloth dynamics. They grew fine hair on Thanos’s body and arms and stubble on his head, and gave him armor and a gauntlet.
“All that work takes quite a long time,” Port says, “and as we start to do shots, we’re still tweaking things. We’ll make tweaks on a shot-by-shot basis.”
The team used Autodesk Maya for modeling, rigging and animation; Chaos Group’s V-Ray for rendering; SideFX's Houdini for effects; a proprietary Maya plug-in called Atomic for lighting; and Digital Domain’s new Masquerade tool and their custom Direct Drive for transferring motion capture data to the CG character.
On set, the Marvel production team created a motion-capture volume using cameras integrated among props and set pieces. Actors being captured there wore typical motion capture suits. In addition, Brolin (and other actors whose CG characters would have dialog) wore a helmet equipped with two HD, 60fps cameras pointed at a set of about 150 dots on his face. The team captured face and body performances simultaneously, with Brolin on set alongside the other actors as the Russo brothers directed his performance. As is typical when an actor performs a character that will be digital, data captured from the cameras moved a low-resolution mesh of a digital face as Brolin acted.
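The geometry behind recovering those dots from two helmet cameras is standard stereo triangulation. The sketch below is a textbook linear (DLT) solver in Python with NumPy -- an illustration of the principle, not Digital Domain's actual tracking pipeline; the camera matrices and marker coordinates are hypothetical.

```python
import numpy as np

def triangulate_marker(P1, P2, uv1, uv2):
    """Recover one face dot's 3D position from two calibrated cameras.

    P1, P2: (3, 4) camera projection matrices (assumed known from calibration)
    uv1, uv2: the dot's 2D image position in each camera
    Uses linear (DLT) triangulation: each observation contributes two
    homogeneous equations; the 3D point is the null vector of the stack.
    """
    rows = []
    for P, (u, v) in ((P1, uv1), (P2, uv2)):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, Vt = np.linalg.svd(np.array(rows))
    X = Vt[-1]                     # homogeneous solution (up to scale)
    return X[:3] / X[3]            # dehomogenize to 3D coordinates
```

Running such a solver per dot, per frame, yields the moving 3D marker cloud that drives the low-resolution face mesh.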
The trick at this point for any similar project is to move this low-resolution data onto the CG model in a way that preserves as much of the actor’s subtle, nuanced expression as possible. Sometimes, talented motion editors tweak the low-res data before it’s transferred. Sometimes, to get higher fidelity than the helmet cameras can provide, the actor repeats the dialog in a separate high-resolution capture session. For this film, Digital Domain artists did something different and new.
Before Brolin’s on-set performance, the team did high-resolution captures of his face -- his expressions, the limits of his facial range, how his skin wrinkles, and so forth -- using Disney Research’s Medusa system.
“We had Josh [Brolin] do FACS shapes during a Medusa session,” Port says. “But it wasn’t just the 25 FACS shapes. We kept the camera rolling to capture his moving face because there are so many shapes between one FACS expression and the next. We can get the complexity of the underlying bones and muscles and how they interact with the skin.”
The team fed that data into the new Masquerade system, so that it would learn what Brolin’s face looked and acted like in high resolution. Then, for selected shots, they fed the low-resolution data taken from the helmet cameras during Brolin’s live performance into the system. Masquerade converted the 150 facial data points from the live performance into roughly 40,000 points of high-resolution facial motion data. The algorithms essentially converted the low-resolution data into high-resolution data based on the system’s knowledge of Brolin’s face.
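Masquerade itself is proprietary, but the core idea -- learning a mapping from roughly 150 sparse markers to roughly 40,000 dense mesh points using high-resolution training frames -- can be sketched with a simple regularized least-squares stand-in. The function names, dimensions, and linear model here are illustrative assumptions, not the studio's actual algorithm, which the article only describes as machine learning.

```python
import numpy as np

def fit_upres_model(sparse_train, dense_train, reg=1e-3):
    """Learn a map from sparse marker data to dense mesh data.

    sparse_train: (F, Ms) flattened marker coordinates for F training frames
                  (e.g. ~150 markers tracked during a Medusa session)
    dense_train:  (F, Md) matching high-res mesh coordinates
                  (e.g. ~40,000 points from the Medusa reconstruction)
    Ridge-regularized linear least squares: a toy stand-in for the
    learned Masquerade model.
    """
    X, Y = sparse_train, dense_train
    A = X.T @ X + reg * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ Y)       # weight matrix (Ms, Md)

def upres_frame(sparse_frame, W):
    """Predict a dense mesh frame from one sparse helmet-camera frame."""
    return sparse_frame @ W
```

In this toy version, "feeding it corrections" would amount to appending corrected frames to the training pairs and refitting, which mirrors the iterative improvement Port describes.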
“Darren Hendler [the head of the studio’s Digital Humans department] is the one who focused on Masquerade,” Port says. “Masquerade would automatically do a transfer that was pretty good right off the bat. Then, we would feed it corrections to get a better result and it learned. As production went on and we fed it corrections, it would get smarter about its solutions. By the end, it was pretty accurate. It wasn’t like we didn’t ever correct the facial performance. We did. But Masquerade was quite successful and saved us a huge amount of time in making sure we had the subtleties. I think that’s the magic of this whole thing. To take a low-res mesh capture from the live performance and have the same fidelity as if it had been captured separately in high-res with Medusa. It was the first time we used Masquerade. It was amazing.”
The second part of the process involved moving the data from Brolin’s face onto Thanos’s face. For this, the team used Digital Domain’s Direct Drive software. This software creates a map for each of the two faces, defines the correspondence between them, determines how different elements of their unique anatomy align, and then transfers the high-resolution Masquerade data. To help make the transfer more accurate, the team first fed Direct Drive a range of Brolin’s performances and facial exercises and modified the choices made by the software as needed.
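Direct Drive's internals aren't public, but a common way to express a precomputed correspondence between two different face meshes is barycentric interpolation: each character vertex is tied to a weighted combination of actor vertices, and the actor's per-vertex deformation is carried across that map. The sketch below is a minimal version of that idea; the correspondence arrays are assumed to come from an offline registration step, and none of these names are Digital Domain's.

```python
import numpy as np

def transfer_deltas(actor_neutral, actor_frame, corr_idx, corr_weights):
    """Transfer a facial deformation from the actor mesh to the character.

    actor_neutral, actor_frame: (Na, 3) actor mesh at rest and in the shot
    corr_idx:     (Nc, 3) for each character vertex, the indices of the
                  three actor vertices it corresponds to (precomputed map)
    corr_weights: (Nc, 3) barycentric weights for those actor vertices
    Returns (Nc, 3) offsets to add to the character's neutral mesh.
    """
    deltas = actor_frame - actor_neutral         # actor's deformation
    picked = deltas[corr_idx]                    # (Nc, 3, 3) gathered deltas
    # Weighted sum of the three gathered deltas per character vertex.
    return np.einsum('nij,ni->nj', picked, corr_weights)
```

A real retargeting system also reshapes the motion for the character's differing anatomy (the bigger mouth and lips Port mentions), which is where the hand-tuned corrections come in; plain interpolation like this only relocates the actor's deformation.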
“Transferring the data to the character is not straightforward,” Port says. “It isn’t just matching the corners of the mouth and the corners of the eyes. There are a lot of complex behaviors, so we put a lot of time, research, and energy into making that transfer as tight as possible to convey the same emotions.”
The transferred data moved directly onto animation controls in a specialized rig so that animators could fine-tune the performance. To check that Thanos had the same facial performance as Brolin, the team “locked” the heads by removing any camera movement so the animators could see side-by-side comparisons.
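The "locked head" comparison described above amounts to removing rigid head and camera motion so only facial deformation remains. One standard way to do that is a least-squares rigid fit (the Kabsch algorithm) on stable landmarks; this sketch assumes such landmarks are available and is not the studio's specific tooling.

```python
import numpy as np

def stabilize_head(frame_pts, neutral_pts):
    """Find the rigid transform that locks a head to its neutral pose.

    frame_pts, neutral_pts: (N, 3) corresponding stable landmarks
    (e.g. points that move with the skull, not with expression).
    Returns R (3, 3) and t (3,) such that R @ p + t maps each frame
    point onto the neutral pose in the least-squares sense (Kabsch).
    """
    cf, cn = frame_pts.mean(0), neutral_pts.mean(0)
    H = (frame_pts - cf).T @ (neutral_pts - cn)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cn - R @ cf
    return R, t
```

Applying the returned transform to every frame cancels the head's rigid motion, leaving the two faces stationary for the frame-by-frame emotional comparison the animators performed.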
“On any of the subtle performances, we checked every frame to see if Thanos was making the same emotional expression as Brolin,” Port says. “If not, we’d try to determine what wasn’t right. Sometimes, the lips might be more compressed or more open. Sometimes, a tiny brow raise or little furrow was missing. The face is so complex. And their faces are different. If Brolin did something with his chin, it would look very different in Thanos, but fortunately, not a lot of stuff happens there. We concentrated on the areas that are really important for facial emotion, the eyes and the mouth, in that T-zone.”
There were no markers on the actor’s eyeballs, so animators needed to spend time making certain that Thanos’s eyes matched Brolin’s performance. And although Brolin sometimes stood on a platform so that the other actors would look at the right spot for the giant Thanos, the animators sometimes needed to correct the eye lines.
“We fed the surrounding tissue around the eyes into the system and tied it into the transfer as much as possible, but animators needed to put a ton of time and energy into the eyes,” Port says. “And, Thanos had a much bigger mouth and lips. The inside of the mouth where it’s hard to put markers is tricky. We’re working to solve those issues and we’re close to being able to get those uncapturable areas. Maybe on the next film, which would be really cool.”
In addition to Thanos, the crew of 350 at Digital Domain created a variety of environments, snow that fell down and then up, other effects, and other characters, many of which were shared with other studios -- Thanos’s henchmen Ebony Maw, Proxima Midnight, Corvus, and the Black Dwarf, for example. But, most of the 520 shots the crew worked on included Thanos, and Thanos embodied the most difficult work.
“Even though we had a ton of work on effects and environments, and each department had its own focus, whether lighting or animation, what drove everything was making sure the character’s performance came through and looked as photoreal as possible,” Port says. “We had a lot of fun watching the character grow, seeing the different layers of complexity that Brolin was able to put into his performance and evolve through the film. It was super rewarding for everyone. Thanos isn’t a one-dimensional villain. We were very excited to work on such a complex character.”
That audiences could feel even a moment of sympathy -- or empathy -- for this monstrous villain is a tribute to the talents of all the visual effects artists who helped create Thanos.