Defining Thanos: Weta Brings Motion Capture to New Heights for ‘Avengers: Infinity War’

Weta Digital and Digital Domain transform Josh Brolin’s motion capture performance into a believable and imposing CG adversary.

‘Avengers: Infinity War’ © 2018 Marvel Studios.

When filmmakers Joe and Anthony Russo (Captain America: Civil War) decided to make Thanos the driving narrative force of Avengers: Infinity War, Marvel Studios visual effects supervisor Dan DeLeeuw hired Weta Digital and Digital Domain to transform the motion capture performance of Josh Brolin into a believable and imposing CG adversary. “We developed Thanos in parallel with Digital Domain,” Weta Digital VFX supervisor Matt Aitken recounts. “We knew that if he didn’t work then the movie was going to fail. We brought all of our experience to bear. A team at Weta Digital had just finished War for the Planet of the Apes so we took all of that learning, technology and skillset that was applied to Caesar and worked it into Thanos with a couple of new approaches added into the mix.”

One of the new things put into place was the introduction of “this concept of the actor on-set,” according to Aitken. “In the past we would have our reference performance, footage, and motion capture of Josh Brolin performing Thanos and would have our digital Thanos puppet,” he explains. “We would then track the motion capture onto the Thanos puppet. Now there’s an intermediary stage where we are tracking or solving the motion capture onto a digital copy of Josh Brolin. This allows us to clearly tell if we’re getting all of the subtleties of performance onto the digital puppet because we’re comparing it to the actual Josh. Once we have a good digital copy of the performance then we can calibrate our digital Thanos. There’s a big range of different emotions that he has to go through on planet Titan so the nuances of that performance were important.”
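The two-stage idea described above can be sketched in the abstract: solve the capture onto a digital copy of the performer, validate that copy against reference, and only then retarget to the creature puppet. The function names, the simple uniform scaling, and the data layout below are purely hypothetical stand-ins for Weta Digital's actual solver, included only to illustrate the concept:

```python
def solve_to_digital_actor(mocap_points, actor_rig):
    """Stage 1 (hypothetical): fit captured markers onto a digital
    copy of the performer. Here 'solving' is just selecting the
    marker data keyed by the actor rig's joint names."""
    return {joint: mocap_points[joint] for joint in actor_rig}

def retarget(actor_performance, creature_rig_scale):
    """Stage 2 (hypothetical): map the validated actor performance
    onto the creature puppet, e.g. rescaling for larger proportions."""
    return {joint: tuple(v * creature_rig_scale for v in pose)
            for joint, pose in actor_performance.items()}

# Illustrative marker data (joint name -> offset triple)
mocap = {"jaw": (0.0, -0.2, 0.1), "brow_l": (0.0, 0.05, 0.0)}
actor_rig = ["jaw", "brow_l"]

digital_josh = solve_to_digital_actor(mocap, actor_rig)
# Validation step: compare digital_josh against the reference footage
# of the actor before committing to the creature retarget.
thanos = retarget(digital_josh, creature_rig_scale=1.4)
```

The point of the intermediary stage is that the comparison in the middle is like-for-like (digital actor vs. real actor), so performance loss can be caught before it is hidden by the creature's different anatomy.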

Weta Digital VFX supervisor Matt Aitken.

Tests were conducted and reviewed regarding incorporating facial features of Josh Brolin into Thanos. “Some changes were made, particularly around the mouth, where we brought some of the details of Josh’s face shape into Thanos. In terms of around the eyes and brows, we ended up agreeing with Marvel that Thanos needed to look like Thanos.” A high-resolution model needed to be produced that included skin pores and stubble. “We did a couple of shots where we push right in on his eyes. All of the details of Thanos are as human-like as we can make them because those are the visual touchstones that convince us that what we’re looking at is real. The broad aspects of Thanos’s appearance, such as his color, the shape of his face, chin, scale, and height, are larger than life, but the more into the detail you get the closer he is to being believable.”

Thanos’s purple skin set against the orange daylight on planet Titan presented a challenge for the Weta artists. “Legacy Effects built a great prop of a bust of Thanos that we could use as a lighting reference. When the prop was wheeled out on set to take some stills under the lighting, the purple skin maquette under orange light looked grey,” Aitken recalls. “We knew straightaway that we were going to have to oversaturate the skin color to make that purple. It was the same with the Iron Man and Spider-Man suits; in order to get their signature saturated colors, we had to supersaturate them.”
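The compensation Aitken describes can be sketched with a toy saturation boost in HSV space. A production grade would be far more sophisticated than this, and the boost factor and RGB values here are illustrative guesses, but it shows the principle of overdriving saturation so a color still reads under a strongly tinted environment:

```python
import colorsys

def supersaturate(rgb, boost=1.6):
    """Push a color's saturation up (clamped to 1.0) while keeping
    hue and value, to compensate for a desaturating light environment.
    The boost factor is illustrative, not a production value."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(1.0, s * boost), v)

# A Thanos-like purple under neutral light (illustrative, 0..1 range)
purple = (0.55, 0.30, 0.60)
boosted = supersaturate(purple)
```

The same trick applies to the Iron Man and Spider-Man suits mentioned above: the authored asset is pushed past its "true" saturation so the final lit render lands on the signature color.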

“We took all of that learning, technology and skillset that was applied to Caesar and worked it into Thanos with a couple of new approaches added into the mix.” 

Eye jittering was introduced into the facial performance of Thanos. “We did have a procedural approach that matched the natural way that our eyes tend to scan around what we’re looking at,” says Aitken. “Because everything is being ray traced we’re able to create a reflection of the environment on the eye which plays into its believability.” The jaw rig was updated. “We’re still driving that motion primarily with the motion capture of Josh’s facial performance but instead of the jaw being a simple hinge there’s a more complex articulation that’s going on. The jaw is pivoting through multiple pivot points as it opens.”
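A procedural "scanning" signal of the kind Aitken mentions can be sketched as a hold-and-jump generator: the gaze fixates on a small random offset for a handful of frames, then saccades to a new fixation. Every parameter value below is an illustrative guess, not Weta Digital's actual setup:

```python
import random

def eye_jitter(num_frames, hold_range=(8, 20), max_offset_deg=1.5, seed=7):
    """Toy procedural eye-scanning curve: hold a small random angular
    offset for a random number of frames (a fixation), then jump to a
    new offset (a saccade). Returns one (x, y) offset per frame."""
    rng = random.Random(seed)
    frames, frame = [], 0
    while frame < num_frames:
        hold = rng.randint(*hold_range)  # fixation duration in frames
        offset = (rng.uniform(-max_offset_deg, max_offset_deg),
                  rng.uniform(-max_offset_deg, max_offset_deg))
        frames.extend([offset] * hold)   # hold the fixation
        frame += hold
    return frames[:num_frames]
```

Layered on top of the captured eye animation, a signal like this keeps the eyes from looking locked and dead between deliberate gaze changes.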

Thanos was as complicated a character as Weta Digital has ever produced. “We tracked 150 points on the face of Josh with the motion capture and two face cameras. We extended his facial rig to include his throat and neck because Thanos has an incredibly broad and strong warrior neck,” explains Aitken. “In the past we would use the neck region as a transition zone from the facial animation rig into the body muscle-based rig. We had animation controls for his Adam’s apple movement and for the tendons on his neck; that wasn’t something we could naturally derive from the motion capture.” The armor worn by Thanos needed to be modified. “It was initially sketched out that his tunic had a complete metal collar which went from one end of the shoulder blade through to the other. We knew that if he tried to bring his hands together in front of him that the single piece of broad metal across his chest would restrict the movement too much. We went back to Marvel and suggested introducing subtle breaks in that single piece of metal so that he could articulate below the throat.”

A hockey goalie glove was worn by Josh Brolin to represent the Infinity Gauntlet. “Josh was motion captured with that,” Aitken remarks. “It was great because his movements were restricted in the same way as if he was wearing the gauntlet.” The Infinity Stones were inserted during postproduction. “Josh mimed a bit of that. We added the Time Stone to the gauntlet. We were able to work in some specific Eye of Agamotto rune effects to make that moment pop. Then there’s the expression on Thanos’s face at that moment, when he’s getting a huge surge of power as the Infinity Gauntlet is taken to the next level.” The distinct colors had to be maintained for the six Infinity Stones. “We had to override the natural colors of these objects under the orange light of Titan to make sure that they read in their saturated, clearly defined hues.”

Manuka has been a game-changer for Weta Digital. “Moving from a scanline-based renderer to a path tracing renderer like Manuka means that all of those things that we had to create by hand and layer into the scene are fundamentally there because we’re rendering with a global illumination model,” says Aitken. “However, everything needs to be directable. If we get a note from the directors that says, ‘We want the reflection in Thanos’s eyes to be brighter here.’ We have to be able to control that because they’re responding to how the audience is going to interact with Thanos.” Keyframe animation is critical in getting the desired performance. “We had to interpret what Josh was doing on the canvas that is Thanos. There’s a process of making sure that translation has worked by an animator sitting down, reviewing what we get from the captured motion and when necessary tweaking it with keyframe animation.”

All of the scenes on Titan are CG. “We’re replacing the environment and plates with our digital version because we can light them more consistently,” Aitken explains. “In the case of Tony Stark [Robert Downey Jr.] and Spider-Man [Tom Holland], everything up to their necks is CG. The bodies of Mantis [Pom Klementieff] and Drax [Dave Bautista] were kept, and parts of Nebula [Karen Gillan] were replaced when she’s weaponized. Thanos feels like he’s in the environment because his armor is reflecting it. But also any highlights bouncing off of the armor are playing out into the environment and onto the suits of the characters. When Spider-Man stands next to Iron Man and the blue RT lights on the Iron Man suit are glowing, we’re seeing those reflecting on Spider-Man’s suit. We’re getting all of that from the path tracer.”

Reference photographs were taken of the Atacama Desert in Chile to serve as the basis for Titan. “The ground there is reddish in color so that played well for us,” Aitken notes. “There are these giant windmill structures several kilometers across that used to be the power system for Titan which have partially crashed to the ground. We put a lot of work into detailing those so they would hold up to any level of scrutiny. Then the other thing that we looked at was a lot of present-day reference footage of Ancient Greek ruins and ancient Mayan civilizations; we wanted to have a sense of an archeological dig site where you can see the evidence of the former glory of the city which is partially being reabsorbed by nature. There are broken stairs with glyph details on them.”

Other elements that needed to be produced for the sequence on Titan were warped gravity and the breaking of a moon. “Thanos goes over the top and throws a moon at Iron Man,” Aitken chuckles. “That was an effect that was in progress at Weta Digital for eight or nine months, with a series of shots that show a large-scale destruction event. The whole atmosphere of the planet changes after that. It goes from being a clear sky to being a dustier, overcast, darker environment because there’s so much material in the air. Those were some huge simulations that we ran, and we have our Synapse simulation engine set up so we can run large-scale volumetric simulations in parallel on several machines, and that enables us to get them done.”
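The parallel strategy Aitken describes can be sketched at toy scale: split the simulation volume into slabs and advance each one concurrently. Synapse itself is proprietary; the decay "simulation," the slab layout, and the thread pool standing in for a farm of machines are all illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_slab(slab):
    """Hypothetical per-worker unit: advance one spatial slab of a
    volumetric density field by a trivial decay step."""
    return [d * 0.9 for d in slab]

def simulate_volume(volume, workers=4):
    """Split the volume into slabs and advance them in parallel,
    loosely mirroring several machines each owning one region."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_slab, volume))

# Three slabs of a tiny density volume (illustrative values)
volume = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
result = simulate_volume(volume)
```

A real volumetric solver would also exchange boundary data between neighboring slabs each step; this sketch shows only the decomposition that makes the parallelism possible.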

As with Black Panther, Iron Man has a nanotechnology suit. “We weren’t able to review the Black Panther material at all because these films came out so close together that stuff was still being developed,” Aitken reveals. “It’s about getting into the detail of it. We’re running a multilayer particle simulation where we’re building the underlying framework in waves of particles. Then what we’re doing is taking the front leading edge of that particle simulation and converting it into a fluid simulation, so it’s like a blobby fluid that is spreading across the arm where the suit is going to be formed. We wanted it to be an articulated, controlled set of nanobots. Each one of them knew exactly what they had to do. It was that fluid crystalizing into a solid mechanical structure. It all had to happen fast but it had to read.” Wear and tear were incorporated into the Iron Man suit. “Even when these nanobots would freshly build weapons on the Iron Man suit we would prebuild some scuffing and scratching into those surfaces because without that level of detail it was never going to feel real.”
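The layered build Aitken describes — a particle wave front whose leading edge behaves as fluid while the material behind it solidifies — can be caricatured in one dimension. Everything here (the cell states, the wave speed, the freeze lag) is an illustrative stand-in for the actual multilayer simulation:

```python
def grow_suit(arm_length, frames, wave_speed=2, freeze_lag=3):
    """Toy 1-D version of the nanosuit build: a particle front
    advances along the arm each frame; cells more than freeze_lag
    behind the front crystallize from 'fluid' into 'solid' armor.
    All parameters are illustrative, not production values."""
    states = ["bare"] * arm_length
    for frame in range(1, frames + 1):
        front = min(arm_length, frame * wave_speed)  # leading edge
        for i in range(front):
            # the leading band stays blobby fluid; trailing cells freeze
            states[i] = "fluid" if i >= front - freeze_lag else "solid"
    return states
```

The design point is the two-phase handoff: the fluid band gives the spreading motion its organic look, while the freeze behind it reads as deliberate, articulated machinery.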

The full facial rig from the Peter Parker digital double was covered with a cloth simulation to accurately animate his dialogue through the mask. A new addition to the Spider-Man suit was a series of mechanical spider legs. “They went through several iterations just to make sure that we got the full degree of articulation that we needed because he uses them to move and to protect himself,” says Aitken. “That was fun to do and was a new aspect to the Spider-Man character.”

Successfully created CG characters still need the skill of an artist. “It’s that combination of motion capture and keyframe animation that is so important. We need the direct performance of Josh, which is motion capture-based, to base Thanos on, but without the work of the keyframe animators you wouldn’t be able to create Thanos for this film. We were certainly aware that the success of the film rode on whether or not the audience was able to respond to Thanos, and it appears that they have. We’re thrilled with the way the film is connecting with the audience as well as their reaction to Thanos.”

Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.