
Wētā FX Brings a ‘Universe’ of Visuals to ‘Guardians of the Galaxy Vol. 3’

From giant red spaceships to tentacled, toothy Abilisks, the studio’s VFX and animation teams, led by VFX supervisor Guy Williams and animation supervisor Michael Cozens, pull out all the stops for the final chapter of James Gunn’s MCU trilogy.

Currently the second-highest grossing movie of 2023, Guardians of the Galaxy Vol. 3, which completes the trilogy begun in 2014 with the first Guardians film, is yet another testament to the imagination and skill of writer/director James Gunn. It is also, like the sequel that preceded it, a brilliant showcase for the animation and VFX work of New Zealand-based Wētā FX, who, since their founding by Peter Jackson and his associates in 1993, have been among the leading artists and innovators in the digital entertainment domain.

For Guardians of the Galaxy Vol. 3, Wētā worked on 671 shots, and was the sole vendor for the creation of the colossal, ruby-covered spaceship, the Arête, which first appears as a 300-meter-high skyscraper on Counter-Earth before being revealed as a two-mile-wide structure. Wētā also built an entire city surrounding the Arête that was loosely based on Seattle, and brilliantly evoked the destruction unleashed when the Arête lifts off from Counter-Earth. They also contributed to character work on Rocket and Groot, wrangled the tentacled, mega-multi-toothed Abilisk in the horror-film-like pit sequence, and had primary responsibility for the outer space shots in the big final battle sequence.


We spoke with VFX Supervisor Guy Williams and Animation Supervisor Michael Cozens about Wētā’s multifaceted work on the film, which, unsurprisingly, turned out to have its fair share of challenges.

AWN: How early did you get started on the film and how long was Wētā involved?

Guy Williams: I don't know exactly how long the studio was on the show, but I imagine they started right at the end of Guardians 2 in 2022. [Production Visual Effects Supervisor] Stéphane Ceretti and [Production Visual Effects Producer] Susan Pickett reached out and started talking about what possible work might be coming our way. One of the many awesome things Stéph did for us was he engaged us as early as possible. Before they even shot the show, he wanted to turn over assets so that everything could be baked and iterated on as many times as possible to have the best chance of a good result out the back end.

Michael Cozens: It took between a year and a year and a half. Most of 2022 and into 2023.

AWN: What assets were you actually given? You always generate a lot of your own concepts and previs and stuff just because that's how you determine what’s actually needed to get given sequences done. What did you guys get to work from?

GW: They shared everything they had, so we got a full art package, including whatever previs had been done at that point. The main assets that they wanted to turn over early were for the Arête, because we all knew that that wasn’t a model that you’d be done with over the course of a couple of months. Two or three other models had conflicting details, so we had to rectify all that over time.

Even though Groot was an established character, we received him pretty early on from Framestore. We had a bunch of variations that we had to create, because Groot's a very dynamic character in that he’s often doing things he hasn't done before. Because of that, we had to build him in such a way that he could change over the course of a shot. It wasn’t just a matter of modeling something; it's modeling it so that effects can work with it.

They gave us all the digi-doubles because we knew that we were going to need good high-resolution assets for those. James Gunn has a great production designer, Beth Mickle, he typically works with. She always builds up a fantastic art department. Plus, you have Marvel’s art department. So, we were not ever suffering from lack of good artwork to start with.

AWN: How much time and hassle does it save you when you’re handed such an extensive amount of good artwork, so that you don't have to figure it all out yourself?

GW: You're asking an interesting question, because it's not so much about how much work the artwork saves you; it's how willing the creative team is, whether it's the director or the producers, to stick to the art that you're given. What's painful as hell is to get an amazing art package and start working on it, and then have people come in three or four months later and say, "Love what you're doing, but we never really liked those pictures, so can we make it blue and round?" That hurts.

James definitely isn't that guy. He knows what he wants and he's willing to commit to it. He's talented enough that he doesn't need to second guess himself. What he and his art team come up with is compelling as hell, and you don't need to throw it all away and start over because it's going to work.

AWN: With Groot, and some of the other things you were working on, you collaborated with Framestore and other visual effects houses. Is that easier these days? How much can you really share, and how much do you have to rework so it fits properly into your own pipeline?

GW: The sharing processes are getting easier and better, because there aren't many shows that go to a single vendor anymore, especially large shows. Sharing is no longer the exception; it's definitely more the norm now. It's in everybody's interest to have better processes and methodologies and tools for sharing. We used a lot of USD on this show. What's hard to share is when the tools become more proprietary.

Even though you might create a model proprietarily, pretty much everybody uses Substance. Even though you might groom it proprietarily, pretty much everybody uses some permutation of an Ari curve, or something akin to that. So, it's real easy to bake those down into a format that anybody else can read, as long as everybody agrees to use the same tools to read and write. We can share a puppet easily enough when you dictate the positions of the joints, the links of the joints, and even some of the constraints on the joints. But you can't share a rig because everybody solves the problem of rigging with their own set of tools. I mean, there are some good off-the-shelf tools, but we definitely use our own tissue and muscle solvers here. If somebody were to give us their rig, we wouldn't be able to use it right off the bat.
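The interview doesn't go into the exchange format, but the kind of puppet data Williams describes as shareable – joint names, hierarchy, and rest positions, as opposed to a full proprietary rig – maps naturally onto USD's UsdSkel schema. A minimal sketch, with the character path, joint names, and identity transforms standing in as placeholders:

```python
# Minimal sketch (not Wētā's actual pipeline): baking shareable puppet data -
# joint hierarchy and rest transforms - into USD via UsdSkel. The character
# path, joint names, and identity transforms are placeholders.
from pxr import Usd, UsdSkel, Gf, Vt

stage = Usd.Stage.CreateNew("shared_puppet.usda")
skel = UsdSkel.Skeleton.Define(stage, "/Character/Skel")

# The hierarchy is encoded in the joint paths themselves.
joints = Vt.TokenArray(["hips", "hips/spine", "hips/spine/chest"])
skel.CreateJointsAttr().Set(joints)

# Local-space rest transform per joint, plus world-space bind transforms;
# identity matrices stand in for real values from the source rig.
identity = Vt.Matrix4dArray([Gf.Matrix4d(1.0)] * len(joints))
skel.CreateRestTransformsAttr().Set(identity)
skel.CreateBindTransformsAttr().Set(identity)

stage.GetRootLayer().Save()
```

The receiving studio would rebuild its own deformation rig on top of a baked skeleton like this, which reflects the limitation Williams describes: the skeleton travels, the rig doesn't.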

You can share effects work a little bit, but it's easier to pass elements than it is Houdini scripts just because, once again, you start to get into very proprietary stuff. The other thing that’s hard to share is very complex environments where you start getting into instancing. Whereas you can share some instancing setups, you have to sort of trick the system, like you pass locators and say that you're going to instance here. You can share instancing, but you’re sharing the results and not the process.

For instance, on this show, we did not share the Arête, just because it was such a complex model. But we did share the build that we did for the command deck, because there's not much unique geometry there. It's a bunch of red cubes and very interesting forms, but, at the end of the day, it’s a single-pass instance. We give a bunch of locators where all the cubes are, and all you have to do is instance onto those locators, and it's pretty straightforward. But even then, it didn't work as expected right away, because USD is not a mature, locked-down standard just yet.
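Williams's locator description lines up closely with USD's point-instancer concept, which is one way a setup like the command deck could be passed between vendors. A hedged sketch – the paths, cube prototype, and grid of positions are all invented here, not the actual command-deck data:

```python
# Hypothetical sketch of locator-style instancing in USD: bake out instance
# positions plus a prototype reference, and the receiver simply instances onto
# them. Paths, the cube prototype, and the grid of positions are invented.
from pxr import Usd, UsdGeom, Gf, Vt

stage = Usd.Stage.CreateNew("command_deck_instances.usda")

# Stand-in prototype; in production this would reference the shared asset.
proto = UsdGeom.Cube.Define(stage, "/Prototypes/RedCube")

inst = UsdGeom.PointInstancer.Define(stage, "/CommandDeck/CubeInstancer")
inst.CreatePrototypesRel().AddTarget(proto.GetPath())

# One "locator" per instance: a position and an index into the prototype list.
positions = Vt.Vec3fArray([Gf.Vec3f(x * 2.0, 0.0, z * 2.0)
                           for x in range(10) for z in range(10)])
inst.CreatePositionsAttr().Set(positions)
inst.CreateProtoIndicesAttr().Set(Vt.IntArray([0] * len(positions)))

stage.GetRootLayer().Save()
```

As Williams says, what travels here is the result – positions and prototypes – not the procedural setup that generated them.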


AWN: Let's talk about the Arête. What was the process like on that, and what were the highlights and/or specific challenges in that incredibly huge build?

GW: In essence, we get in this artwork, and it's absolutely stunning. What was terrifying was that the majority of the artwork focused on the destruction that the Arête causes when it lifts up out of the harbor. It wasn't until we saw the previs and saw how little they splayed the destruction that we started to breathe again. The thing you have to understand is the Arête is a spaceship that's three kilometers across from one side to the other when it's sitting in the harbor in Counter-Seattle, Counter-Earth's Seattle analog. Even though it looks like it's just a red glass pyramid that sticks out of the harbor about 100-200 meters, the truth of the matter is that the wings of it reach out to the sides of the harbor and actually underneath the city. It's so large.

We knew that we were going to have to do this giant Arête. We knew that we were going to have to do a lot of destruction in space with it. We knew we were going to have to do a lot of destruction on the ground with it, as it rips its way out of the ground and goes up into space. We knew we were going to have to deal with just modeling it in the first place. Some of the models had it being black with red glass. Others had it being silver with red glass. We sort of landed in the middle and had black and silver. This complex, sci-fi, futuristic thing, clad in glass, with some machinery that looks like it can almost be buildings.

We realized right off the bat that we had to come up with a procedural way to build it, because, once again, three kilometers across. There's a lot of shots where we're within 10 meters of the thing, so the detail has to hold up. It wasn't like we had just a couple of areas where we got close. There's an entire scene where we fall out a window and fall down the entire length of the thing – it has to have detail all the way down that can hold up to being within 20 meters of the surface.

[Wētā Visual Effects Supervisor] Jason Galeon and his team launched into it by solving it as a sort of CityBuilder problem. We had the models team build us a low-res representation that sort of defined what we wanted it to look like. When I say low-res, it was still thousands and thousands of polygons. They also built a low-res version of the glass that clad it, so we had an understanding of positive and negative silhouettes. And then, we put all that into – I would say CityBuilder, but it was all very heavily rewritten to suit this because it's not a flat surface that you're putting buildings onto. It's actually a very complex undulating surface with positive and negative recesses.

The first thing you do is you clad the entire thing in geometry, giving it a lot of detail, and you clad that in more geometry so that it doesn't look like just instances sitting next to each other. Then, you lay the glass on top, but the glass itself has to run through a sort of procedural program that breaks it up into hundreds, if not thousands, of sheets, so that it doesn't look like the thing is clad in these thousand-meter sheets of glass. And those smaller sheets have to have bevels. They have to have detail on the back so that the glass doesn't look too simple. We actually had a lot of extrusions off the back of the glass, so that you get beautiful highlights in the front, but you'd also get complex refraction going through.
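Wētā's panelization ran inside its heavily rewritten CityBuilder tooling, which isn't public, but the underlying idea – procedurally splitting enormous cladding into many smaller, bevel-ready sheets – can be shown with a toy example (dimensions, panel sizes, and the flat layout are all invented; the real hull is a curved, undulating surface):

```python
# Toy illustration of procedural glass panelization: split one huge cladding
# sheet into many smaller panels, each shrunk by a margin so a bevel can be
# added later. All dimensions are invented, and the real system worked on a
# curved, undulating hull rather than a flat rectangle.
from dataclasses import dataclass

@dataclass
class Panel:
    x: float       # lower-left corner (meters)
    y: float
    width: float
    height: float

def panelize(sheet_w, sheet_h, panel_w, panel_h, bevel_margin=0.05):
    """Break a sheet_w x sheet_h facade into panels, shrunk by a bevel margin."""
    panels = []
    y = 0.0
    while y < sheet_h:
        h = min(panel_h, sheet_h - y)
        x = 0.0
        while x < sheet_w:
            w = min(panel_w, sheet_w - x)
            panels.append(Panel(x + bevel_margin, y + bevel_margin,
                                w - 2 * bevel_margin, h - 2 * bevel_margin))
            x += panel_w
        y += panel_h
    return panels

# Example: a 1000 m x 300 m stretch of facade split into ~3 m panels.
print(len(panelize(1000.0, 300.0, 3.0, 3.0)))   # tens of thousands of sheets
```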

You still had to make sure that everything was under the glass, so what's behind the glass still has to have details. It was instanced from the start, but you also have to make sure that none of that accidentally pokes through the glass, because that'll screw up the ray tracing. And then, you have to do an instancing layer on top of the glass to make sure that the glass doesn't just look too low-res, because even though it's supposed to be these monolithic sheets of glass, you have no sense of scale until you start putting little boxes of electronic components on the corners, or antennas sticking off, or lights or whatever. So there's layers upon layers of instancing to make sure that this all worked.

We knew from day one that this is a very effects-heavy show. We knew that this awesome Arête has to interact with the ground as it rips it up. We knew that it has to blow into pieces. We knew that we have to see details on the surface being blown apart. Obviously, that goes against instancing. So we worked with FX right from the very beginning to make sure that, as we built this thing, they would be able to take parts of it and locally de-instance it so that they could do destruction. We also had to turn off everything that they de-instanced so that we didn't have duplicates – you don't see a glass panel blow up and reveal an intact glass panel. All that had to be properly pipelined.
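The handoff Williams describes – FX takes ownership of some panels, and the original instances are switched off so intact copies don't show through the destruction – can be sketched in USD terms, where a point instancer lets you deactivate individual instance ids (the file, paths, panel count, and ids below are hypothetical):

```python
# Hypothetical sketch of the de-instancing handoff: build a glass-panel
# instancer, then deactivate the ids that FX has converted into unique,
# destructible geometry so the intact instances don't render underneath.
# The paths, panel count, and ids are invented for illustration.
from pxr import Usd, UsdGeom, Gf, Vt

stage = Usd.Stage.CreateNew("arete_cladding_handoff.usda")

proto = UsdGeom.Cube.Define(stage, "/Prototypes/GlassPanel")
instancer = UsdGeom.PointInstancer.Define(stage, "/Arete/GlassInstancer")
instancer.CreatePrototypesRel().AddTarget(proto.GetPath())
instancer.CreatePositionsAttr().Set(
    Vt.Vec3fArray([Gf.Vec3f(float(i) * 3.0, 0.0, 0.0) for i in range(2000)]))
instancer.CreateProtoIndicesAttr().Set(Vt.IntArray([0] * 2000))

# Panels handed over to FX for destruction: switch off the instanced copies.
instancer.DeactivateIds(Vt.Int64Array([1021, 1022, 1057]))

stage.GetRootLayer().Save()
```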

We decided early on that this thing was a little bit more magical than something that we would build if we had that kind of money on Earth. When we clad it in red glass, it wasn't actually glass. It was more like a ruby with a slight metallic coat to it. That allowed us to get this really saturated look, but then we discovered that the IOR (index of refraction) was a little bit wrong. We’d start to get into total internal reflections, and we'd get all these black patches everywhere. Since we had all this complex geometry, and the glass sometimes was two or three layers thick, with these extrusions off the back of it, it would start to just absorb light, and the rays couldn't find their way back out, no matter how many rays you put in. It's a natural phenomenon. You see it in diamonds, you see it in anything that has a high IOR. But it didn't look good, so we had to find ways to cheat that. We figured out that we could affect the roughness of some of the side pieces, and that would screw up the rays enough that we'd actually always get some amount of light back into those black areas.
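The black patches follow directly from Snell's law: the higher the IOR, the smaller the critical angle beyond which a ray inside the material is totally internally reflected, so more ray paths bounce around until they're absorbed. A quick back-of-the-envelope check (the shader values Wētā actually used aren't stated; 1.5 and 1.76 are textbook IORs for ordinary glass and ruby):

```python
# Back-of-the-envelope check on total internal reflection. The actual shader
# IOR used on the Arête isn't stated in the interview; 1.5 and 1.76 are the
# textbook values for ordinary glass and ruby (corundum).
import math

def critical_angle_deg(n_inside, n_outside=1.0):
    """Angle from the surface normal beyond which a ray inside the denser
    medium is totally internally reflected at the boundary."""
    return math.degrees(math.asin(n_outside / n_inside))

print(f"glass (n=1.50): {critical_angle_deg(1.50):.1f} deg")  # ~41.8 deg
print(f"ruby  (n=1.76): {critical_angle_deg(1.76):.1f} deg")  # ~34.6 deg
```

The narrower escape cone of the ruby-like material is why so many rays got trapped, and why roughening some of the side faces – which scatters rays back inside the escape cone – brought light back into the black areas.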

There were all sorts of things that we had to work our way through to make sure that, when the Arête was finally going into shots, we were happy with the results, and we weren’t constantly trying to nurse this single asset through.

AWN: Mike, how was the character animation handled, especially in the final battle in the third act? Were there ways in which this project was different from previous films?

MC: The third-act work involved character performance on an exploding ship. We have to juggle all the moving parts in work like that – not only keeping the characters on model and giving James the right performance, but all of the other elements that need to be choreographed in and around that work. Because of the complexity of the shots that we were doing, we raced ahead of some of the asset work. Typically, we build assets to final and rig them and then start animating. But, in this case, we quickly assembled very work-in-progress assets and even did modeling work inside animation, so we could start blocking and choreographing, and addressing questions about the composition and intent of the shots.

All the states of Groot, for example, with his big tentacle arms. He's got all these different states, and we needed to decide with James what those things looked like and how they worked before we could finalize the modeling and rigging. We worked in a very rough way, then finalized the assets, and then swung back through and finalized the animation.

One of the big chunks of work in there was the hallway fight, which was 17 shots strung together into a two-minute single shot. That was something for which we didn't have previs, so we needed to block it very rapidly in order to get James' input on the timing. The music for that was “No Sleep Till Brooklyn” by the Beastie Boys, so it needed to have action beats and choreography in the rhythm of the music. On set, Stéph and Wayne did a really beautiful job of getting the cameras in the right spot; but we had a lot of really complex stitches, like hero characters – Quill or Mantis – right in camera in the middle of a stunt where we needed to stitch from one shot to another. So, a lot of that needed to be rebuilt digitally, moving seamlessly from plate to CG and back into plate, with all the supporting action around that.

GW: I just want to add one thing. As Mike said, it's not just character animation. We rely on animation here for so much. I mean, there's not a single shot on the show that they didn't touch. They dictate the timing of everything, including the effects work. For dropping an explosion, it might seem like you just use noise to do that. But we want to make sure that, over the course of the cut, there's a sort of lyrical storytelling to everything. Mike is the gatekeeper, the shepherd, who makes sure all that visual storytelling is conveyed in an evocative way.

AWN: Mike, you talked about the various states of Groot, who’s very different in so many scenes. What do you use for reference with regard to the animation?

MC: In this iteration of Groot, James wanted a kind of happy-go-lucky character, but one that you don't want to mess with. So, we took that brief and ran with it. He has all these different things he can do with his body, so you've got a base performance, and we lean on our performance-capture stage to sketch out ideas that we can pitch to James for Groot's character performance. But then there’s another set of challenges – how we're going to grow tentacles out of him, or turn his arms into tentacles or claws or shields, or pop weapons out of them.

In something like the gunfight, where he has the tentacle arms – we called him “Octo-Groot” – we built the shot by matchmoving the live-action work and the camera, and took all that down to the stage so we could have a 3D representation to help the actor get the choreography right. There's so much movement in those shots that needs to be fluid and work around Quill or the other characters. So, we build it and visualize it on the stage to get different ideas for what the performance is, and then bring it back into animation to refine and pitch to James.

AWN: You worked on the sequence in which three of the Guardians are trapped in a pit with the Abilisk. How did you realize that?

MC: James wanted a horror-movie feeling for the reveal of the Abilisk. We have an internal real-time renderer that works inside Maya, so we can have real-time lighting in animation to visualize particular things – a reveal in this case. Nebula has a flashlight and they're wandering around in the pit, and she's lighting up different things; we can see that as we're animating, and time and stage things according to how the lighting's working. The great thing about the Gazebo renderer is everything is piped from animation into lighting, or vice versa. We can push lighting in either direction, and so we get the bonus of seeing the lighting work as we're blocking out and composing shots.

AWN: Was there any new tech, or any new pipeline innovations, or a new way of using anything on this show?

GW: That's a tricky one. We didn't write a specific tool just for the show, but everything was kind of a unique use case. I mean, by its nature, effects work is almost invented every time you use it. This show was incredibly complex in terms of effects. As I discussed, we had to write a new CityBuilder-style system to create the Arête. We know how to make a digi-double, but we had to make a whole bunch of them. For example, there's a guard who we see once or twice in the whole movie, but because he's standing right in front of the camera during a stitch, he had to be built as a super-high-resolution asset. There’s just an insane amount of work.

MC: We used the Koru rig in a new way. That’s the puppeting system that came out on the second Avatar film. It has continued to develop, and this film is the first time that we did the tentacle rigs using that system – which, again, builds a bullet-fast animation puppet and a very efficient way of piping that stuff through the creatures pipeline. It's a new step forward in creature rigging. Tentacles are always tricky because they need to do all sorts of things, and having rigging that can do all those different things is always complicated. Often, studios use multiple rigs for that, but the Koru rigging system allows us to do a variety of different things inside the one rig. That was very helpful.
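Koru itself is proprietary, so the sketch below is only a generic illustration of the idea Cozens describes – one parameterized tentacle rig covering several behaviors instead of a separate rig per behavior. All names, parameters, and the blending scheme are invented:

```python
# Generic illustration (not Wētā's Koru system): one parameterized tentacle
# evaluator that covers several behaviors - a uniform curl and a travelling
# wave - by blending per-joint bend angles. Joint count, parameters, and the
# blending scheme are invented for the example.
import math

def tentacle_angles(num_joints, curl=0.0, wave_amp=0.0, wave_freq=2.0, time=0.0):
    """Return a per-joint bend angle (radians) along the tentacle."""
    angles = []
    for i in range(num_joints):
        t = i / max(num_joints - 1, 1)           # 0 at the root, 1 at the tip
        curl_term = curl * t                      # curl tightens toward the tip
        wave_term = wave_amp * math.sin(2 * math.pi * (wave_freq * t - time))
        angles.append(curl_term + wave_term)
    return angles

# Same rig, two "modes": a tight curl for grabbing, a loose wave for swimming.
grab = tentacle_angles(12, curl=0.6)
swim = tentacle_angles(12, wave_amp=0.25, time=0.3)
```

In a production rig, each behavior would drive real joint transforms and deformers; the point is simply that one evaluator with blendable parameters can stand in for several special-purpose rigs.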

AWN: Anything else you want to share?

GW: I just wanted to say that this show was not easy by any stretch of the imagination. And this was all done at the same time that we were working on Avatar. So, kudos to this fantastic company that we're lucky enough to work for, which has the ability to do a show like Avatar, and the third-act battle for Guardians of the Galaxy, and a couple of other shows all at the same time, without in any way sacrificing quality.

Jon Hofferman is a freelance writer and editor based in Los Angeles. He is also the creator of the Classical Composers Poster, an educational and decorative music timeline chart that makes a wonderful gift.