Framestore Untangles a Creative Web on ‘Spider-Man: No Way Home’ VFX

With a team led by visual effects supervisor Adrien Saint Girons, the studio delivers just under 250 shots, featuring the Spell Gone Wrong and Doctor Strange/Spider-Man chase sequences, on Marvel’s hit superhero action-adventure film, just nominated for a Best Visual Effects Oscar.

Despite the need for secrecy during the production of Marvel’s smash action-adventure hit, Spider-Man: No Way Home (just nominated for a Best Visual Effects Oscar), Framestore was not majorly impacted, as the studio’s work involved two self-contained scenes: the Spell Gone Wrong and the Doctor Strange/Spider-Man chase. “The only thing that was important for us to share with other vendors was the ancient box that Spider-Man takes from Doctor Strange, which starts the whole chase sequence,” states Framestore VFX supervisor Adrien Saint Girons, who was responsible for delivering just under 250 visual effects shots. “It was a full pandemic show for us. As far as efficiency and work went, we had a good flow of communication. Dailies were happening in Zoom meetings and we have our internal mechanisms for sharing content.”

The project called for Framestore to develop concept art as well as produce postvis, especially for the Spell Gone Wrong sequence. “The look of the spell itself was completely open,” remarks Saint Girons. “When the whole room explodes, that was something that happened in the process of doing the postvis. For the chase, there was early previs that described the general idea, but it evolved a lot. The first version I saw only went through New York City. At some point the Grand Canyon came in. We were there for the whole journey trying to figure it out along with [Marvel Studios VFX supervisor] Kelly Port and [director] Jon Watts.” The sequence required two separate types of work. “There were new things that we had to figure out, and there was stuff that was done in the first Doctor Strange or already defined in previous Marvel movies that we would have to do the same but better.”

Framestore worked on the box more than any other asset. “We had to make it feel like an ancient relic which is a puzzle that needs to be solved,” states Saint Girons. “Then we had to figure out the materials and inscriptions.” It was important to have a proper ratio of jade, metal and wood. “The first iteration had the spell being separate, with the artefact being the thing that Doctor Strange solved to send people back to their dimensions. Originally, the spell was runes scattered about the room along with a neural network being created as the spell is being written. Very quickly it got to be visually confusing. We came up with the idea of the simple runes that get written, and every time Doctor Strange throws them away, they start shaking and become less stable throughout the process of the spell gone wrong. There were some rune inscriptions on a basin that was on-set and we used those as our basis. But then our compositing lead made the final alphabet.”

In dramatic Doctor Strange fashion, the room explodes. “When the spell goes wrong and the room explodes, that is more of a traditional explosion,” Saint Girons explains. “But then we had to make sure that there was a nice vortex feel and have all of the objects rotating within this nebula space. We had to make sure that there was a lot of depth. Some objects are close and others far away to create the impression of infinite space.” The plates were shot within the basement with strobe lights. “For the explosion, it did not make sense to have as much flashing, so we had to rebalance it all,” he adds, noting that the pull-back shot of the entire nebula came about accidentally. “The camera was supposed to be static but we did a wedge move backwards, which they loved. We still kept the plate for the characters in the centre, but because we had them tracked it all worked out.”

The Mirror Dimension consists of three parts. “There is the New York section where it is deforming and kaleidoscoping,” Saint Girons shares. “Then there is New York and the Grand Canyon mixed together. And finally, the crazy kaleidoscope stuff at the end. First we made a good-looking New York without any bending. The environment team created a procedural method to build New York off of OSM data to get the footprint and height of the buildings. We had a library of different architectural styles to fit the correct streets. On top of that were procedural textures and shaders. Every building around Central Park had to be correct. Sections were sent to the rigging team so that they could be warped, twisted, and made into ‘u’ shapes. The animation team animated sections per shot so it moved the way we wanted it to move. The effects team would take all of that data and apply their kaleidoscope effects on top of it. We had a whole pipeline just to get from the beginning to the end, which involved a lot of departments and new automated processes for passing things from one department to the other.”
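The OSM data Saint Girons mentions stores building footprints as tagged polygons. As a rough illustration of the first step of such a procedural pipeline — pulling footprints and heights out of raw OpenStreetMap XML — here is a minimal sketch; the inline XML snippet, default height, and function names are illustrative assumptions, not Framestore's actual tools.

```python
# Sketch: extract building footprints and heights from raw OSM XML,
# the kind of data a procedural city generator could start from.
import xml.etree.ElementTree as ET

# Tiny hypothetical OSM fragment: three nodes forming one tagged building.
OSM_SNIPPET = """
<osm>
  <node id="1" lat="40.7680" lon="-73.9819"/>
  <node id="2" lat="40.7681" lon="-73.9817"/>
  <node id="3" lat="40.7679" lon="-73.9816"/>
  <way id="100">
    <nd ref="1"/><nd ref="2"/><nd ref="3"/><nd ref="1"/>
    <tag k="building" v="yes"/>
    <tag k="height" v="45"/>
  </way>
</osm>
"""

def extract_buildings(osm_xml):
    """Return a list of (footprint, height_m) for each tagged building."""
    root = ET.fromstring(osm_xml)
    # Index node coordinates by id so that way references can be resolved.
    nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
             for n in root.findall("node")}
    buildings = []
    for way in root.findall("way"):
        tags = {t.get("k"): t.get("v") for t in way.findall("tag")}
        if "building" not in tags:
            continue  # skip roads, parks, etc.
        footprint = [nodes[nd.get("ref")] for nd in way.findall("nd")]
        # Assume a default height when the OSM way carries no height tag.
        height = float(tags.get("height", 10.0))
        buildings.append((footprint, height))
    return buildings

for footprint, height in extract_buildings(OSM_SNIPPET):
    print(len(footprint), "vertices, height", height, "m")
```

In a full pipeline, each footprint-plus-height pair would then be matched against a library of architectural styles and extruded into a textured building, as the article describes.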

A hybrid of New York and the Grand Canyon was created. “We wanted to use the same New York assets as much as possible so there was parity from the first section to the next,” Saint Girons remarks. “Then we built a photoreal Grand Canyon on top of that by using Google data as a base, procedurally upping the resolution and scattering as much realism into it.” The effects department made a kaleidoscope version of the assets for when two separate portals collide together. “Our layout team would grab these islands of New York and Grand Canyon to create these enormous environments that go on the top and bottom. On top of that there was a spiral rig so we could have these rotating elements, and artists were able to draw whatever shape they wanted using a curve tool. There were a lot of specific bespoke tools to generate that. The important thing was that we kept all of the things that we built from the beginning all the way to the end. They were the same assets but managed differently.” The dynamic shots were entirely CG. “Even the closeups of Spider-Man are CG, because his lines changed and it was easier to keep the compelling lighting on him consistent. The closeups of Doctor Strange were plates shot against bluescreen on top of the fake train.”

Digital doubles were produced for Doctor Strange and Spider-Man utilizing a traditional methodology. “The amount of time the team spent drilling down on the rig and musculature made the anatomical accuracy very good for Spider-Man,” states Saint Girons. “We build from the inside out. We have bone and muscle structures as well as fat and skin on top of that. And on top of that we have the cloth. Everything is accurate. As far as matching Tom Holland, he also put on a lot of muscle, so we had to match that. We were constantly looking at references and getting it as close as possible.” Doctor Strange had fewer features and so did not require the same amount of scrutiny from the rigging team. “Once we showed our final renders to Jon Watts, he was like, ‘Wow!’”

The biggest challenge was the sheer variety and breadth of the visual effects work. “Magic is so open-ended,” Saint Girons concludes. “To bring it to where it landed in the film was a big undertaking. I was lucky to have Christian Kaestner and Alexis Wajsbrot also involved with this movie to help me figure out some of these creative things.”

Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.