Luma Pictures Takes Molten Man and Cyclone into the Cloud on ‘Spider-Man: Far From Home’

VFX producer Michael Perdew shares how Google Cloud was integral in the studio’s Air and Fire Elementals production work on the hit superhero action adventure.

In Columbia Pictures and Marvel’s Spider-Man: Far From Home, our young webslinging hero, Peter Parker, leaves his friendly New York City confines for a school trip to Venice, Prague, Berlin and London (but not Paris). While working on the film’s visual effects, Luma Pictures left the friendly confines of its on-premises Los Angeles data center, moving its render pipeline to Google Cloud to help them create the movie’s Air and Fire Elemental characters, Cyclone and Molten Man.

For Luma VFX producer Michael Perdew, the studio initially wasn’t sure the cloud would be a good fit for its Cyclone and Molten Man production work. “The big technical and creative challenge here was that both of these characters were simulations,” he explains. “They both required a traditional Maya rigging and animation approach to convey character and story, but once the performance was approved, we had to reinterpret the actions into believable simulations of the natural phenomena.”

According to Perdew, “For the case of Molten Man, the first step was to use Ziva to add muscle and skin jiggle to the animated mesh. That mesh was pulled into Houdini to add a simulated ‘flow.’ This flow simulation would displace the surface, creating flowing rivers of molten lava. That flow would then in turn drive additional Houdini simulations of fire, embers, smoke and dripping lava. On top of that, we created a feedback loop, to quickly control the speed and scale of the secondaries in Ftrack based on simple variables in the setup.”

“On Cyclone, the geometry was essentially converted into a volume of noisy clouds and a high-resolution noise field was run through it to create a directionality to the flow in Houdini,” he continues. “This base layer was deformed into position to cover the basic performance. From that base, we used Houdini to simulate very high-resolution secondary fields for wispy cast-off vapor and particles which would ‘react’ to the animated motion. These additional layers gave us both detail and scale. On top of that, we built a procedural lightning rig in Houdini and simulated a large particulate volume which would swirl and spin around Cyclone and Mysterio, adding depth and texture to the enormous space of the Training Theater.”

For both characters, all the layers were exported into a USD container for easy loading and version control. The USD was then loaded into a templated Katana scene for lighting and shading, and rendered using Arnold.
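The article doesn’t include any of Luma’s code, but the pattern it describes, exporting every simulation layer into a single versioned USD container that lighting then loads by a stable handle, can be sketched loosely in plain Python. All class, file, and shot names below are hypothetical stand-ins, not the real USD API or Luma’s pipeline:

```python
from dataclasses import dataclass, field

@dataclass
class UsdContainer:
    """Toy stand-in for a versioned USD container of simulation layers.
    Hypothetical: a real pipeline would use pxr.Usd / pxr.Sdf, not this class."""
    shot: str
    version: int = 1
    layers: list = field(default_factory=list)  # e.g. flow, fire, embers, smoke

    def add_layer(self, name, cache_path):
        # Each sim export (Houdini cache, VDB volume, etc.) becomes one layer.
        self.layers.append((name, cache_path))

    def publish(self):
        # Bumping the version gives downstream departments a stable handle,
        # so a Katana template can load "moltenman_010.v2" without
        # guessing at individual cache paths.
        self.version += 1
        return f"{self.shot}.v{self.version}"

container = UsdContainer("moltenman_010")
container.add_layer("flow", "/caches/flow.bgeo")
container.add_layer("fire", "/caches/fire.vdb")
handle = container.publish()  # -> "moltenman_010.v2"
```

The benefit of the container approach is exactly what the article implies: the templated Katana scene references one versioned handle rather than dozens of loose cache files.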

Historically, simulations took too much CPU, bandwidth, and disk space to be rendered in a time- or cost-effective manner outside of a local compute farm. Syncing terabytes of cache data from on-premises to the cloud can take several hours if you have limited bandwidth. In addition, Luma hadn’t yet found a cloud-based file system that could support the massive compute clusters needed to render simulations.

But, with such a big job, “we had to find a way to render more than our local farms could handle,” Perdew reveals. So, he and his team put their heads together and developed a workflow to make it work in the cloud. Luma had used Google Cloud Platform (GCP) before, so getting back up and running was relatively easy; the challenge was ensuring that the pipeline upgrades and system updates implemented since its last big cloud project integrated properly. “The difficult parts [of optimizing the new workflow] generally related to ironing out any points of failure from our own internal pipeline updates,” he describes. “Since our last heavy GCP show, we've had several updates in our pipeline to Katana and Arnold, plus some customizations to the Katana-to-Arnold integration that we use. Additionally, we've heavily invested in a full USD pipeline that flows from Maya through Houdini and into Katana. In the cases of both Molten Man and Cyclone, it was essential to make certain that every cache, particle, and .usd descriptor file made it up to the cloud to ensure proper renders were coming back, yet we wanted to keep the number of items we synced limited to only the essentials so that bandwidth wouldn't be a concern.”

“At the end of the day, we attacked it from both sides, by investing more heavily in ‘smart syncing’ tools that would more intelligently gather only the essential dependencies, while also triggering the syncs during our off hours so that not a minute was ever wasted,” he adds.
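A “smart sync” of the kind Perdew describes boils down to a dependency walk: start from the scene’s descriptor file, follow its references transitively, and queue only the files not already in the cloud. Here is a minimal sketch, with hypothetical file names and a plain dict standing in for a real dependency database:

```python
def gather_dependencies(scene, deps, synced=frozenset()):
    """Collect only the files a scene transitively references,
    skipping anything already present in the cloud.
    `deps` maps each file to the files it references (hypothetical data)."""
    needed, stack = set(), [scene]
    while stack:
        item = stack.pop()
        if item in needed or item in synced:
            continue  # already queued, or already in the cloud
        needed.add(item)
        stack.extend(deps.get(item, []))
    return needed

# Hypothetical shot: the .usd descriptor pulls in two caches, one of
# which references a further cache; flow.bgeo is already in the cloud.
deps = {
    "shot.usd": ["flow.bgeo", "fire.vdb"],
    "fire.vdb": ["embers.bgeo"],
}
to_sync = gather_dependencies("shot.usd", deps, synced={"flow.bgeo"})
# Only shot.usd, fire.vdb, and embers.bgeo are queued; flow.bgeo is skipped.
```

Combined with scheduling the uploads overnight, as Perdew notes, this keeps bandwidth spent strictly on the essentials.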

Moving to the cloud turned out to be the right call for the studio. In Google Cloud, Luma leveraged Compute Engine custom images with 96 cores and 128 GB of RAM, paired with a high-performance ZFS file system. Using up to 15,000 vCPUs, Luma could render shots of the cloud monster in as little as 90 minutes, compared with the 7 or 8 hours it would take on their local render farm. The time saved rendering in the cloud more than made up for the time spent syncing data to Google Cloud. “We came out way ahead, actually,” Perdew says.
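The arithmetic behind “we came out way ahead” holds up even with a generous allowance for sync time (the sync figure below is a hypothetical midpoint, not one the article gives):

```python
local_hours = 7.5   # midpoint of the 7-8 hours quoted for the local farm
cloud_hours = 1.5   # "as little as 90 minutes" on up to 15,000 vCPUs
speedup = local_hours / cloud_hours  # 5x faster per shot

# Even assuming a couple of hours to sync caches up to the cloud,
# the shot still finishes hours earlier than on the local farm alone.
sync_hours = 2.0    # hypothetical cost, consistent with "several hours"
comes_out_ahead = (cloud_hours + sync_hours) < local_hours
```

And because syncs ran overnight, even that sync cost rarely sat on the critical path.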

Leveraging the cloud also pushed Luma to get savvy with their workflow. By breaking up the cloud monster simulations into pieces, Perdew’s L.A. and Melbourne, Australia, teams could work around the clock, taking advantage of Google’s global fiber network to move their data where needed. While the L.A. team slept, VFX artists in Luma’s Melbourne office tweaked animations and simulation settings, and triggered syncs to the cloud, getting the updated scenes ready for the L.A.-based FX and lighting teams. When L.A. artists arrived in the office the following morning, they could start the simulation jobs in Google Cloud, receiving data to review by lunchtime.

In the end, Luma produced around 330 shots for Spider-Man: Far From Home, with roughly one-third of those created in the cloud. In addition to creating Cyclone and Molten Man, Luma designed Spider-Man’s Night Monkey suit, created an elaborate CG environment for the Liberec Square in the Molten Man Battle scene, and collaborated on destruction FX in Mysterio’s lair sequence.

With Luma’s Spider-Man work completed, the studio has begun ramping up to take advantage of other GCP features. For example, its artists use a proprietary in-house tool called Rill that automates the process of carrying updated character animations through full simulation and rendering. This tool is currently deployed on an on-premises Kubernetes cluster, which the studio is exploring migrating, along with other tools, to Google Kubernetes Engine (GKE) in the cloud. “Having more day-to-day services in the cloud will have all kinds of reliability benefits,” Perdew states, such as protecting the studio against the planned power outages that occasionally happen at Luma’s Santa Monica office.

Luma also plans to install a direct connection to Google Cloud’s Los Angeles region, giving future productions more bandwidth and lower cloud transfer latency. The team hopes this will open the door to all kinds of possibilities, including the use of remote workstations, which Perdew is excited to try out. “The industry keeps on changing the type of computer you need per discipline to do good work,” he says. “Having the flexibility to upgrade and downgrade an individual artist on the fly…as a producer, that makes me giddy.”

“In the future, we plan to replace any rental needs with remote workstations,” Perdew concludes. “More interestingly though, there are often short-term needs to equip our artists with higher configuration machines for as little as a day, and we want to add that into a more organic workflow. Depending on the type and scope of a project, lighting or compositing artists may need significantly more CPU power, RAM, or even a higher end GPU. We think that remote workstations will allow us to naturally swap out the machine power with the needs of our artists, without the need for costly purchases to fill a short-term need.”

To learn more about the use of Google Cloud in the entertainment industry, swing on over to their Rendering Solutions page.

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.