‘Gods of Mars’: Virtual Production and NVIDIA RTX Real-Time Graphics at Work

Production of epic space adventure combines miniatures and real-time technologies at a fraction of the cost of similarly designed and animated blockbuster feature films.

Over at NVIDIA, Nicole Castro has posted an excellent write-up on how director Peter Hyoguchi and producer Joan Webb’s epic space adventure, Gods of Mars, is being produced with a game-changing combination of real-time rendered graphics, virtual production, and miniatures.

The movie, currently in production, tells the story of a fighter pilot who leads a team against rebels in a battle on Mars, a planet now filled with cities after decades of terraforming. The project features a mix of cinematic visual effects with live-action elements to bring intergalactic scenes to the big screen.

The film crew originally planned to make the movie primarily with miniatures but switched gears once they were introduced to real-time NVIDIA RTX graphics and Unreal Engine.

Hyoguchi and Webb, working from an Epic MegaGrant, brought together experienced VFX professionals and game developers to create the film. The virtual production started with scanning the miniature models and animating them in Unreal Engine. “I’ve been working as a CGI and VFX supervisor for 20 years, and I never wanna go back to older workflows,” said Hyoguchi. “This is a total pivot point for the next 100 years of cinema — everyone is going to use this technology for their effects.”

Check out a behind-the-scenes look at how the filmmakers are harnessing virtual production and real-time rendering to produce their film:

Hyoguchi and team produced rich, photorealistic worlds in 4K, creating futuristic scenes with a combination of NVIDIA Quadro RTX 6000 GPU-powered Lenovo ThinkStation P920 workstations, ASUS ProArt Display PA32UCX-P monitors, Blackmagic Design cameras and DaVinci Resolve, and the Wacom Cintiq Pro 24.

The film’s live-action scenes are supported by LED walls displaying real-time rendered graphics created in Unreal Engine. Actors are filmed on set with a virtual background displayed behind them. To keep the set minimal, the team builds only what the actors will physically interact with, then uses the Unreal Engine environment for the rest of each scene.

Check out more behind-the-scenes breakdowns of the film’s production in this “Creating a Universe” video:

One big advantage of working with digital environments and assets is real-time lighting. When working with traditional CGI, Hyoguchi and his team would pre-visualize everything in a grayscale environment, then wait hours for a single frame to render before seeing a preview of what an image or scene might look like. With Unreal Engine, Hyoguchi can see scenes ray-trace rendered immediately, with lights, shadows, and colors. He can move around the environment and see how everything will look in the scene, saving weeks of pre-planning.

Real-time rendering also saves money and resources. Hyoguchi doesn’t need to spend thousands of dollars on render farms or wait two weeks for a single shot to finish rendering. The RTX-powered ThinkStation P920 renders everything in real time, enabling more iterations and a much more efficient, flexible, and faster creative workflow.

“Ray tracing is what makes this movie possible,” said Hyoguchi. “With NVIDIA RTX and the ability to do real-time ray tracing, we can make a movie with low cost and less people, and yet I still have the flexibility to make more creative choices than I’ve ever had in my life.”

Head over to the NVIDIA website to learn more about the project and underlying technology.

Source: NVIDIA

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.