'Real Steel': A New Virtual Production Paradigm

Bill Desowitz finds out how Digital Domain and Giant Studios took virtual production to the next level.

Atom on the set and in the movie. All images courtesy of Digital Domain.

Real Steel, the new boxing robot movie, takes the Simulcam developed for Avatar and puts it into a real-world setting for the next advancement in virtual production. Giant Studios, under the leadership of virtual production technical supervisor Matt Madden, came up with a new system for a new paradigm.

"It really worked beautifully for us with production and with Digital Domain," Madden suggests. "We spent the time upfront figuring out how the pieces would fit together and how we would communicate. It's a model we're going to be referring to time and time again moving forward."

Unlike previs, Giant knew they were ultimately going to be delivering a form of the movie back to DD at game-level rendering quality. But the action itself was going to be fairly close to final, with the exception of the additional animation layer and effects that DD would be putting on top of it: the electronics, liquids and ripping metal.

The virtual production pipeline allowed the CG robots to be handled much like elements of a live-action shoot.

One of the main principles of this virtual production pipeline was that Giant stayed in sync with the visual effects and art departments on the look and structure of the assets being created. There was an approval process, so Giant knew whether each asset came from the art department or from VFX.

"Our responsibility is to get them ready for this interactive world of virtual production so we can play them live, we can record changes, we can add new versions of prop elements, if we need to change a lighting setup, we can do that, and all those things can be recorded and referenced in a data base so the visual effects department can access that information intuitively," Maden continues.

What was helpful with Real Steel, however, was that director Shawn Levy completely bought into this process. "The whole MO is to make it more like traditional filmmaking and make it interactive like live action," Madden emphasizes. "Only we come in with a real-time display of the CG elements.

Director Shawn Levy could direct his virtual boxers.

"He was able to direct his fighters, which were ultimately the robots, prior to location in Detroit. And then, once he reviewed the cut from our renders at the virtual production level, he could then request changes to speed or the blocking or the timing of a punch, and we made those changes and submitted updated renders back to him to lay into the cut. He and the editor [Dean Zimmerman] and the producers were happy with the general action and timing of the fight prior to going to location. And, consequently, we were able to get through those fight scenes in record time because we were armed with that prior to photography."

Everyone involved in the physical setup got to review the process as well, not just as boards, but as an actual cut. They understood the upcoming beats, the coverage, where the camera was, and what was and wasn't in the background. Madden says it helped across the board.

But Giant took the Simulcam process of simultaneous CG display significantly further. It wasn't just cranes and dollies; there was quite extensive use of Steadicam, which required Giant to have a system robust enough to record that fast-moving, dynamic camera action.

Bill Desowitz, former editor of VFXWorld, is currently the Crafts Editor of IndieWire.
