Jerome Chen Talks Real-Time Animation with Unreal Engine

The Sony Pictures Imageworks senior VFX supervisor discusses his studio’s integration of Epic’s game engine technology and real-time workflows into its animation production on a new episode of Netflix’s ‘Love, Death & Robots’ anthology series.

As we learn more about the innovative ways Epic’s Unreal Engine is being used in film and TV production, we’re discovering that new real-time production tools, coupled with emerging virtual production methodologies, are proving increasingly valuable to studios across a wide range of animation and VFX projects.

Today’s release of Epic Games’ Virtual Production Field Guide Volume 2 gives studios and artists a new, free resource of in-depth methodologies, case studies, and technical guidelines highlighting numerous ways to integrate real-time technology into their film and television projects. Available as a PDF download, the new volume delves into areas such as workflow evolutions, including remote multi-user collaboration, new features released in Unreal Engine, and what’s coming this year in Unreal Engine 5; it also includes case studies and two dozen new interviews with industry leaders about their hands-on experiences with virtual production.

For Jerome Chen, an Oscar-nominated senior VFX supervisor at Sony Pictures Imageworks, Unreal Engine has become an invaluable tool, allowing real-time animation production and review in ways that have fundamentally changed his creative process.

Chen has worked on many high-profile film and TV VFX projects including The Polar Express, The Amazing Spider-Man, Suicide Squad, Fury, and Jumanji: Welcome to the Jungle. Chen also directed “Lucky 13,” a 2019 episode of the Netflix animated anthology series Love, Death & Robots. Currently in production on a Season 2 Love, Death & Robots episode, he’s using Unreal Engine for the first time, successfully integrating real-time technology into animation production in ways he says he’ll stick with moving forward. Below, he talks about how Unreal Engine has transformed the way he works.  

AWN: What projects are you using Unreal Engine for?

Jerome Chen: We’re using it for an episode in the second season of Love, Death & Robots, the Netflix anthology produced by Tim Miller and David Fincher. The episode is all CG in the style of high-end game cinematics with human characters and creatures. Our front-end workflow involves Maya, ZBrush, and Houdini, but the commitment is to try to stay in Unreal Engine as much as possible and break out where it makes sense. This is our first foray into Unreal, and we're very excited about the possibilities.

AWN: What led you to Unreal Engine versus your more established visual effects pipeline?

JC: I've had an interest in real-time technology for decades and tried to introduce it into visual effects for features and animation. I've known Epic's David Morin for a few years, and we just started talking more and more. When Love, Death & Robots came around to me, it was a perfect opportunity to try Unreal Engine and see what it offered. I had no experience with it before this, other than seeing some very compelling demos and videos.

AWN: Has Unreal Engine lived up to your expectations?

JC: It’s definitely lived up to the hype as a very powerful and deep set of tools. The project we're doing is pretty ambitious, even for Unreal Engine. So, we're going to finish it first and then do a postmortem and find access points for the engine in our pipeline. Ingesting Maya animation into Unreal for high-quality lit dailies is one of the first things that comes to mind. There will be many opportunities that will make themselves available after we've delivered and performed a postmortem of the process. 

AWN: Did you need Unreal Engine specialists, or was it possible to work with in-house artists?

JC: Given the timeframe, we did a poll of our in-house artists to see who had used Unreal Engine before. It turned out we had probably two dozen people with experience using it at other companies for games and animation. They're now on the team for the project, and we also have very close support from Epic to help us along our journey.

AWN: Did you need any significant equipment upgrades to implement Unreal Engine?

JC: We did an assessment with Epic and our systems and engineering group, and luckily, only a couple of minor changes were needed. We upgraded to RTX graphics cards and added some larger solid-state drives. It wasn't that different from what the systems group was already intending to acquire.

AWN: How does the real-time mindset compare to traditional computer animation?

JC: The techniques and knowledge set are perfectly applicable, but the workflow is completely different. Everything works in parallel and then merges in the engine, which helps each step of the production be much more successful. In a more traditional serial pipeline, you do your animation before assets are fully done, you do lighting after you approve animation, and your effects come after you approve animation.

We become myopic at each step in a traditional pipeline. In the Unreal Engine real-time workflow, things merge much earlier. You're able to see animation in your environment with lighting happening at the same time. For creative collaboration and decision-making, that's one huge difference. 

AWN: What do you see driving the increased interest in Unreal Engine for visual effects?

JC: It has to do with a curve of maturity. Everything that Epic has done through the years, culminating with Fortnite, has given them a tremendous amount of resources and allowed them to reinvest back into the engine and now really make it available to people. I also see a concentrated effort to get knowledge about it out there.           

AWN: Do you have any career advice for aspiring artists?

JC: The traditional fine arts and theories about cinema will always be true. This technology enables you to do amazing things, but the roots come from the traditional core understanding of composition and light and all the theory dating back to the Renaissance days. You can't shortcut that when you begin your foray into the creation of synthetic imagery using digital tools. These tools always evolve, but your core artistic understanding needs to be based on tradition. 

AWN: Has the need for remote collaboration affected your workflow?

JC: Absolutely. I started this project during lockdown, so pre-production was all done virtually. I'm in Los Angeles, my team's in Vancouver, and yet I'm in the engine with them eight hours a day. Whether we're doing modeling or environment review, I go into Unreal Engine with the artists. If I need to move something around inside the engine, I can manipulate objects and adjust parameters. The collaboration has been almost transparent and very productive.

You can also give a client or any person an executable file to review, and they don't need Unreal Engine. They launch the executable, log in to an IP address, and we're in an environment together. The studio gets a very immersive experience reviewing an asset, and they love it. That level of collaboration, with remote participants reviewing work in progress, is very interesting.

AWN: Do you foresee sticking with Unreal Engine and real-time technology for future projects?

JC: I don't intend to go back. The speed you can work at in terms of visualization and decision-making is world-changing for me. It’s easy to embrace; it comes down to how inventive you are about designing new workflows around Unreal Engine. If you're not using real-time, you're behind the curve.

The render farm workflow is not going to go away, but having real-time in parts of your process is going to become a necessity for a variety of reasons. We're going to be adopting new habits and workflows. Because not everyone is able to work together in person for the time being, not implementing real-time will slow down your decision-making process. It's not just the real-time tools that make Unreal Engine so powerful; it's also the collaboration tools that allow us to work virtually.

Our project also involves performance capture, and we're recreating the likenesses of some of the actors. For the performance-capture sessions, I'm going to be driving the real-time characters so we can visualize them within the sets for the actors. They'll be in a performance-capture volume, but they’ll see themselves as soldiers in an Afghan tunnel system.

That type of visualization will enhance their performances and give them ideas they might not have thought of otherwise. That's very powerful and compelling. It's going to have a significant impact on how we do things in ways that we're not even fully aware of yet. People may not be able to articulate exactly how it will affect the way we work.

I compare real-time to a finely tuned hot rod. It runs great, and it's amazing, but when it breaks, it's often due to something obscure. In the same way, the images Unreal Engine makes are amazing, but it can also be technical. For me, that’s balanced out by its performance, the results you're getting, and the shortcuts it offers in the creative process.