Mold3D Shares Windwalker Echo Unreal Engine Hero Character

Available now for download and free use in UE 4.27 and UE5 Early Access, the nimble 3DCG warrior stars in the studio’s upcoming content sample project, ‘Slay,’ which will be downloadable later this month.

Mold3D Studio has just shared their 3DCG Windwalker Echo hero character from the recent Unreal Engine 5 reveal video, Lumen in the Land of Nanite, available for immediate download and use in Unreal Engine 4.27 and Unreal Engine 5 Early Access. The warrior character, who was also part of the Valley of the Ancient project that accompanied the UE5 Early Access release, is also starring in Mold3D’s new content sample project, Slay, which will be shared with the Unreal Engine community later this month. The Slay trailer, rendered entirely in UE4, can be found below.

Led by CEO Edward Quintero, Mold3D created the sample project to explore animation and virtual art department techniques aimed at film and TV content, then share their work with the Unreal Engine community. The project contains everything needed to create the video trailer, and is available to use in projects free of charge.

Quintero, an ILM and DreamWorks Animation veteran whose credits include The Matrix trilogy, Avatar, and The Mandalorian, formed Mold3D back in 2016 after becoming interested in how real-time technology could transform content creation. At the time, he was collaborating with Epic on Unreal Engine projects such as Paragon and Robo Recall.

“Real-time technology struck a chord with me, and I thought that's where I should focus my energy,” he says. “I felt like it was the future, because I was able to visualize 3D art in real-time, instead of the days and weeks that traditional rendering required.”

With some Unreal Engine experience now under his belt, Quintero joined Fox VFX Lab, where he was asked to head up their new VAD (virtual art department) and build a team. In these early days of virtual production, he and his team were using Unreal Engine to create pitches for films and to visualize environments for directors, enabling them to do virtual scouting, and to set up shots, color, and lighting that were then fed to the visual effects vendor where they would finish the film.

After his time at Fox VFX Lab, he and his team became part of the VAD for The Mandalorian. “It was the foundation of us starting up a studio solely devoted to the art of being a real-time studio,” he says. “I was trying to build for what I saw that was coming—the future of visual effects. We could all feel that this was happening.”

Soon, Mold3D Studio was invited back to join the VAD for The Mandalorian Season 2. At the same time, the studio also began work on Epic’s Unreal Engine 5 reveal demo, creating extremely complex and high-resolution 3D models to show off Nanite, UE5’s virtualized micropolygon geometry system. 

“It was exciting to get a taste of what’s coming in UE5 in the future,” says Quintero. “We’re currently using UE4, until UE5 is production-ready. There have been some great advances in the latest UE 4.27 release—especially in the realm of virtual production—but features like Nanite and Lumen are really going to change the game.”

After the UE5 demo was completed, Quintero began talking to Epic about Slay. The proposal was to create a finished piece of final-pixel animated content in Unreal Engine. With the company now making a name for itself in environment art, the team was excited to showcase its expertise in story development and character design as well. With the exception of Windwalker Echo, the Slay assets, including her adversary, were all designed and created by Mold3D.

Just as Slay was greenlit, the pandemic hit. Mold3D set up a remote working environment that would enable them to work on real-time rendered animated content, as well as other projects that they had on their books. And virtual production made it all possible; with mocap happening in Las Vegas, the team in Burbank directed actors via Zoom, while viewing the results on the characters in real time in Unreal Engine, making it easy to ensure they had the takes they wanted.

“Although we probably would have done a lot of things the same way had there been no pandemic, we were thankfully able to rely on the virtual production aspect of the filmmaking to save the day,” Quintero shares.

After the main motion was captured, the team did a second session with the actor just for facial capture. For this, they used the Live Link Face iOS app, reviewing takes using both the recordings from the iPhone and a reference camera pointed at the actor.

Head over to Epic Games’ blog post, which goes into great detail about the Slay project, including look development, terrains, lighting, and the real-time rendering that powered iteration and visualization of results in minutes rather than days.

Source: Epic Games

Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.