Executive producer Julia Parfitt and director Patrick Osborne talk about how they brought a real-time animated Grog and Scanlan to the Season 1 Finale Watch Party for the hit Prime Video series.
In a groundbreaking live event that delighted fans and opened up new possibilities for the world of real-time animated performance, Nexus Studios teamed up with Amazon Studios to host a late-night talk show for the adult animated fantasy series The Legend of Vox Machina. As part of the Season 1 Finale Watch Party, cast members Travis Willingham and Sam Riegel voiced and digitally manipulated two of the show’s animated characters, Grog and Scanlan, who responded to questions from viewers in real time. The segment was directed by Oscar and Annie winner Patrick Osborne (Feast, Pearl).
For the tiny minority who might be unfamiliar with the series, the show had its origin in the web series Critical Role, in which a group of professional voice actors played Dungeons & Dragons. Then, following what was at the time the most-funded film and television campaign in Kickstarter history, The Legend of Vox Machina was born, morphing from a single animated special to a two-season initial Prime Video commitment; Season 1 premiered this past January 28. The series was based on the first campaign of Critical Role, with the series stars reprising their roles.
Nexus is convinced that the success of the live segment could revolutionize the way they produce content. It demonstrated the viability of a real-time pipeline through which popular animated characters can be brought to life to interact with fans in a totally unscripted way, cutting out lengthy offline rendering and weeks of production.
“Real-time digital puppetry promises to transform the production of linear animated content in years to come, allowing more creative freedom, spontaneity and accessibility for creators with more iterative turnarounds,” says Julia Parfitt, Nexus Studios executive producer. “We are super proud to have pioneered this technology and enabled Grog and Scanlan to entertain their cult fandom with a unique unscripted performance.”
In realizing the milestone, the team took motion capture beyond its typical capabilities by developing a system to stylize movements in real time, thus allowing actors to match the intent of their performance with the expressiveness of animation. Further, a suite of rendering tricks was used to match the live 3D performance with the series' 2D aesthetic.
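The article doesn't detail which rendering tricks Nexus used, but a standard way to push live 3D lighting toward a 2D look is cel shading: quantizing a continuous diffuse lighting term into a few flat tone bands. The sketch below is purely illustrative (the band count and function names are my assumptions, not Nexus's shaders), shown in Python for clarity even though production versions run on the GPU.

```python
# Illustrative cel-shading step, NOT Nexus's actual shader code:
# a continuous 0..1 Lambert (N·L) diffuse term is snapped into a
# small number of flat bands, producing the hard tonal breaks of
# 2D animation instead of a smooth 3D gradient.

def cel_shade(n_dot_l, bands=3):
    """Quantize a 0..1 diffuse lighting value into flat toon bands."""
    n_dot_l = min(1.0, max(0.0, n_dot_l))  # clamp to valid range
    band = min(int(n_dot_l * bands), bands - 1)  # pick a band
    return band / (bands - 1)  # normalize back to 0..1

# A smooth ramp of lighting values collapses into three flat tones.
ramp = [cel_shade(x / 10) for x in range(11)]
```

In a real-time engine this quantization typically lives in a material or post-process shader rather than CPU-side code; the principle is the same.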
Osborne, who previously collaborated with Nexus on Grammy Award-winning artist Billie Eilish’s concert film Happier Than Ever: A Love Letter to Los Angeles, says that, while the techniques used in the Legend of Vox Machina segment represented a significant advance, they built on well-established procedures.
“I try not to re-invent the wheel too many times when working with emerging techniques,” Osborne elucidates. “The basic construction and rigging technique was standard in a Maya pipeline, only with custom blend shapes tailored to work with ARKit-style facial capture. Where we needed to invent was in emulating the look and feel of the original Vox Machina show in a real-time environment. This included custom shaders, lighting techniques, and interpretation of motion-capture data.”
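ARKit-style facial capture delivers a set of named coefficients in the 0–1 range (e.g. "jawOpen", "mouthSmileLeft") describing the actor's expression each frame; tailoring custom blend shapes to that input, as Osborne describes, amounts to routing and remapping those coefficients onto the rig's own shapes. A minimal sketch of that routing step follows; the mapping table, rig shape names, and gain values are my assumptions for illustration, not the production rig.

```python
# Hypothetical mapping from ARKit-style capture coefficients to
# custom rig blend-shape weights. Each incoming 0..1 value is scaled
# by a per-shape gain, clamped, and routed to the rig's shape name.
# Table entries are illustrative, not from the Vox Machina rig.

ARKIT_TO_RIG = {
    "jawOpen":        ("mouth_open", 1.2),  # (rig shape, gain)
    "mouthSmileLeft": ("smile_L",    1.0),
    "browInnerUp":    ("brow_raise", 0.8),
}

def map_face_capture(coefficients):
    """Translate raw capture coefficients into rig blend-shape weights."""
    weights = {}
    for name, value in coefficients.items():
        if name not in ARKIT_TO_RIG:
            continue  # coefficients with no rig target are dropped
        rig_shape, gain = ARKIT_TO_RIG[name]
        # clamp to the 0..1 range blend shapes expect
        weights[rig_shape] = min(1.0, max(0.0, value * gain))
    return weights

frame = {"jawOpen": 0.5, "mouthSmileLeft": 0.9, "eyeBlinkLeft": 0.1}
rig_weights = map_face_capture(frame)
```

Per-shape gains like these are one place the "custom blend shapes tailored to ARKit-style capture" work happens: a literal 1:1 mapping rarely reads well on a stylized character.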
“We had a small team of around five or six artists and developers working closely with Patrick,” adds Parfitt. “After a short development period of a few weeks, we built the tools and developed the inertial mo-cap inputs. After translating the 2D characters into Maya models, we did the look development and shading in Unreal. Once we were happy with the system, we did the live setup with the actors.”
Asked how the challenges of producing animation for a live, real-time performance compared with other animation that he’s directed, Osborne says that his past experience stood him in good stead.
“We were using core animation principles to create a sort of living virtual costume for an actor to wear,” he explains. “I wasn’t looking to translate the motion literally, but to caricature it in an appealing way. It was a challenging and fun puzzle to use my years of animating experience to actively shape the performance of an actor procedurally into an appealing ‘living’ character onscreen. This project was 90% preparation, and 10% nail-biting nerves as we went live and hoped the preparation paid off.”
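One common way to caricature motion procedurally, rather than translate it literally, is to amplify a signal's deviation from a slowly tracking baseline, so fast moves overshoot with a cartoon snap while the resting pose stays put. The filter below is a simplified sketch of that general idea under my own assumptions; it is not Nexus's stylization system, and the smoothing and gain constants are illustrative.

```python
# Illustrative real-time exaggeration filter, not Nexus's code:
# an exponential moving average estimates the "neutral" pose, and
# the live signal's deviation from it is amplified (gain > 1),
# caricaturing quick motions while leaving stillness untouched.

class ExaggerationFilter:
    def __init__(self, smoothing=0.9, gain=1.6):
        self.smoothing = smoothing  # how slowly the baseline tracks input
        self.gain = gain            # >1 overshoots the literal motion
        self.baseline = None

    def step(self, value):
        """Process one sample of a joint channel; returns stylized value."""
        if self.baseline is None:
            self.baseline = value
        self.baseline = (self.smoothing * self.baseline
                         + (1 - self.smoothing) * value)
        return self.baseline + self.gain * (value - self.baseline)

f = ExaggerationFilter()
samples = [0.0, 0.0, 1.0, 1.0, 0.0]   # a sudden move and return
stylized = [f.step(s) for s in samples]
```

Because each step uses only the current sample and one stored value, a filter like this adds no latency, which matters when the output is driving a character live.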
“Creating a live animated real-time performance is still in its infancy in terms of getting the best performance,” Parfitt elaborates. “The process is very different, as a great deal of the work was spent coding and developing the software to produce the best performance. The skill was in getting the characters to perform in a way that avoided the uncanny valley and had an authentic animated outcome.”
As for the future of this kind of real-time animation, Osborne, like Parfitt and Nexus, is bullish.
“There is something thrilling about knowing a performance is live,” he enthuses. “I’m fascinated by the intersection of animation, puppetry and performance that modern real-time tools allow, and can’t wait to see where we might take it.”