The leading extended reality tech firm and Xite Labs create a myriad of immersive worlds in Unreal Engine for the rapper’s ‘Trapsoul World Series’ performance.
Xite Labs, integrating the disguise Extended Reality (xR) workflow powered by a gx 2c media server and rx real-time rendering platform, brought a host of virtual worlds to life in Unreal Engine for rapper Bryson Tiller’s recent Trapsoul World Series live-streamed concert.
Shot at Xite Labs’ xR stage in L.A., the concert accentuated the artist’s unique musical vibe for fans around the world watching at home. The immersive live stream presented Tiller in a series of six different worlds linked by a narrative flowing through the songs. Xite Labs was responsible for the xR content on 14 different songs performed in four virtual “worlds” with distinct appearances and themes.
Xite used its in-house workflow, featuring a disguise gx 2c media server as the primary xR environment controller, while a dedicated disguise rx machine was used to run the Unreal Engine scenes via disguise RenderStream. Front-plate elements were created in Notch to further link Tiller into each of the unique worlds.
The concert’s environments included a virtual lounge that fell away to reveal a world of galaxies, nebulas, and spaceships; a time theme with a mountain desert landscape and a flight through a moonlit sky; a guerrilla warfare setting that transformed into a neon jungle; and stark hallways with bold, flat lighting, color-changing walls, and silhouettes. Throughout, Tiller appeared to perform on a moving platform, which served as the anchor point transporting him from one environment to another.
“I have not seen anything done in xR that was quite as diverse and complex as this,” said Greg Russell, creative director at Xite Labs. “And the fact that it was shot on our smaller volume in such a short timeframe still blows my mind.”
An unforeseen benefit of the disguise xR workflow was that it allowed Tiller to be filmed wearing a shiny, black reflective jacket for the interlude performances. “This would have been incredibly challenging on a greenscreen and would have required a great deal of time technically for lighting and ensuring the separation of light fields,” said Vello Virkhaus, creative director at Xite Labs. “But xR made this possible, and it looked amazing.”
The ambitious production required over 400 hours of development from nine Unreal Engine artists across 10 weeks, and over 200 hours each from two Notch artists, to bring the virtual environments to life. “From the top down, the live-streamed concert went exceedingly well and got great feedback,” said Russell. “Bryson understood the technology and intuitively knew where to be on the stage and how to be in and out of the lighting.”
“On the production level, it was the first time director Mike Carson, DP Russ Fraser, and producer Amish Dani had done xR,” Russell said. “With film and music video people coming into our world, it was a very challenging job from a production standpoint because we were basically teaching them the xR workflow on the job. But once they started putting the pieces together, they realized its value.”
Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.