Xsens Helps The Imaginarium Go High-Tech for ‘The Tempest’

The Imaginarium Studios partners with the Royal Shakespeare Company to create a digital avatar projected in real time using Xsens motion capture technology.

The NETHERLANDS -- For the first time since its debut in 1611, William Shakespeare’s The Tempest can be seen onstage with all the wonder and magic that the author dreamed of, thanks to a new collaboration between the Royal Shakespeare Company and Intel in association with The Imaginarium Studios. The performance includes the first use of a completely digital character in an RSC production, made possible through the use of Xsens’ real-time MVN motion capture technology.

The Tempest, one of Shakespeare’s final plays, tells the story of Prospero, an exiled magician who decides to settle old scores when his rivals sail past his solitary island. With the help of his magical servant Ariel, Prospero forces the ship to his shores, ready to right old wrongs through magic and cunning. Ariel is unusual in that the character carries more stage directions than almost any other in Shakespeare’s canon, making it one of the most complicated roles to stage.

“For this performance to work, Ariel needed to fly, walk in space and collaborate with other performers in the moment,” said Ben Lumsden of The Imaginarium Studios, the production company founded in 2011 by filmmakers Andy Serkis and Jonathan Cavendish. “Without unrestrictive performance capture technology like MVN, Ariel would have been just another landlocked cast member in a costume.”

During the performance, Ariel morphs from a spirit to a water nymph to a harpy. The RSC and The Imaginarium Studios achieve this by capturing the movements of actor Mark Quartley with Xsens motion capture sensors placed within his costume. When Ariel transforms into something more than human, the actor’s movements are projected onstage and in the air as a digital avatar. The flexibility of the Xsens technology lets the actor interact directly with cast members in human form while transforming live on stage every night of the play’s run.

To achieve this transformation, The Imaginarium Studios and the RSC use an Xsens MVN system to track the actor’s performance. The data is streamed through Autodesk’s MotionBuilder software and from there into Epic’s Unreal Engine 4. The video output is then sent to d3 servers powered by Intel Xeon processors and connected to the RSC lighting desk, which in turn controls 27 projectors located around the stage.
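For readers curious how such a live pipeline hangs together, the sketch below is a purely conceptual illustration of the per-frame data flow described above, not the production’s actual code. The stage names (MVN capture, MotionBuilder retargeting, Unreal rendering, d3 projection) mirror the article, but every class and function name here is a hypothetical placeholder; in reality these systems exchange data over their own network streaming protocols and plugins.

```python
# Conceptual sketch of the real-time pipeline described above.
# All names are hypothetical placeholders, not real Xsens, Autodesk,
# Epic, or d3 APIs.

import time
from dataclasses import dataclass
from typing import List


@dataclass
class Pose:
    """One frame of solved skeletal data (joint positions/rotations)."""
    joints: List[float]


def capture_mvn_frame() -> Pose:
    """Stand-in for reading a solved frame streamed from the MVN suit."""
    return Pose(joints=[0.0] * 66)  # placeholder data


def retarget_in_motionbuilder(pose: Pose) -> Pose:
    """Stand-in for retargeting the performer's skeleton onto Ariel's avatar."""
    return pose


def render_in_unreal(pose: Pose) -> bytes:
    """Stand-in for Unreal Engine 4 rendering the avatar to a video frame."""
    return b"frame"


def send_to_d3_projectors(frame: bytes) -> None:
    """Stand-in for the d3 media servers feeding the 27 stage projectors."""
    pass


def run_live_pipeline(frames: int, fps: int = 60) -> None:
    """Loop once per frame: capture -> retarget -> render -> project."""
    frame_budget = 1.0 / fps
    for _ in range(frames):
        start = time.time()
        pose = capture_mvn_frame()
        avatar_pose = retarget_in_motionbuilder(pose)
        video_frame = render_in_unreal(avatar_pose)
        send_to_d3_projectors(video_frame)
        # Sleep off any remaining budget to hold a steady frame rate.
        time.sleep(max(0.0, frame_budget - (time.time() - start)))


if __name__ == "__main__":
    run_live_pipeline(frames=600)  # roughly ten seconds at 60 fps
```

The key constraint such a loop illustrates is latency: every stage must complete within the frame budget so the projected avatar stays in sync with the live performer.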

“Inertial motion capture is changing how far productions can push their craft, bringing high-end digital characters into live shows,” said Hein Beute, product manager at Xsens. “With The Tempest, the RSC is creating a real-time application that is both immediate and novel, something audiences always want to see on their nights out.”

Theater is now entering an era where characters and scenes can be presented in ways that are more visually engaging, and in many cases, far beyond what even the authors originally imagined. Or as Sarah Ellis, the RSC’s head of digital development, told The Guardian in September: “To be able to create digital characters in real time on [this] scale in a theatrical environment is a huge achievement.”

The RSC production of The Tempest runs at the Royal Shakespeare Theatre in Shakespeare’s hometown of Stratford-upon-Avon until January 21, 2017. Tickets are available now at www.rsc.org.uk.

Source: Xsens