
A Night at the High-Tech Opera

Production Designer Alex McDowell gets immersive with Death and the Powers.

Check out the Death and the Powers clip at AWNtv!

How to transmute body and soul into a machine? Images courtesy of MIT Media Lab.

Leave it to Alex McDowell to invigorate an opera with visual ingenuity, cutting edge technology and the best possible immersive experience. That's what occurred last month with the premiere of Death and the Powers in Monte Carlo.

The stripped-down, one-act opera, composed by Tod Machover and developed at the MIT Media Lab with the American Repertory Theater, tells the story of Simon Powers (James Maddalena), a brilliant businessman and inventor who transcends the boundaries of humanity by passing from one form of existence to another to project himself into the future.

Thus, Death and the Powers, directed by Diane Paulus (the acclaimed revival of Hair in Central Park) with a libretto by Poet Laureate Robert Pinsky (The Inferno of Dante), offers pioneering performance technologies developed by Machover and his Opera of the Future Group at the MIT Media Lab. The stage represents Simon's house, but this environment gradually reveals itself to be the vast, interconnected, intelligent System of Powers' continuing presence. As the opera progresses, the set personifies Simon's thoughts, feelings, memories and desires. A new technique called Disembodied Performance uses innovative sensors and analysis software to translate Maddalena's conscious and unconscious sounds and gestures into the behavior of the set. In this way, The System reflects Simon Powers' transformed presence even after his physical body is no longer visible to the audience.

In addition to the animatronic set, the opera employs several other inventions developed especially for the production, such as a chorus of "Operabots," which narrate and react to the story, and a musical Chandelier made up of long strings that resonate both via remotely actuated electromagnets and via an on-stage performer plucking and dampening them.

Triangular shapes drove the design scheme, including the Chandelier.

"I started it five years ago," McDowell recalls. "We went to Monte Carlo to meet the people that most motivated it to happen and did a small performance with some of the same singers, actually, and just kept working on it. It's been an interesting kind of labor of love for a while with MIT Media Lab. I came to the project because John Underkoffler [the inventor of the data interface in Minority Report, which McDowell collaborated on] knew Tod Machover and he asked John to recommend someone from outside the theater world but from film. He wanted a different take on how an opera might look.

"I've never done live performance before. It was an interesting design problem because it was tied up with a science and engineering base. There's a very poetic storyline. But how do you bring a set to life? A lot of it was with the rich programming and engineering resources at MIT with a robot lab and musical engineering."

The first design challenge was figuring out The System and what would serve as the metaphor. "What could represent both real-world environment and a cellular system that could be an intelligent entity?" McDowell wondered. "I broke that down into basic cellular components, which ended up with triangular [shapes] at different scales: robots, walls, chandelier objects. Things are pretty much driven by the narrative, but then what can you do robotically to make all of these things become appendages of this character in his spiritual form? So robots become fingers and books; and we imagined that it all takes place in Simon Powers' home, so home equals library and library equals repository of memory and books represent the cells of memory and mainframe and DNA.

"You can come up with the craziest ideas and MIT can invent almost anything. We imagined books that came out of the shelves but were limited by budget. So the primary information delivery systems were comprised of triangular units called periaktoi. It is from the ancient Greek and revolves to change the scene. We had different scenes painted on each side. This idea came out of left field but was completely appropriate. We had like 3,000 spines per triangular wall and 9,000 in total. And each of those had around 12 pixels as a light source, with 100,000 individual pixels.

The System combined a real-world environment and a cellular system.

"The software that they developed comes out of Tod's ongoing interest in interactivity between performer and musical instrument. They've developed software for a long time that allows a measuring of the performance itself to affect shape of music. Tod invented 'hyper instruments' (a hyper cello played by Yo-Yo Ma, in which shape and gesture of his arm affected the way the note was dispersed into the sound space)."

They took the concept of hyper instruments and translated it into visuals. There were several layers of what could go on the walls, including video playback (with particle-based animated shapes) sitting in a silo that could be accessed with a musical cue. On top of that were three levels of measurement of the lead performer, who has left the stage: the pitch and timbre of his voice, the gesture of his hands, and his breath.

"You take any measurement and apply it to form, color, scale and animation of the graphics on the screens," McDowell continues. "So you have a keyboard trigger activating a new state and that new state would take the measurement of him singing, say, and have it change the amplitude of the visuals. The next trigger could take that same note and have it change from red to green. It had this very sophisticated live mixing desk but based completely on live performance. I was working alongside the MIT engineers and asked if we could, say, take a spherical object and then move vertically at a very slow speed through the visual space until he sings, at which point, they move into several spheres and move sideways.

"The digital component is we could compile layer upon layer upon layer of these visuals. Now can his breath affect the background color from blue to purple? And you could layer all of that in tune with the live performance and, as close as possible, it's emoting what the performer is doing, without any pre-recording and is completely interactive and intuitive. It's a complete performance that, in the end, was a unique experience for me because everything is about tuning it, rehearsal by rehearsal. Then we had these nine robots that acted as emissaries of his emotions. They were a combination of being driven by live performance and puppeteers. And the walls were all programmed to hit cues at specific points, but the cumulative effect was a multi-layered personality that was going on."

McDowell applied any performance measurement to form, color, scale and animation of the graphics on the screens.

Not surprisingly, Death and the Powers has had a big impact on the production designer, founder of the 5D immersive design initiative, who recently completed Upside Down (an alternate universe love story) and is currently working on Andrew Niccol's untitled film (starring Justin Timberlake) about time as currency in a future where the aging gene has been shut off.

"It's really opened [me] up," McDowell suggests. "It had a lot to do with 5D really happening at all. There's the whole art/science thing that John and I have been doing for a long time. Media Lab's right in the center of that conversation: they have artistic people who are great scientists and programmers and computer engineers. They are these multi-talented kids who don't see a difference between one or the other, whereas in our industry, we're going, 'Well, this is art and this is science, and this is production and this is post,' making all these silos and separations."

And what about the future of 5D, whose second conference in Long Beach this fall has been cancelled?

"I'm disappointed that it couldn't happen," McDowell admits. "We're still dealing with the socio-economic climate of things and it's all about the difficulty of putting on a conference per se because 5D hasn't stopped at all in terms of distributed events. We're planning an event at the Immersive Tech Summit at the LA Center Studios on Nov. 21. But conferences cost so much money to put on, so what we're experiencing is a Catch 22 in needing an infrastructure to pull in the sponsorship but without the sponsorship not being able to develop an infrastructure. We're in the process of evaluating where to go next with 5D."

Bill Desowitz is senior editor of AWN & VFXWorld.
