A Night at the High-Tech Opera
The first design challenge was figuring out The System and what would serve as its metaphor. "What could represent both a real-world environment and a cellular system that could be an intelligent entity?" McDowell wondered. "I broke that down into basic cellular components, which ended up as triangular [shapes] at different scales: robots, walls, chandelier objects. Things are pretty much driven by the narrative, but then what can you do robotically to make all of these things become appendages of this character in his spiritual form? So robots become fingers and books; and we imagined that it all takes place in Simon Powers' home, so home equals library, library equals repository of memory, and books represent the cells of memory, mainframe and DNA.
"You can come up with the craziest ideas and MIT can invent almost anything. We imagined books that came out of the shelves, but we were limited by budget. So the primary information delivery systems were composed of triangular units called periaktoi. The term comes from the ancient Greek stage device that revolves to change the scene; we had different scenes painted on each side. This idea came out of left field but was completely appropriate. We had about 3,000 book spines per triangular wall, 9,000 in total, and each of those had around 12 pixels as a light source, for roughly 100,000 individual pixels."
They took the concept of hyperinstruments and translated it into visuals. There were several layers of what could go on the walls, including video playback (with particle-based animated shapes) sitting in a silo they could access with a musical cue. On top of that sat a layer driven by three live measurements of the lead performer, who has left the stage: the pitch and timbre of his voice, the gesture of his hands, and his breath.
"You take any measurement and apply it to the form, color, scale and animation of the graphics on the screens," McDowell continues. "So you have a keyboard trigger activating a new state, and that new state would take the measurement of him singing, say, and have it change the amplitude of the visuals. The next trigger could take that same note and have it change from red to green. It was like a very sophisticated live mixing desk, but based completely on live performance. I was working alongside the MIT engineers and would ask if we could, say, take a spherical object and move it vertically at a very slow speed through the visual space until he sings, at which point it splits into several spheres that move sideways.
"The digital component is that we could compile layer upon layer upon layer of these visuals. Now can his breath shift the background color from blue to purple? You could layer all of that in tune with the live performance so that, as closely as possible, it emotes what the performer is doing, without any pre-recording; it's completely interactive and intuitive. It's a complete performance that, in the end, was a unique experience for me, because everything is about tuning it, rehearsal by rehearsal. Then we had these nine robots that acted as emissaries of his emotions; they were driven by a combination of live performance and puppeteers. And the walls were all programmed to hit cues at specific points, but the cumulative effect was a multi-layered personality."
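The "live mixing desk" McDowell describes — keyboard triggers activating states that route live measurements (voice, breath, gesture) to visual parameters, stacked layer upon layer — can be sketched in miniature. Everything below is a hypothetical illustration of that idea, not the actual MIT system; all class and parameter names are invented.

```python
def lerp(a, b, t):
    """Linearly interpolate between a and b by t in [0, 1]."""
    return a + (b - a) * t

class State:
    """One mapping layer: a measurement source driving a visual target."""
    def __init__(self, source, target, mapper):
        self.source = source   # e.g. "amplitude", "breath"
        self.target = target   # e.g. "scale", "background"
        self.mapper = mapper   # normalized reading -> parameter value

class MixingDesk:
    def __init__(self):
        self.states = []       # triggers accumulate layers over the show

    def trigger(self, state):
        """A keyboard cue adds a new mapping layer."""
        self.states.append(state)

    def update(self, measurements):
        """Compute one frame of visual parameters from live readings."""
        frame = {}
        for state in self.states:
            reading = measurements.get(state.source, 0.0)
            frame[state.target] = state.mapper(reading)
        return frame

desk = MixingDesk()
# Cue 1: the singer's amplitude drives the scale of the wall graphics.
desk.trigger(State("amplitude", "scale", lambda v: lerp(0.5, 3.0, v)))
# Cue 2: the same signal fades the shapes from red to green.
desk.trigger(State("amplitude", "shape_color",
                   lambda v: (lerp(1.0, 0.0, v), lerp(0.0, 1.0, v), 0.0)))
# Cue 3: his breath shifts the background from blue toward purple.
desk.trigger(State("breath", "background",
                   lambda v: (lerp(0.0, 0.5, v), 0.0, lerp(1.0, 0.5, v))))

frame = desk.update({"amplitude": 0.5, "breath": 1.0})
print(frame["scale"])       # 1.75
print(frame["background"])  # (0.5, 0.0, 0.5): fully purple at full breath
```

Because each trigger only appends a mapping, cues can pile up through a performance exactly as described: later layers refine the picture without pre-recorded material, and every frame is recomputed from whatever the performer is doing at that moment.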