ILM's Eric Brevig discusses with Ellen Wolff the challenges of matching virtual CG work with live-action cinematography on The Island.
Veteran visual effects supervisor Eric Brevig has worked with first-tier film directors for years: with Steven Spielberg on Hook, which earned Brevig an Oscar nomination; with James Cameron on The Abyss; with Barry Sonnenfeld on Men In Black; and with Paul Verhoeven on Total Recall, which won Brevig a special achievement award from the Motion Picture Academy. This summer, The Island reunites Eric with director Michael Bay. Their last pairing, on Pearl Harbor, earned Brevig an Oscar nomination, and he understands Bay's dynamic shooting style well.
The Island (from DreamWorks/Warner Bros.) is Bay's look into the repercussions of human cloning in a futuristic society. We see the consequences from the point of view of two clones, played by Ewan McGregor and Scarlett Johansson, who are running for their lives. Although the movie was slated for U.S. release on July 22, Brevig was working on the visual effects at Industrial Light & Magic well into the first week of July.
"They moved up the delivery date two weeks before we were done. It was one of those things where we said, 'OK, nobody's going to sleep.' We worked literally day and night through the Fourth of July weekend, and then we were done."
Ellen Wolff: Because you can deliver so many last-minute changes, do people think anything is possible?
Eric Brevig: Yes. I keep saying that I wish someone would fail, although I was worried it was going to be me! We had a giant network meltdown over the Fourth of July weekend. Getting people to retrieve data when everybody's on holiday made me think, 'Ohhhh, this could be the one!'
EW: How many visual effects shots did you create for this film?
EB: Around 400. It was supposed to be 250. And we didn't finish principal photography until mid-March.
EW: A key location in The Island is a clone factory. What was ILM's role in revealing what goes on in that factory?
EB: There's a great continuous shot where we've just seen one of these agnates (which is what the clones are called) pulled out of a plastic bag. These bags are slit in a sort of birthing. They pull out these adult embryos and put them on a gurney. Then the camera starts floating towards this big open window where we see where the clones came from. When we look through the window we're about 100 feet up, looking at this giant cylindrical room filled with these little pods. Then the camera dives down towards them. We see that inside each of them, in various states of completion, are these embryos.
As the shot continues, the camera follows one pod's nutrient tubes through a hole in the wall to another room, where you see clones are being injected with all kinds of vitamins. To do that all in one shot required combining and enhancing pieces of live action that we shot on three different sets. When we shot the birth of the agnates there were dummies all over the ground. But these dummies didn't quite move enough, so in post-production we digitally animated some of their extremities so that when the camera is pushing in on them, they're wriggling. Then at the end of that shot, we transition into a simple 3D model of one of the walls and we fly into that and then come out on the third set where the agnates are receiving their injections.
EW: Did you use CG animation to create the tiny robots that had to crawl around Ewan McGregor's face?
EB: Yes, and thank goodness they were CG, because otherwise Ewan would have been in worse shape than he was. The story point is that Ewan's brain is exhibiting unusual REM cycles, so he is locked into this futuristic kind of dental chair to have a brain scan. A syringe is touched to his cheek and a dozen or two of these pea-sized nano-bots crawl across his face and under his eyelids. It was horrific, and I had to shoot the plates!
But Ewan was a trouper. We did this over and over. We used a computer-controlled rig, like a snorkel rig, to fly the camera right up to his eyes. Then we had a lot of very good animation by Scott Benza, our animation director at ILM. We even put in the burrowing, like a little mole in the ground, when the nano-bots crawled in under Ewan's eyelids. You could see his skin being stretched out. I'm real sensitive about that stuff, and we had to watch those dailies for weeks.
EW: How much of the clone factory itself did ILM create?
EB: The interior set was built, but it has windows that look out to a beautiful ocean and idyllic landscapes beyond, which were plates shot in New Zealand. So even the non-effects shots had these bluescreen windows that we had to track. But the main visual effects contribution was at the beginning of the movie and at the end. At the beginning of the movie, we follow Ewan's character as he climbs into a glass elevator that's on the outside of the building. As he starts to descend, the camera flies back and reveals that they're in an elevator on a building that has other elevators, next to other buildings with elevators. We pull out and establish this whole giant place teeming with clones.
Then when Ewan and Scarlett begin their escape, they run across a catwalk that looks like it's a bridge. But halfway through, they suddenly run through a plasma-like holographic screen. They find themselves on the back side of the screen, and they're in what looks like the inside of Hoover Dam. It's a giant containment bunker area and they're on a maintenance catwalk 800 feet above the ground.
EW: That's when the characters discover that they haven't been living in a real world?
EB: Yes. It's great because we designed the shot so that the audience up to that point wouldn't know what was coming either. So the audience and Ewan's character discover it in the same moment. Then the camera pops back really wide and we see these two tiny figures, which are CG in that shot. We see the entire containment area with this holographic ring around the buildings in the center. It's all seen in one shot.
EW: Once they escape, The Island becomes a non-stop chase movie.
EB: Absolutely. And the pinnacle of that is probably eight to 10 minutes of just chase. I would guess there were just five lines of dialogue during the time when Ewan and Scarlett are on the thing that looks like a flying motorcycle, called the Wasp. They're being chased by bad guys in cars and in helicopters, which are called Whispers.
Ewan's and Scarlett's characters start off on foot and then they jump on the back of a big flatbed truck that's carrying railroad car wheels. They're being shot at by both the Whispers and guys in cars. They release some of these wheels, like giant barbells, and roll them off the back of the truck. So we have some Michael Bay mayhem when the car crashes. Then a bad guy on one of the Wasps shoots the truck driver and he crashes. As they come back to finish off Ewan and Scarlett, Ewan manages to knock the guy off the Wasp, and then he and Scarlett jump on it. He figures out how to fly it, and they're then chased by bad guys on another Wasp down the freeway into downtown L.A. of the future. It has lots of aerial mass transportation: big aerial buses that are somewhere between a gondola and a monorail, high above the ground. Those were all CG.
Downtown is played by both L.A. and Detroit, but both of the cities are futurized with lots of these buses and monorails. We also built futuristic CG buildings and put them in the background behind real buildings, or we added upper levels to existing buildings.
The Wasp chase, as it escalates, becomes a combination of intercuts between stunt people on physical Wasp rigs on wires, which we shot racing down the freeway, bluescreen of the actors on a motion base, and an all-CG version with digital characters. For the wire work, we had to paint out the wires and add flames to the back of the Wasps.
EW: Did you do any face replacement of stunt people?
EB: We didn't do face replacement. In wide shots the stunt people looked enough like the actors, and in Michael Bay's style of photography the camera is shaking! I don't think that in the continuity of the movie, you notice the stunt people at all. There are quick cuts and the stunt doubles are going by fast.
EW: How close does the camera get to the digital versions of Ewan and Scarlett?
EB: We probably have more detail than we need, but you don't know what you will need when you're cyberscanning the actors. The most complicated aspect was their clothing, because for the most part when we see them they're flying along at 80 to 100 mph on what's basically a motorcycle. So you see the simulation of their clothing and her long hair more than their faces. That caught us by surprise, and we put a lot of effort into simulating windblown hair and the leather that their clothes were made out of. That solved the problem of what the audience sees of the digital characters.
EW: ILM has really nailed the simulation of CG cloth ever since Pirates of the Caribbean. Did this movie require any further developments?
EB: I think it's been a natural evolution, where we've built on what we've already done. Pirates didn't have long blond hair blowing in the wind, so we had to spend time to set up and light it and make the dynamics of it work. The cloth sim was pretty well sorted out on Star Wars: Episode III. That's where they took the Pirates base and built on it. We were the lucky beneficiaries of that.
EW: Scarlett Johansson doesn't have experience making visual effects films, but Ewan McGregor must have the process down pat after making the Star Wars prequels.
EB: The first thing I said to him when I met him was: 'I'd like to apologize to you for everything that you've had to go through on Star Wars.' I told him I would try and make this different for him, and I think it was.
EW: So he had the benefit of acting with practical sets instead of the largely virtual sets of Star Wars?
EB: Yes, although he did have to play a scene opposite himself.
EW: You chose not to have a virtual replacement for him in that shot, but instead you filmed two passes with a motion control camera.
EB: Yes. When you have an actor in a dialogue scene with himself and it's someone who can do this kind of work, the results will be so much better if he can be in both sides of the shot. I knew he had the chops to be able to pull it off. What we did to make it something special was that instead of doing it with a lot of cuts, we have these big dolly shots where we do one continuous shot for maybe 45 seconds of him talking and moving and reacting to the other character that's also played by himself.
At one point in this scene, one character played by Ewan grabs the other's wrist. Stuff like that makes you feel like you're advancing the art. It took a lot of rehearsal. We had an excellent photo double. He had what I've decided is an innate skill, like perfect pitch, where you get it and you understand what you have to do. You have to take physical direction on how to adjust your body and move a half-inch to the left, for example. I worked with him so that he could be exactly where the second Ewan character needed to be when we were filming. I gave them a lot of cues. Traditionally, we would film one side of the conversation with the motion control camera locked. Then we'd take the audio out of the other person's dialogue. Then we'd play back the first person's dialogue when Ewan was playing the second part. That was all business as usual.
But then on top of it, we had to have this physical contact between them. So I just worked with them, using the playback audio almost as a choreographic beat track. I would very politely say, 'That was great, but you have to be a half step to the right on the syllable of this word, and you've got to get your elbow down here and move your arm forward.' They were actually able to do it. I think it took two days to get one really long take because of all the technical aspects of it. But when you see it on screen, you don't even pay attention to it.
EW: Did the fact that the camera was dollying help cover any problems?
EB: No, it killed us because we couldn't flip the sync. It was definitely working without a net. We had to do it within a close enough tolerance so that when we did have to flip the sync and rotoscope the photo double out of the background and create a background that didn't have him in it, the perspective still held up.
EW: You get a second unit director credit on The Island. Was this one of the scenes you directed?
EB: No, because this scene had principal actors on a principal set. It's actually a wonderful dramatic moment, because one of Ewan's characters is really playing cat-and-mouse with the other one. It's not a trick shot at all. The stuff that I directed was more action- or effects-related. Or I would shoot Ewan and Scarlett doing something against bluescreen that would be part of a scene that Michael Bay had already shot.
EW: Didn't Michael previously give you the opportunity to do second unit directing on Pearl Harbor?
EB: Yes. I didnt realize it at the time, but apparently he is very hard on his second unit directors because he has such a specific eye for photography and choreography. But somehow I passed the bar.
EW: Was it difficult to match the virtual camera work in your CG shots to Michael Bay's stunt camera choreography?
EB: We definitely had that problem on this show because we had so many cameras. We had a Strata crane in its longest extension mode, and Michael loves to choose Panavision's older anamorphic lenses because he loves all the artifacts and funkiness about them.
The problem for us at ILM was that he was using these on exterior locations where we had to augment the sides of buildings with CG tramways and add additional stories to the buildings. As the camera is craning all over the place, the buildings are literally warping as we cross frame. And it isn't a classic barrel distortion. It was a random fluctuation based on the older lenses and where the elements happened to be on that day. So we chased the geometry of shots for weeks, just trying to figure out why, when we had a matchmove that was flawless at the bottom of the screen, the upper left would be moving to the right.
We finally just came up with a very sophisticated series of lens charts and tests. Every lens that we could get our hands on, we put through this. Then we could diagram, at a certain focal length, how it changes when you rack focus. We had to basically decode, or undistort, the image so we could stick our CG things onto it and then re-distort it after everything was put together.
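The chart-fit, undistort, comp, re-distort loop Brevig describes can be sketched numerically. This is a toy illustration, not ILM's pipeline: the chart measurements are invented, and a single radial polynomial stands in for the per-lens, per-focal-length, per-focus tables he mentions.

```python
import numpy as np

def fit_radial_distortion(ideal_r, observed_r, degree=3):
    """Fit observed radius as a polynomial of ideal radius from chart samples."""
    return np.polyfit(ideal_r, observed_r, degree)

def undistort_r(coeffs, observed_r, iters=20):
    """Invert the fitted polynomial numerically via Newton's method."""
    r = observed_r.copy()
    for _ in range(iters):
        r -= (np.polyval(coeffs, r) - observed_r) / np.polyval(np.polyder(coeffs), r)
    return r

# Hypothetical chart measurements: ideal grid radii vs. where the lens imaged them.
ideal = np.linspace(0.0, 1.0, 20)
observed = ideal * (1.0 + 0.08 * ideal**2)   # made-up mild barrel-like curve

coeffs = fit_radial_distortion(ideal, observed)
recovered = undistort_r(coeffs, observed)    # undistort: back to ideal radii
assert np.allclose(recovered, ideal, atol=1e-6)

# Re-distort after compositing: just re-apply the fitted polynomial.
assert np.allclose(np.polyval(coeffs, recovered), observed, atol=1e-6)
```

CG elements composited onto the undistorted plate line up with straight geometry, and re-applying the fit restores the lens's original character for the final frame.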
EW: Given Michael Bays style of camera movement, did your matchmovers have a lot of work to do on this show?
EB: They did. We had a crew with both matchmove information gatherers and other data gatherers. Basically we blanketed our locations with still photographs and other methods to allow us to forensically recreate exactly the shape of the buildings and so forth. We built several models of places that we'd been, so that when it came down to matchmoving, we could matchmove with some sense of efficiency. And if they didn't sit exactly right on the screen, it was because the lenses were distorting the shot.
We could also use those models when, in a couple of cases, we had to synthesize the background. In one shot, for example, I was shooting from a camera car with a camera crane on the top of it, and I moved up 10 feet. Then later on, we decided that it should be a 30-foot boom up. So we projection-mapped the street that I had filmed onto the model that we made, and we could break the camera loose and continue the boom up and reveal surfaces that hadn't been seen by the camera. We did this with only the usual amount of chaos!
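The projection-mapping trick of continuing a boom reduces to simple pinhole math: cast each filmed pixel's ray onto the scene model to "paint" it, then re-image the painted points from the new, higher camera. Everything below is a deliberate simplification invented for illustration (a straight-down camera, a flat street, unit focal length), not the actual shot setup.

```python
import numpy as np

def ground_point(cam_h, u, v, focal=1.0):
    """Ray from a nadir (straight-down) camera at height cam_h through image
    point (u, v), intersected with the ground plane y = 0."""
    t = cam_h / focal            # ray parameter where the ray hits y = 0
    return np.array([u * t, 0.0, v * t])

def reproject(cam_h, world, focal=1.0):
    """Project a ground point back into a nadir camera at a new height."""
    x, _, z = world
    return np.array([x, z]) * focal / cam_h

# Footage shot with the camera boomed up 10 units: a pixel at (0.2, 0.1)
# paints the ground point it saw onto the street model.
p = ground_point(10.0, 0.2, 0.1)

# Continue the boom to 30 units: the same painted point now lands at
# one third the image radius, exactly as the wider view demands.
uv_new = reproject(30.0, p)
assert np.allclose(uv_new, [0.2 / 3, 0.1 / 3])
```

The same re-imaging step is what lets the virtual camera "break loose" from the filmed move, with the model filling in any surfaces the original camera never saw.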
EW: This movie benefited from the evolution of ILM's Zeno software system. I recall that when you used Zeno back on Pearl Harbor, it referred more to a hard body simulation system. Has it morphed into something gigantic?
EB: It has. I have a hard time even describing it. Basically, what it is is a working environment in which all our various tools (lighting tools, sim tools, etc.) are accessible within one interface. This may be a bad example, but you know how Adobe took its suite of software tools and made them talk to each other? I think it was sort of the same idea. People were having to translate data from one software application to another to go from animation to lighting, to simulation, to camera projection and so forth. Basically, it was time-consuming and cumbersome.
So ILM decided to rewrite the tools so that everything could be done in one environment. That allows an artist to be able to leap over and add a light or do a camera projection while he's still in his animation application. So the idea (and it seems to be very successful) is that you can have more efficiency and authorship over your shot because you can do more. It's all got the same user interface and everything is logically consistent. You don't have to use five different software applications.
One of the other aspects of it that is really exciting is that by doing camera projections onto simple shapes, the digital artist can freely trade back and forth with the 3D CG artist. For example, in our L.A. backgrounds, we might have a building that is modeled in 3D and rendered. But if it suddenly needs some detail added to it, instead of having to go back to 3D, the digital matte painter can take that same model and basically paint on it in the shot. So we have fluidity about being able to solve a problem with the right tool.
EW: Why was Zeno the piece of software that was chosen to become this framework?
EB: I think what happened was that it was a piece of software that was written in-house to solve some problems, and the more we looked around, the more we realized: 'It would be great if we could also do this in Zeno.' In a lot of cases, there wasn't an off-the-shelf solution. So it grew until it became the main working environment as opposed to a tool to solve a specific problem.
EW: Did that give you, as a supervisor, any iterative abilities when you sat with an animator?
EB: I have to travel around and talk to so many people throughout the day that changes usually are made after I leave a workstation. While it's very interactive, nothing happens fast enough! The only time that I would actually be doing that would be when we had a problem that just didn't make sense. Then we'd have to open it up and maybe move the camera around and look at it. That's basically when I get to look under the hood.
I think The Island and War of the Worlds were the first shows to take advantage of this system. As we were doing it, we were troubleshooting it for the next show. It really seems like such a smart way to work. We're able to do more things with fewer people and not lose time or risk communication errors. You don't have to go through four departments to get a shot up and running. One aspect of it is that it allows ILM to work simultaneously as a giant factory and as a boutique.
EW: Does Zeno accommodate software applications that weren't written at ILM?
EB: I know that Maya and RenderMan and other software are somehow within the Zeno environment.
EW: Speaking of RenderMan, what were the challenges of rendering so much material that had to be integrated with Michael Bay's plate photography?
EB: Digital does well with images that are sort of sedate. But everything that we were doing on this show involved cranking it up so that it was visceral. You start to immediately see the flaws in motion blur alone. No one has written software so that the camera can be shaken six times in one frame. We had to solve a lot of those problems. We had to figure out a way to create motion blur, because normal RenderMan motion blur usually has a sampling rate that is just enough to look good. But if you're vibrating the camera at the same time, there's no way that suffices. We had to render things that would cut in and feel like they were part of the same piece. We had to really analyze what happens, for example, when a shaky camera films the flaming exhaust of a flying jet bike. Those are things that nobody actually wrote something to take care of. What an oversight!
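The sampling problem he describes is easy to demonstrate: transform motion blur that only samples the shutter's endpoints and interpolates linearly between them (a classic two-sample setup) sees almost nothing of a shake that oscillates several times within the frame, while dense subframe sampling recovers the full swing. The sinusoidal shake below is a made-up stand-in for the real camera vibration.

```python
import math

def shake(t, cycles=6.0, amp=1.0):
    """Camera shake offset: several full oscillations inside one frame (t in [0, 1])."""
    return amp * math.sin(2 * math.pi * cycles * t)

# Two-sample transform blur interpolates between shutter open and close.
# A 6-cycle shake returns to its start, so the endpoints show no motion:
linear_extent = abs(shake(1.0) - shake(0.0))        # ~0: the blur vanishes

# Densely sampling the subframe path recovers the real swing of the shake:
samples = [shake(i / 256) for i in range(257)]
true_extent = max(samples) - min(samples)           # ~2 * amp

assert linear_extent < 1e-9
assert abs(true_extent - 2.0) < 1e-2
```

In other words, "just enough to look good" sampling assumes motion is roughly linear across the shutter; a camera shaken six times per frame breaks that assumption, and the fix is to evaluate many more time samples per frame.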
Also, with dark vehicles like the ones in this movie, you don't have as many places to hide, because the diffuse aspect of the lighting is not as significant as the reflections and the specular highlights. Because we have a good method of capturing the data on location, we can build the appropriate environments for reflections. It just came down to finding the right balance. What we did was render the main aspects of the shot (diffuse lighting, reflections and things like that) as layers. So we could, in the compositing, balance them and not have to go back and re-render. The renders were expensive and time-consuming, so we didn't want to have to do them more than once.
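Rendering the shot in layers and balancing them in the comp amounts to recombining the passes additively with per-layer gains. A minimal sketch, with hypothetical pixel values standing in for real rendered layers:

```python
import numpy as np

# Hypothetical RGB values for one pixel of a dark vehicle, rendered out as
# separate layers so the comp can rebalance them without a re-render.
diffuse = np.array([0.02, 0.02, 0.03])      # dark paint: diffuse barely registers
specular = np.array([0.30, 0.32, 0.35])
reflection = np.array([0.15, 0.18, 0.22])

def comp(kd=1.0, ks=1.0, kr=1.0):
    """Additive recombination of the layers with per-layer gains."""
    return kd * diffuse + ks * specular + kr * reflection

first_pass = comp()                          # unit gains reproduce the render
rebalanced = comp(ks=0.7, kr=1.3)            # tame speculars, push reflections

assert np.allclose(first_pass, diffuse + specular + reflection)
assert not np.allclose(rebalanced, first_pass)
```

Each gain tweak is a cheap compositing operation instead of another expensive render, which is exactly the trade the layered workflow buys.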
EW: Even though you worked out of ILM's facility in Marin County, was any of the rendering done remotely at ILM's new facility in San Francisco?
EB: Yes. I believe that either all of it or half of it was done across the Golden Gate Bridge in the new facility. It's just the people who haven't moved over there yet.
EW: So we can imagine all these hermetically sealed machines, crunching data in the dark? That sounds like a sci-fi scene itself.
EB: Exactly. What if the doors were locked when we got there, and they didn't need us anymore!
EW: Looking back, what do you think was the biggest challenge on The Island?
EB: More than anything else, it was the incredibly short time frame. It would have been impossible to do something like this a year ago and also be able to improve the quality levels of the work. Michael's standards are so high for the imagery in the film. To meet or exceed those, and do it in about a third the time of a normal post-production, is quite an achievement. And it's something that we should never try to do again!
Ellen Wolff is a southern California-based writer whose articles have appeared in publications such as Daily Variety, Millimeter, Animation Magazine, Video Systems and the website CreativePlanet.com. Her areas of special interest are computer animation and digital visual effects.