
The Oscars: ILM Talks 'Star Trek'

ILM discusses going where no Star Trek has gone before.


"Space, the final frontier," looks more like the real thing, thanks to ILM. All Images courtesy of Paramount Pictures.

Industrial Light & Magic more than raised the bar for J.J. Abrams' successful Star Trek reboot, blending the real and the virtual in a very creative way to make a more believable movie in space and offering a number of advancements in simulation and lighting, which are discussed below by Visual Effects Supervisors Roger Guyett and Russell Earl and Animation Supervisor Paul Kavanagh.

Bill Desowitz: Let's first recap some of the new tools you created for Star Trek and their impact.

Roger Guyett: As a broad overview, I would say that the project had all sorts of different challenges because it had such a broad spectrum of work. Most specifically, we did a lot of stuff with virtual pyrotechnics to overcome the issues of doing pyrotechnics in space and dealing with non-terrestrial gravity. And the creative aspect of being able to place explosions in a movie like this, which has that level of space battle and combat, was a lot of fun for us to do.
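To make the zero-gravity point concrete, here is a minimal Python sketch, purely illustrative and not drawn from ILM's tools: the same burst of debris is integrated once under Earth gravity and once with gravity switched off, and only the terrestrial version arcs back toward the ground.

```python
import random

# Illustrative sketch only (not ILM's pyrotechnics tools): the same explosion
# burst integrated with Earth gravity and with zero gravity. In space the
# debris never arcs back down; it just keeps expanding outward.

def spawn_burst(n, speed=10.0, seed=1):
    """Emit n particles from the origin with random outward velocities."""
    rng = random.Random(seed)
    particles = []
    for _ in range(n):
        v = [rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1)]
        mag = max(1e-6, sum(c * c for c in v) ** 0.5)
        particles.append({"pos": [0.0, 0.0, 0.0],
                          "vel": [c / mag * speed for c in v]})
    return particles

def step(particles, dt, gravity):
    """Forward-Euler integration of every particle under a constant gravity vector."""
    for p in particles:
        for axis in range(3):
            p["vel"][axis] += gravity[axis] * dt
            p["pos"][axis] += p["vel"][axis] * dt

if __name__ == "__main__":
    earth, space = spawn_burst(100), spawn_burst(100)
    for _ in range(120):                       # about 5 seconds at 24 fps
        step(earth, 1 / 24, (0.0, -9.8, 0.0))  # terrestrial pull
        step(space, 1 / 24, (0.0, 0.0, 0.0))   # no gravity: debris keeps coasting
    print("earth mean height:", sum(p["pos"][1] for p in earth) / 100)
    print("space mean height:", sum(p["pos"][1] for p in space) / 100)
```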

BD: Audiences expect more realism.

RG: Right. And we're constantly looking at the way that we've solved those problems in the past. If you look at Star Wars, they were just filming elements and compositing those into shots. But they're filming the elements in the real world, which has gravity, and what we were trying to do is overcome -- or at least respect -- the physics of real space. And that was certainly a big achievement on the show. The lighting style was also a big departure for us, and Russell can talk to you about the destruction.

Russell Earl: That was one of the other things that we knew we had to deal with from early on, including the destruction of planet Vulcan. For that we knew we were going to have massive landscapes: we were going to be on the surface, and then there were shots in space where we were destroying full planets. So we worked with a proprietary tool here -- Fracture -- that we used to break up the surface of the planet or the terrain into chunks of varying gross sizes, and those could be run through our physical simulation engine to then break and destroy and emit secondary particles of dust and debris. We had to figure out an efficient way to do that at a variety of scales. In addition, a lot of the destruction was caused by our Black Hole, which was another big simulation.
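As a rough idea of that break-up-then-simulate workflow (the code below is a hypothetical sketch, not the proprietary Fracture tool), a surface can be pre-shattered into chunks of varying gross size, each chunk handed to a simple physics step, and secondary dust emitted as the pieces accelerate:

```python
import random

# Hypothetical sketch of the general workflow described above, not Fracture:
# pre-shatter a surface into chunks, run a simple physics step on the chunks,
# and emit secondary dust particles as they break away.

random.seed(0)

def shatter(width, depth, min_size=1, max_size=4):
    """Split a width x depth surface grid into rectangular chunks of random gross sizes."""
    chunks, x = [], 0
    while x < width:
        w = min(random.randint(min_size, max_size), width - x)
        z = 0
        while z < depth:
            d = min(random.randint(min_size, max_size), depth - z)
            chunks.append({"center": [x + w / 2.0, 0.0, z + d / 2.0],
                           "size": (w, d), "vel": [0.0, 0.0, 0.0]})
            z += d
        x += w
    return chunks

def simulate(chunks, frames, dt=1 / 24.0, pull=9.8, dust_rate=3):
    """Pull every chunk downward (a stand-in for the collapse) and emit dust."""
    dust = []
    for _ in range(frames):
        for c in chunks:
            c["vel"][1] -= pull * dt           # crude collapse force
            for axis in range(3):
                c["center"][axis] += c["vel"][axis] * dt
            if abs(c["vel"][1]) > 2.0:         # fast-moving chunks shed debris
                for _ in range(dust_rate):
                    dust.append([c["center"][0] + random.uniform(-0.5, 0.5),
                                 c["center"][1],
                                 c["center"][2] + random.uniform(-0.5, 0.5)])
    return dust

if __name__ == "__main__":
    chunks = shatter(20, 20)
    dust = simulate(chunks, frames=48)
    print(len(chunks), "chunks,", len(dust), "secondary dust particles")
```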

Virtual pyrotechnics allowed for more creativity.

BD: You developed a new procedural rendering of particles and curves for that?

RE: Yeah, and we had to direct it: think about how this Black Hole would cause the destruction and how everything would eventually collapse upon itself, while at the same time making it exciting and visually interesting.
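One simple way to picture a directable collapse like that, sketched here in Python with invented names rather than the actual production setup, is an attractor whose strength is keyed by the artist over time, so the debris can be timed to fall in when the shot needs it to:

```python
import math, random

# Hypothetical sketch of a directable attractor: every piece of debris is
# pulled toward the black hole, with an artist-controlled strength curve so
# the collapse can be timed to the edit rather than to strict physics.

def attract(particles, center, frame, strength_curve, dt=1 / 24.0):
    """Pull particles toward center; strength_curve(frame) lets an artist key the force over time."""
    s = strength_curve(frame)
    for p in particles:
        d = [c - x for c, x in zip(center, p["pos"])]
        dist = max(0.1, math.sqrt(sum(c * c for c in d)))
        for axis in range(3):
            # Inverse-square pull scaled by the keyed strength.
            p["vel"][axis] += s * d[axis] / (dist ** 3) * dt
            p["pos"][axis] += p["vel"][axis] * dt

if __name__ == "__main__":
    rng = random.Random(2)
    debris = [{"pos": [rng.uniform(-50, 50) for _ in range(3)],
               "vel": [0.0, 0.0, 0.0]} for _ in range(500)]
    ramp = lambda f: 2000.0 * min(1.0, f / 48.0)   # force ramps up over 2 seconds
    for frame in range(96):
        attract(debris, center=(0.0, 0.0, 0.0), frame=frame, strength_curve=ramp)
    mean_dist = sum(math.sqrt(sum(c * c for c in p["pos"])) for p in debris) / len(debris)
    print("mean distance to the hole after 4s:", round(mean_dist, 2))
```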

RG: We used a tool that they developed called Scripted Millions because we were dealing with bigger and bigger sets of data, which we needed to manage. We knew we were going to be in the same kind of zone as The Maelstrom, so we wanted to have a better way of handling that information and also to look at using different rendering techniques to develop streaks of light. But another big aspect was the shot design and previs.
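The streaks-of-light idea can be illustrated with a small sketch (the function names below are invented for illustration and are not the Scripted Millions API): each cached particle is extruded along its velocity into a short curve, and the curves are generated lazily so an enormous cache never has to be held in memory at once.

```python
# Illustrative only: turn a huge particle cache into light streaks by giving
# each particle a short curve along its velocity, so the renderer can draw
# motion-blurred ribbons instead of millions of individual dots.

def particle_to_streak(pos, vel, shutter=0.5, segments=4):
    """Return a polyline following the particle's motion over the shutter interval."""
    points = []
    for i in range(segments + 1):
        t = shutter * i / segments
        points.append(tuple(p + v * t for p, v in zip(pos, vel)))
    return points

def streaks_from_cache(cache, shutter=0.5):
    """Generate streaks lazily so a very large cache never sits in memory at once."""
    for pos, vel in cache:
        yield particle_to_streak(pos, vel, shutter)

if __name__ == "__main__":
    # Tiny stand-in for a multi-million point cache: (position, velocity) pairs.
    cache = [((0.0, 0.0, 0.0), (4.0, 1.0, 0.0)),
             ((2.0, 5.0, -1.0), (-3.0, 0.0, 2.0))]
    for streak in streaks_from_cache(cache):
        print(streak)
```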

Paul Kavanagh: J.J. asked if we could have a go at doing some of the shot design as well as the previs, which is really cool when that kind of thing happens at the studio and we can get in there and put our stamp on it. So Roger, Paul and I would get together and talk about the shots and what we could do to make them interesting and tell the story more clearly. And toward the end, we got a group of 17 animators together when we were doing the third act and we would start prevising this stuff in Maya.

RG: We were also using some camera tools to give an interactive feel to the way the cameras would operate.

Fracture provided the destruction of Vulcan.

PK: Yeah, we wanted to match J.J.'s shots: we didn't want to cut from the interior live action, where J.J. had been sitting in back of the cameraman and tapping the magazine. He got this crazy, hand-held look on the live-action stuff and we wanted to match that. We didn't want it to feel like when you went outside to an all-CG shot that it was a totally different feel -- we wanted that same hand-held feel so it actually felt like J.J. shot it himself. We developed some techniques to do that with little motion-capture sensors (orientation sensors mounted on tripods), which we could tap and knock and have that fed directly into Zeno, to capture it live on top of the actual gross camera move that we laid out. That worked very well, plus we were animating that stuff as well on separate layers, on a camera car that we developed to be very user-friendly for the animators.
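A minimal sketch of that layering idea, with invented names and synthesized noise standing in for the recorded sensor data rather than ILM's Zeno setup, keeps the smooth gross camera move on one layer and adds the hand-held jitter on top without touching the base animation:

```python
import math

# Hypothetical sketch of layered camera animation: a smooth base move is
# evaluated per frame, and a separate hand-held layer (here synthesized with
# sine noise; in production, captured jitter) is added on top of it.

def base_move(frame):
    """Smooth dolly: the camera glides along X."""
    return {"tx": frame * 0.2, "ty": 0.0, "tz": 0.0,
            "rx": 0.0, "ry": 0.0, "rz": 0.0}

def handheld_layer(frame, amp=0.4):
    """Small pseudo-random rotations standing in for the recorded sensor jitter."""
    return {"rx": amp * math.sin(frame * 1.7) * math.sin(frame * 0.31),
            "ry": amp * math.sin(frame * 2.3 + 1.0),
            "rz": amp * 0.3 * math.sin(frame * 3.1 + 2.0)}

def composite_camera(frame):
    """Sum the layers: translations come from the base, rotations get the jitter added."""
    cam = base_move(frame)
    for channel, value in handheld_layer(frame).items():
        cam[channel] += value
    return cam

if __name__ == "__main__":
    for f in (0, 12, 24):
        print(f, composite_camera(f))
```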

You know, we could previs stuff pretty fast because we were using our own assets that we developed for the movie. Then, once the shot was bought off on, there was just a little cleanup and it went right through our pipeline into rendering. So it wasn't the normal process where you do previs and it's just a visual thing, and another company takes that previs and just copies it again. We would actually do the previs as if we were doing the shot, and so the shots came together pretty quickly.

RG: It's just a more efficient way of working.

PK: And we built a team that wasn't just animators: we took layout people who had a lot of experience with camera work and brought them into this one group, and one person would be responsible for one shot or a sequence of shots and they would do everything. There was a lot of cross-over there and they really got a kick out of that. They really felt like they were contributing to the creative aspect of the moviemaking process.

RE: Paul and his guys also spent a lot of time setting up the lighting for the shots because the dramatic lighting was such a large component.

PK: Roger came in after talking to J.J. about doing a strong single-source look.

Dramatic lighting was also a strong component.

RG: And that was the emotional value of using a lot of darkness, which represented the concept of so much unknown out there in space, and the idea that you were going on this journey with the crew. As a viewer, I think the movie is very immersive in the sense that you get to ride along with this new bunch of people who are having this experience of traveling off into space, and part of that, clearly, is a sense of danger and jeopardy. We were really trying to match that very strong, single-source, high-contrast lighting that you'd see in those old Apollo photographs. And that approach is now attainable for us in CG: we can use a single-source area light, point it at an object, and it will fill in some of the bounce effect for you, so you end up with a nicer and more realistic result.
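A toy version of that single-source approach, written here as an illustrative sketch rather than any renderer's actual API, samples one large area light for a soft, high-contrast key and adds only a small bounce term to fill the shadow side:

```python
import math, random

# Illustrative sketch of single-source lighting: one large area light is
# sampled many times for a soft key, and a small constant bounce term fills
# the shadow side instead of adding a second light.

def normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def shade(point, normal, light_center, light_size, samples=64, bounce=0.05):
    """Lambert shading from a square area light plus a constant bounce fill."""
    rng = random.Random(7)
    direct = 0.0
    for _ in range(samples):
        # Jitter a sample position across the area light for a soft falloff.
        sample = (light_center[0] + rng.uniform(-light_size, light_size),
                  light_center[1] + rng.uniform(-light_size, light_size),
                  light_center[2])
        l = normalize(tuple(s - p for s, p in zip(sample, point)))
        direct += max(0.0, sum(a * b for a, b in zip(normal, l)))
    return direct / samples + bounce   # key light plus gentle bounce fill

if __name__ == "__main__":
    lit    = shade((0, 0, 0), normalize((0.3, 0.2, 1.0)), (0, 0, 10), 4.0)
    shadow = shade((0, 0, 0), normalize((0.0, 0.0, -1.0)), (0, 0, 10), 4.0)
    print("lit side:", round(lit, 3), " shadow side:", round(shadow, 3))
```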

Bill Desowitz is senior editor of AWN & VFXWorld.
