
'Revenge of the Sith': Part 2 — Digital Environments Strike Back

In honor of the last Star Wars movie, Episode III: Revenge of the Sith, VFXWorld continues its three-part series with an in-depth look at ILM's ramping up of digital environments.

George Lucas wanted Episode III to show off more of the worlds of the Star Wars universe. All images © & ™ Lucasfilm Ltd. All rights reserved. Digital work by ILM.


Since George Lucas was planning all along for Revenge of the Sith to travel to more places and more new worlds, the mandate was clear: he wanted a lot more digital environments, and he wanted them to be more ambitious, more complex and more photoreal. That not only necessitated ramping up the digital matte department from 12 artists to 34 (supervised by Jonathan Harb), but also overhauling the pipeline to expand the creativity and efficiency of all the 3D artists.

This was achieved through implementation of the new Zeno package. Rather than training artists in their own specialties on separate tools, ILM funnels a dozen or so programs (including simulations, lighting and modeling) onto a single platform with a common user interface. This allows four or five options for any given shot and ties various off-the-shelf packages into different rendering choices, resulting in greater flexibility for a massive quantity of shots and making it easier for the TDs and artists to have it all together in one package.

There are around 50 environments overall in Revenge of the Sith, ranging from the volcano planet Mustafar, where Obi-Wan and Anakin have their long-awaited lightsaber duel, to the tropical, Vietnam-inspired Wookiee planet, Kashyyyk, where the large furry creatures battle Clones, to the sinkhole planet Utapau (some of which is completely CG), whose round sinkhole is about two miles in diameter and five miles deep, where Obi-Wan takes on the latest CG villain, Droid Leader General Grievous. Plus there is a host of other exotic planets such as Felucia and the more familiar Coruscant, Alderaan, Naboo and Tatooine, which are seen literally in a whole new light, thanks to advancements in global illumination. And there are a multitude of environmental establishing shots that capture a range of aesthetic and emotional moods.

From the greenscreen stage to the lava flows of Mustafar.


"Very often George's vision of a place is something that you can't shoot or build as a complete set, but that you can build as fragments," explains visual effects supervisor John Knoll, who has written a book about Star Wars digital environments, to be published in the fall by Abrams. "We've used different techniques to build these environments, sometimes matte paintings, sometimes miniatures and sometimes computer graphics. We've pushed the CG end further on this picture mainly for efficiency. One of the important things to do is global illumination. CG traditionally handles surface illumination. But interior environments are completely dominated by indirect light. Those kinds of effects can be simulated through the technique of radiosity. We've started incorporating that into our methodology here so we can achieve more realistic environments. With faster machines, radiosity became a design tool early on in Sith. We also used mental ray, and for mattes we used SplutterFish's Brazil and 3ds Max."
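The indirect-light effect Knoll describes is captured by the classic radiosity equation. The sketch below is a toy, textbook-style solver, not ILM's renderer; the two-patch scene, reflectance values and form factors are all invented for illustration.

```python
# A toy radiosity solve sketched from the textbook equation (not ILM's
# actual renderer): each patch's radiosity B is its own emission E plus
# reflected light gathered from every other patch,
#     B_i = E_i + reflectance_i * sum_j F_ij * B_j
# where F_ij is a (here invented) form factor from patch i to patch j.

def solve_radiosity(emission, reflectance, form_factors, iterations=50):
    """Jacobi-style iteration of the radiosity equation."""
    n = len(emission)
    B = list(emission)  # start with direct emission only
    for _ in range(iterations):
        B = [emission[i] + reflectance[i] *
             sum(form_factors[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Two facing patches: patch 0 is a light source, patch 1 emits nothing
# and ends up lit purely by bounce light, the "indirect" effect above.
emission = [1.0, 0.0]
reflectance = [0.5, 0.5]
form_factors = [[0.0, 0.4],
                [0.4, 0.0]]

B = solve_radiosity(emission, reflectance, form_factors)
print(B[1])  # nonzero: the unlit patch receives indirect light
```

Surface-only ("local") illumination would leave patch 1 black; the bounce term is what makes interiors read as real.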

Mustafar, supervised by Roger Guyett, like most of the movie, consists of a combination of digital environments, digital and practical models and CG architecture. "We divided the geography into three sections: three major miniatures, and those joined together for a continuous piece [utilizing real footage of an erupting Mt. Etna]," Guyett suggests. "The valley with this lava river was done as CG pieces or practically. Smoke was done separately. Our version of lava was the food-processing element, methocel." Brian Gernand built the largest miniature of the rocky canyon in Star Wars history. Lava is an element photographed on a miniature model as well as CG.

"To me, the simulation of real lava here is amazing, and you have to go to the dark side of the computer to achieve it. The hard thing is that each shot has so many elements to give you a textured environment. We did 10 different passes of the plates. It is such an endurance test that you might as well be there sweating it out. It took eight months to work on this sequence: the longest miniature shoot in ILM's history, with lots of CG and compositing work. The R&D alone in figuring out how the embers go in the air was hard work. Doing organic elements in Twister and A Perfect Storm certainly paved the way for this smoke and lava. Having the ability to create more photoreal scenes, you can solve certain problems, but they always want to ramp it up with more elements as the technology gets better and better."

The most difficult personal challenge for Guyett was actually in determining the speed of the lava flow. Initially, Lucas figured it should be slow, but after conducting his own tests, Guyett decided that it should move quickly. "We looked at real lava and some of it looked like water and some of it looked like sludge," he adds. "I saw the first test and thought it should be like rapids. I wanted to add jeopardy, to make it more dynamic. I almost lost my job, but while George was away I committed to this faster lava speed and told the guys this is what looks best. So I came back after shooting lava for three weeks (we must've spent $1 million), and I showed George this shot and he said, 'You're absolutely right,' as my heart was in my mouth."

As for Kashyyyk, it was constructed from multiple models that they photographed, including Thai islands and Chinese mountains. The sky is fabricated; the lagoon areas are composed of digital matte paintings and different plates from various parts of the world. Meanwhile, the beach area is fabricated based on different types of trees, a combination of miniatures and CG. Pontoons, aircraft, Clones and Wookiees in the background are CG.

"The view of these mountains doesn't actually exist; they're individually cut and pasted," Guyett continues. "It was a big challenge in scale and organization. There are so many different elements drawn from so many different sources that you have to really know where you're going. Our plan was to do our version of Saving Private Ryan, which I worked on [as co-supervisor], and to take this tropical environment and make it more muted and overcast. It's a war-torn world that also contains a little bit of Apocalypse Now. We wanted to push it into a harder reality of Wookiee D-Day. If you had told me five years ago that we were going to do a close-up water surface in CG, and we're going to have all these other elements, I would've said that it's too hard to pull off."

The Wookiees prepare for their own D-Day invasion.


"If you looked at George's original animatics, he essentially has the shot, but I've got to figure out that this foreground explosion has to be this big, this aircraft has to fly down over here and is just going to miss this thing over here and divert over there, and at the end of the shot this aircraft comes in and starts firing. It's similar to a live-action shoot. You'd rehearse it and shoot it and then tweak it. It took six weeks to pull off: huge matte paintings, CG water, photographed individual elements, digital and real smoke created on stage. Why, the Wookiees alone took 1,000 gigabytes of storage."

As for the matte department, it pushed digital environments to new heights at ILM. Not only were artists on staff retrained but also others were recruited from colleges in California, North Carolina, Florida, Ohio, London and New Zealand, and via the Internet in Canada, Japan, Sweden, Africa and Australia. The matte artists used Max, Maya, XSI and Cinema 4D and composited 2D elements into 3D using After Effects, Shake and their own proprietary tool.

"We were robust and capable of delivering whatever George came up with," Harb offers. "He delivered a concept and a look for a world along with an animatic. A good example of that was the outside of the Opera House. This is one of those environments where all we had was a camera move in the computer, so we came up with pretty detailed 3D models, textures, traffic signs and people. The inside of the Opera House is nearly entirely virtual too."

Meanwhile, the digital establishing shots are stunning. "Utapau establishing shots start out as simple geometry with these wavy shapes," Harb points out. "There was also a big practical model built of the sinkhole. [The artist] went over to the model shop, took some digital stills and that afternoon had the sinkhole integrated."

The establishing shots of the fiery world of Mustafar.


The Mustafar establishing shots ran for months, and everyone finessed the look of this place a lot. One artist completed one shot with all of these little animated elements: every single splash, every single little puff of smoke. He then placed it onto a card and manipulated it to coincide with whatever else was happening at different paces: the waterfalls, the topography, the erupting volcanoes and their moving smoke, the complex itself.

With Mustafar, they sometimes used the principle of height fields: From above, the tops of the peaks are bright white and the plains are black, and everything in between is gray. The artist goes through and sculpts this entire environment, and what's neat about this environment is that there's a lot of computer graphics. You're going to build this rock model in a traditional CG approach, put pictures on this model, turn the light on in the computer and then render several passes. And it will then be composited with other elements shot somewhere else.

Another, more direct approach is to not even turn the light on in the computer: just paint the lava or waterfalls, project it from the point of view of the camera, stick it to geometry and then render it all together. What's great about global illumination is that they don't use it in a typical way in the matte department. What they want to know from global illumination is: what is the quality of the light in a building? What is the quality of the highlight like? What is the quality of that transition across its face like? But they don't want to render it for every frame because it's very processing intensive. They'll render one frame that is representative of what they want.
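That "project it from the point of view of the camera" step is classic camera projection mapping. As a rough illustration (a simple pinhole model with invented focal length and resolution, not ILM's pipeline), each point on the geometry can be mapped back to a pixel of the painting:

```python
# A pinhole-camera sketch of camera projection mapping: for each point
# on the geometry (in camera space), find which pixel of the matte
# painting lands on it when projected from the camera's point of view.
# Focal length and painting resolution are invented for illustration.

def project_to_painting(point, focal=1.0, width=1024, height=1024):
    """Map a 3D point in camera space (z > 0 in front of the lens)
    to pixel coordinates in the projected matte painting."""
    x, y, z = point
    if z <= 0:
        return None                      # behind the camera: no paint lands here
    u = (x * focal / z) * 0.5 + 0.5      # perspective divide, then remap to [0, 1]
    v = (y * focal / z) * 0.5 + 0.5
    return (u * (width - 1), v * (height - 1))

# A point dead ahead of the lens maps to the center of the painting.
print(project_to_painting((0.0, 0.0, 10.0)))  # (511.5, 511.5)
```

Because the painting is "stuck" to the geometry this way, the projection holds up as long as the shot camera stays near the projection camera, which is why it suits locked-off or gently moving establishing shots.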

For digital matte artist Yanick Dusseault, the challenge was not only to create more beautiful paintings but also to create the same environments in different ways. "I just try to understand what makes an image so interesting and reproduce it, copying what had been done before in terms of architecture and landscape. Naboo is a good example. You play with light differently. Making it backlit changes the mood, makes it more dramatic, more beautiful, which is important when you have so many different planets and environments flying by you so quickly."


Bill Desowitz is editor of VFXWorld.


Bill Desowitz, former editor of VFXWorld, is currently the Crafts Editor of IndieWire.
