Meet the 'Sky Captain' Visionary: Q&A with Kerry Conran

Bill Desowitz discusses the groundbreaking Sky Captain and the World of Tomorrow with first-time director Kerry Conran.

Kerry Conran and his world of tomorrow. All images & © 2004 Paramount Pictures. All rights reserved.

Kerry Conran has been intrigued with the blending of animation and live action ever since he was a film student at CalArts in the late '80s. Haunted by an incomplete student project along those lines, he was determined to realize his dream with an action-adventure feature that paid homage to the Fleischer Superman cartoons and movie serials of the '30s. So he experimented for several years on a Mac in his Sherman Oaks apartment with a six-minute short that would eventually serve as the basis for Sky Captain and the World of Tomorrow, the pulp sci-fi actioner with killer robots starring Jude Law, Gwyneth Paltrow and Angelina Jolie. The $70 million-plus indie pickup by Paramount Pictures, financed by Filmauro and producer Jon Avnet, opens Sept. 17 after much fanfare as the most elaborate bluescreen-shot and composited feature ever made. Conran recently talked with VFXWorld about the challenges of making Sky Captain as well as his plans for his second feature, Edgar Rice Burroughs' A Princess of Mars.

Bill Desowitz: The obvious question is why make Sky Captain this way?

Kerry Conran: The real reason was that it was the only way possible to make a film with limited resources [in our Van Nuys studio]. And where it started and what it evolved into was slightly different. But it resulted in enormous savings in both expense and time for the principal photography. When I got out of film school, I was looking for a way to make an independent film that had some scope to it, and around that time I was looking at the tools that were available. It wasn't more than a year or so since the first color Macintosh had been introduced, and it was well before the modern effects film had come along: The Phantom Menace, The Matrix and The Lord of the Rings. The idea of doing something wholesale like this was at the time prohibitively expensive for the effects community or for Hollywood. Strangely, I think, if they had tried to enlist the services of ILM or somebody to do what I was proposing, it would've been impossible. But for someone to take those same tools and do it on their own, it certainly was possible. So it took four years of developing a method that borrowed the techniques of 2D cel animation, and I tried to use compositing to essentially facilitate the same effect in live action.

BD: What did you use for software in the beginning?

KC: At the time, there was a piece of software that had just come out called After Effects. And to me it was analogous to the animation stand or optical printer in a way, and it just suggested live action in a way that wasn't possible in the past. I just knew that if I wanted to shoot in New York City or the Himalayas, I could do it without ever leaving my apartment, essentially, and go for a more stylized look that I could achieve with some measure of production value on my own. And that's really where it all began.

BD: What were the major adjustments when you finally went into production on the feature in Van Nuys?

KC: It became more ambitious in scale. It went from a black-and-white film to a color film. It went from a film that I was going to shoot entirely in front of a bluescreen to a soundstage in London at Elstree, where Star Wars was made. It was a film that originally wasn't intended to have any stars. But at heart, at core, nothing really changed in terms of how we were going to do it. It was just a little more than what I had started with. Principally, it was founded on and executed with the same spirit as the short film.

Previs in all its forms helped make this complex visual adventure come true.

BD: And the production process consisted of 2D storyboarding, 3D animatics, principal photography and compositing.

KC: Obviously from the script, I always like to storyboard, and it does give the animators some place to start. So we storyboarded every frame of the movie, but we would stagger it, so that I would only do a sequence in storyboard and hand that off to the animators to begin animatics while other sequences were still being storyboarded. Once a sequence was storyboarded, we would have concurrently been designing the sets for the purpose of animatics and begin the process of interpreting the storyboards into an animated dimensional form. We did so with the benefit of having recorded the actors' voices, much like an animated film. It did help suggest timing and pace and all those sorts of things. With the animatics, I pretty much used friends as stand-ins at a smaller bluescreen stage at the facility in Van Nuys. We began to shoot a crude test version of the film with stand-ins, but nonetheless it was a live-action version. So we began to interpret the animatics, and it was through that process that we learned of potential problems and worked out the bugs, so that when it was time to shoot the film in London, we had a pretty good system worked out, and we had essentially made the film over three times. So there was very little doubt or chance that we had missed something.

BD: One of those problems had to do with the focal length...

KC: That's correct. We found that out during our stand-in version, which we called our Tech Shoot. We were consistently off by 10 inches from what the animatic measurements were to live action. We couldn't translate those. For wide shots it made no difference, but for close-ups it was significant. We discovered [during separate field tests] that Maya is set up for film and measures from the focal plane, whereas the HD camera measures from the tip of the lens, and the lens we were using was 10 inches long. It wasn't a huge issue, but it was an error that we couldn't account for, which could have a ripple effect. Fortunately, this prep work spared us in London. We only had 26 days with the principal cast, and that meant that nothing could go wrong.
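The mismatch Conran describes is a simple difference of measurement conventions, and the correction amounts to adding or subtracting the lens length. A minimal sketch (the function names and sample distances are hypothetical; only the 10-inch lens length and the focal-plane vs. lens-tip conventions come from the interview):

```python
# Maya measures subject distance from the focal plane; the HD camera
# they used measured from the tip of the lens, which was 10 inches long.
LENS_LENGTH_IN = 10.0  # from the interview

def maya_to_set_distance(maya_distance_in):
    """Convert a Maya focal-plane distance into the tape-measure
    distance from the lens tip used on the live-action set."""
    return maya_distance_in - LENS_LENGTH_IN

def set_to_maya_distance(set_distance_in):
    """Convert a lens-tip measurement back into Maya's convention."""
    return set_distance_in + LENS_LENGTH_IN

# A close-up planned at 40 inches in Maya should be taped at 30 inches
# from the lens tip; at a 600-inch wide shot the same 10-inch error is
# negligible, which is why the problem only showed up in close-ups.
print(maya_to_set_distance(40))  # 30.0
print(set_to_maya_distance(30))  # 40.0
```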

BD: Prior to that you developed a virtual map?

KC: Yes, for the process of translating the animatics to live action, we essentially built Elstree Studios inside the computer and labeled the ground and walls with little grids that had corresponding markers. It was like a big, elaborate version of Battleship. And we translated that grid system to the physical set in London, and wherever the camera and actors were in our virtual set, we'd simply look at the marks on the floor in the real set and know our camera belonged on G1, and if the actors raced across JH to K7 they'd fall exactly in line with the backgrounds and framing and compositions we had created in animatic form.
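The Battleship-style lookup reduces to a coordinate-to-label mapping. A hedged sketch (cell size, units and function name are assumptions; only the lettered-and-numbered floor marks like "G1" come from the interview):

```python
import string

# Hypothetical version of the grid Conran describes: the virtual stage
# floor is divided into cells labeled by a letter (columns) and a
# number (rows). The 3-foot cell size is an assumption.
CELL_SIZE_FT = 3.0

def grid_label(x_ft, y_ft):
    """Map a virtual-set floor position to its physical mark, e.g. 'G1',
    so the London crew can place cameras and actors by label alone."""
    col = int(x_ft // CELL_SIZE_FT)           # 0 -> 'A', 6 -> 'G', ...
    row = int(y_ft // CELL_SIZE_FT) + 1       # rows numbered from 1
    return f"{string.ascii_uppercase[col]}{row}"

# If the animatic puts the camera 18.5 ft along the stage and 1 ft in,
# the crew just sets the camera on the floor mark 'G1'.
print(grid_label(18.5, 1.0))  # G1
```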

BD: And then you went into the animation and compositing.

The tight shooting schedule made it crucial to have shots planned out when filming on the bluescreen stage.

KC: Correct. Once we had photographed the live action in London, we sent that back to Van Nuys and began editing almost immediately. Animation began even as we were shooting in London, with modeling and texturing. A lot of the shots that didn't involve live action directly were begun. Any scenes where backgrounds required photographs to be manipulated as a backdrop, that work began as well. Once the live action was brought back, generally the keys were made first, so we had a pass for the compositors.

BD: Was there any special multi-plane software made?

KC: We did create several tools within Maya and Shake that were unique to this production, both in that aspect and in the way that we had to get through every single shot. We approached sequences philosophically differently: we didn't approach them as effects sequences. When we had a virtual set, we would complete it like a set, and not as a series of shots. We were then able to set up camera rigs within one project file: if the scene contained 30 shots, we would set up 30 cameras within a single project file. And because the set was essentially pre-lit, the cameras were all good to go; you could hit render as if you had been on a physical location, and you had the benefit of the lighting that was realized for a whole environment. And then, as with any live-action shoot, you would just bring in tiny little accent lights to fill in details or to fill it out.
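The "one pre-lit set, many cameras" workflow can be sketched as a small data model (hypothetical names and structure, not Maya's actual API; only the idea of 30 shot cameras sharing one pre-lit project file comes from the interview):

```python
from dataclasses import dataclass, field

@dataclass
class VirtualSet:
    """One project file: a set lit once, holding a camera per shot."""
    name: str
    lights: list = field(default_factory=list)   # shared, set-wide lighting
    cameras: dict = field(default_factory=dict)  # shot id -> camera setup

    def add_shot_camera(self, shot_id, position, focal_length_mm):
        # Each shot adds only a camera; the lighting is already done.
        self.cameras[shot_id] = (position, focal_length_mm)

    def render_all(self):
        # Every camera "hits render" against the same pre-lit environment,
        # like shooting coverage on a physical location.
        return [f"rendered {self.name}/{shot}" for shot in self.cameras]

ny = VirtualSet("new_york_street", lights=["key_sun", "bounce_fill"])
for i in range(30):  # a 30-shot scene gets 30 cameras in one file
    ny.add_shot_camera(f"shot_{i:03d}", position=(i, 0, 0), focal_length_mm=35)
print(len(ny.render_all()))  # 30
```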

The staff on the picture started out small in Van Nuys.

BD: And what about special lighting techniques?

KC: Our CG lighting director, Michael Sean Foley, and our CG supervisor, David Santiago, developed quite a number of tools for this that were unique and very specific to this production. And that's what also helped with the enormous rendering task that we faced [thanks to a symmetrical lighting plot]. The rendering of New York City alone was staggering, and many, many things had to be sorted out with as many shots as we had to get through.

BD: How large a staff did you have in Van Nuys?

KC: We started out with about 20 to 30 people before we went to London, and I only had four animators who actually created the animatics throughout the entire film. And if you look at the animatics, that was how the film was made. Once we came back from London, we scaled up to about 50 people who did the bulk of the work. I think we were maybe up as high as 80 people.

BD: And where did you recruit most of them?

KC: Well, starting out, Disney was dismantling the Secret Lab, and that's where we managed to get our effects supervisor, Darin Hollings, and our animation supervisor, Steve Yamamoto. And it was really Darin's doing that brought in the other people, just friends of friends, people responding to ads and that sort of thing. But the interesting aspect of the whole production was that we had no money, so we could not afford to bring in experienced people apart from Darin and a couple of [others]. Most of our other crew, animators and compositors, had never really made a film before. This was their first break, but it was a leap of faith on all sides. It was all we could really find and afford. And it was Steve Yamamoto who really conducted school for these kids and taught them to become animators, and it was really an amazing experience. And it was the same for Steve Lawes, who came in as the compositing supervisor. That was probably one of the most rewarding aspects of this: you saw these inexperienced youngsters, not unlike myself, really develop and flourish. Ten years' or 10 films' worth of experience crammed into this one project. The responsibility and the work that they all had. And now half of them have gone to work at Digital Domain, and some went to Pixar and other places. The door was swung wide open for them, and that was one of the single most gratifying aspects of this for me.

BD: The compositing experience is interesting in that both you and Darin come from competing colorization backgrounds.

The film was constructed in black-and-white with color added later.

KC: We did a lot of experimentation in color and toyed with many things because we didn't want to deal with it. But we did composite the entire film in black-and-white until the bitter end, so that the film was made in black-and-white. And we did investigate outsourcing and were really appalled at the results that we got back, so we decided to do it ourselves and develop a different method. We tried to emulate two- and three-strip Technicolor and films like Black Narcissus and the Technicolor [dye transfer printing process]. You can't come close to that, but we were trying for something different and using it as a guide. One of the compositing steps was the two-strip layer, so we actually did create a sort of limiting layer that went in and mimicked some of the steps and evoked that processing. We split off our main compositors, who actually composited the shots, and we developed a separate color department; this was the team that really took the final black-and-white composites and put the color on top of them. They were skilled compositors, but their total function was to do color.

BD: The original plan was to have it all be in black-and-white except for Shangri-La, which was in color to evoke The Wizard of Oz?

KC: That's exactly right, but alas...

BD: The one aesthetic compromise. But you told me at SIGGRAPH that you're hoping to have both color and black-and-white versions on the DVD.

KC: Yeah, we have it. We're ready to go with it. It remains to be seen how to best present it at this point.

In the end, 14 visual effects companies helped bring the explosive world of Sky Captain to life.

BD: You also mentioned at SIGGRAPH that you preferred the black-and-white because the nature of the vintage color looks more dated.

KC: I think so. I think at times it evokes those Victorian hand-painted photographs, and to me that reads as older. If you look at any black-and-white film into the '50s and '60s, when color was available, it was easier to [evoke the past]. For me, it was done [in color] for commercial reasons, to make it more palatable, to modernize it, but, in fact, it may have had a different result.

BD: It must've been very challenging to work with the number of visual effects houses that were outsourced, supervised by Scott Anderson. I counted 13 that contributed nearly 50% of the 2,000 effects. [ILM, Luma Pictures, Rising Sun Pictures, Hybride Technologies, Ring of Fire, SW Digital, Pixel Liberation Front, Pacific Title Digital, Café FX, Engine Room, The Orphanage, R!OT and Digital Backlot.]

KC: We did make the deliberate choice of only handing out entire scenes, so you didn't have one company sharing Nepal with another company. And in some instances that might've only meant the animation; they couldn't do the compositing. And it was really there that we were able to maintain consistency from scene to scene, and with the film we took great measure to work with each of these companies, who really worked hard knowing they had this incredible challenge of mixing and matching. We essentially created keyframes for them that we composited as if we were going to do the scene, so they had a reference point for what it would be like if we had done it. We also gave each company about three shots that we had already done the elements for and asked them to duplicate them, seeing our final result, so they got used to it. And we were able to see very quickly how close they were able to come on their own. And then our guys were in direct contact with them, and it was a really fluid, free exchange of information, so that it would all look like the same movie.

BD: So how close did you come to realizing what was in your imagination?

KC: Well, you know, on the one hand, it's close. Like anyone else, I would've liked more time to perfect certain things, but I'm quite happy. But one of the great things that came out of this that I wasn't even thinking of was the opportunity to work with such amazing artists, and what they brought to the film...

Lessons learned on Sky Captain will be brought onto Conran's next outing, A Princess of Mars.

BD: What next? A Princess of Mars?

KC: Yes.

BD: And how will it be different in approach?

KC: Just the look and feel of the John Carter world is meant to feel a little more realistic.

BD: And you'll be shooting on actual locations...

KC: Right. I think it will be a combination. The things we learned on World of Tomorrow were significant, and this will be something of a hybrid. Instead of 100% of what we did on World of Tomorrow, it could be 50%. But then I think there are certain things that the computer is not best suited for right now when you try to mimic certain kinds of environments, particularly organic ones. The difference is that the onus is not on the John Carter series. It's a different kind of animal. But it will be the beneficiary of what we did on World of Tomorrow in a big way.

BD: How are you adapting it?

KC: We are essentially trying to be as faithful as we can to the Burroughs book. We won't be reinterpreting it but adapting what Burroughs wrote in screenplay form. John Carter is sort of immortal, and where it would begin is approaching modern day, but he is still a Civil War veteran, and all that would be intact. Obviously there are certain things that have to be done to translate it to the sensibilities of today's audience.

The animatics helped the actors understand what they were looking at and interacting with.

BD: I understand that youre looking for a new writer, but, stylistically, this fantasy about Carter being mysteriously transported to Mars must offer a lot of intriguing possibilities for your brother [production designer Kevin Conran] and you.

KC: He will be on the death march with me again. We're really looking to do something that we haven't really seen before: taking the notion of an alien world and embracing it, creating vistas and a backdrop and an environment that will be pretty remarkable.

BD: What about references?

KC: I think it's going to draw upon some of the references in World of Tomorrow; some of the early pulps were pretty amazing in their interpretation of this. I don't think you want quite the dated quality of that, but without a doubt, there will be that sentiment involved to some extent.

BD: You had also mentioned at SIGGRAPH that you were looking around for new tools to help you with previs. Did you find any?

KC: You know, I didn't find anything that I didn't already know about. What was interesting to me for previs and production purposes, for instance, was what NVIDIA is doing with Gelato and these hardware-assisted renderers. I know what we went through, and if you could just whittle some of those times down, it gives you more opportunity to experiment and more opportunity to perfect shots. So I think incorporating potential realtime rendering solutions, particularly for previs, but even possibly for some of the layer work, could really be quite amazing. Given the animated nature of this movie at times, I think that some of the motion performance hardware and software that's available would be taken to an extreme measure here to capture some subtlety and nuance and performance, like the 15-foot-tall, four-armed Tharks, creatures that you would want to have some feeling for [and] very skilled actors to play the part.

BD: Thank goodness for previs, right?

KC: Oh, it's invaluable. Paramount is taking a risk; they don't usually bring in people at this stage, but they recognize the value of it; they recognize the enormous economy that it represents later. So it does require an upfront expenditure that's foreign and scary, but the result is quite significant on the back side of things. I think it is something that is more and more embraced.

Bill Desowitz is editor of VFXWorld.
