
'V' for Awe-Inspiring VFX

Zoic's Andrew Orloff fills us in on the awe and wonder behind V.

Zoic worked on the design of the mother ship early, paying homage to the original and sprucing it up with an alien alloy look. All images courtesy of ABC.

V, a reworking of the 1983 alien invasion series, is now airing Tuesdays on ABC (8/7c) through this month before picking up again next March to complete its 13-episode season. Executive produced by Scott Rosenbaum, Scott Peters, Jace Hall, Steve Pearlman and Jeffrey Bell, the series has its vfx handled by Zoic under the supervision of Andrew Orloff, who spoke to us about how its challenges differ from those of Battlestar Galactica.

Bill Desowitz: So how did you get onboard V?

Andrew Orloff: We were approached early on (early script stage) to help conceptualize it. And it's really a cool project because it's not often that you participate at the beginning of a franchise, though we were able to do that with Battlestar and Serenity. But this was special in that we were really involved in the design, especially the ships and their interiors, right from the beginning.

So part of the original network pitch, during the pilot phase and once it got greenlit, was going in and getting the new V mother ship design. That entailed working with the concept artists here at Zoic and doing a couple of live modeling sessions with the executive producers, working shapes on a laptop, which was tricky to get just right because we wanted to pay homage to the original series but update it. The original V ship was very saucer-like, so we wanted to keep it somewhat similar and retain certain design elements -- like the equator of detail in the center of the old ship -- but we've updated the shape to make it look more contemporary and visually interesting.

And instead of looking like great panels it looks like some kind of alien alloy with a different plating texture on it. So we had all of that done in the early, early phases, and as soon as they got a production design team on, we started working with them. They came up with the original shape design for the shuttle -- a very, very loose sketch -- which we turned into a fully designed model of the shuttle craft.

Like the mother ship, the shuttle craft was modeled in modo, textured in Photoshop and rendered in LightWave.

BD: Now what about working on the interior?

AO: It was also decided during production, given the creative vision for the show, that they could not build the interior ship structures -- the sets -- practically, because they needed to be too large: the corridors were too long and the ceilings were very, very high. For the sake of practicality, the original idea we threw around was that they would create a small portion of the set and then we would extend it. But what ended up happening is there were five different sets, day and night, and the practicality of building pieces of five different sets with different greenscreens attached, in the stage space it would take to make an extendable set, actually made it more practical to do 100% virtual sets for the ship interiors. Which was a leap of faith on their part. So we ended up doing between 125 and 150 interior shots: everything you see inside the mother ship is a greenscreen virtual set, and that's continuing in the series as well.

BD: What have been some of the biggest challenges?

AO: On the design side, another thing we did that was challenging had to do with mechanics. The script mentions that something starts happening on the bottom of the mother ship: some panels start to flip and an image appears on the other side of the ship. Now, exactly how that happens and the exact mechanics of that kind of event were something we had to figure out. So there was quite a bit of animatics and conceptualizing to get this panel-flip effect to look realistic and part of the alien technology. And we found that you really had to see it from several different perspectives: from the person on the ground and also from up close, to get an idea of the real technology that was happening.

So for those two types of shots we came up with a macro design and a micro design for the effect. The macro design was to have the armor plating underneath the ship detach, move away from the surface and reconfigure into one smooth bottom surface, and then, in close-up, have all the smaller sub-plates in between flip over. So it wound up being a two-part effect.

BD: What tools did you use for this?

AO: All the mother ship and shuttle stuff was rendered in LightWave; a lot of it was modeled in modo and textured in Photoshop. We've been using LightWave successfully on a lot of our hard-surface shows, and we really utilized it with our CG supervisor, Chris Zapara, who was responsible for making the panels flip and also getting the look and feel of those ships in space correct. And also Steve Graves, who is our 3D modeler. We also brought in Pierre Drolet, the modeler from Battlestar; he did the shuttle.

The F-16 crash, which was handled by Zoic LA, included a lot of modeling, texturing and particle work along with virtual NY buildings comp'd in.

BD: What else have you been working on?

AO: There's this F-16 plane crash at the very beginning, which is one of those shots where you're basically looking at a blank plate and then creating an F-16 crashing into the ground. So there was tons of modeling and texturing, particles and particle fire, with flame and smoke elements from our library layered in there, and also quite a bit of work sweeping out the Vancouver mountains and buildings -- because Vancouver is filling in for New York -- and replacing them with New York buildings. It was quite a big job, with shots you don't even notice, in the pilot and then continuing into the series.

BD: How was the work divided between the LA and BC offices?

AO: We mostly split it up so that the ship shots and the F-16 crash were done here at Zoic LA, along with all of the design work; and they did all of the virtual set work up there because it was so essential to be close to production. Trevor Adams was the artist who did a lot of the mental ray virtual sets…

BD: What distinguishes the work on V from Battlestar?

AO: The main difference is that this is much less action-oriented. There's none of the World War II documentary-cam stuff that's in Battlestar. This is much more about the awe and wonder of an alien race and technology coming to earth. So what that means is we don't rely on the camera work to carry us through a sequence. The shots are way longer here, we get a much closer look at the models and you see a lot more detail. So it's very challenging when you see the kids in the shuttle going up into the mother ship for the first time: these are really long shots, and we linger on the CG aspects for a long time in very fluid, almost David Lean-style reveal shots. And it was really important that we sell the emotional sense of: "Oh, my God! This is the coolest technological thing you could ever imagine!" The actors are doing their part to help sell it, but we needed to do our part to make it seem all that awe-inspiring.

The interiors consist of several virtual environments utilizing mental ray and optimized for realtime playback in collaboration with Lightcraft.

BD: What can we expect going forward?

AO: As we went into the series, knowing we were going to be doing an even larger volume of virtual set work on an even more compressed schedule, we partnered with a company called Lightcraft to do realtime, on-set previsualization of our sets. With them we've developed a system called ZEUS (Zoic Environmental Unification System). We kind of gave it a tricky name.

What we've been doing is optimizing our mental ray sets for realtime playback. The Lightcraft system combines realtime camera tracking -- a lipstick witness cam reading a series of custom tracking markers up on the ceiling of the stage -- with a gyroscopic element that gives the rotation data. That realtime camera data is fed back to a box that drives the realtime-rendered version of our virtual sets and also composites the greenscreen coming through the camera feed in realtime. So we can see a really good approximation of all the mental ray lighting on set in realtime. It helps the DP light to the virtual set with much more fidelity, knowing exactly where the key and fill lights are in a scene, and it helps the actors with their staging, so they can get closer to elements of the virtual set, like the walls and doors, without actually walking through them. That's stuff that just wouldn't have been possible with more traditional virtual set technology.

Then we take the data into our proprietary system, and when we get the EDL file from editorial, we're able to sync up with the work we do at Zoic on the back end, generate a 3D layout scene and generate Nuke comps automatically through scripting. So the artists start with quite a lot of the leg work done, which is essential for getting the huge number of shots done in the short amount of time that we have.
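Orloff only sketches that back-end step, but the basic idea -- reading cut points from the editorial EDL and scripting out a per-shot comp stub -- can be illustrated with a short Python sketch. Everything specific here is hypothetical: the simplified CMX3600-style EDL parsing, the plate and render paths, and the write_nuke_stub helper. Since Nuke's .nk scripts are plain text, the stub is written directly rather than through Nuke's Python API; Zoic's actual ZEUS tooling is proprietary and certainly more involved.

```python
import re
from pathlib import Path

# Hypothetical match for a CMX3600-style video event line such as:
#   001  SHOT010  V  C  01:00:10:00 01:00:14:12 01:00:00:00 01:00:04:12
# We only keep the event number, reel name and source in/out timecodes.
EDL_EVENT = re.compile(
    r"^(\d{3})\s+(\S+)\s+V\s+C\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def tc_to_frames(tc: str, fps: int = 24) -> int:
    """Convert an HH:MM:SS:FF timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def parse_edl(edl_path: Path):
    """Yield (event, reel, first_frame, last_frame) for each video event."""
    for line in edl_path.read_text().splitlines():
        m = EDL_EVENT.match(line.strip())
        if m:
            event, reel, src_in, src_out = m.groups()
            yield event, reel, tc_to_frames(src_in), tc_to_frames(src_out)

def write_nuke_stub(shot_dir: Path, reel: str, first: int, last: int) -> Path:
    """Write a bare-bones .nk comp: a Read for the greenscreen plate and a Write.

    The path patterns are placeholders; a real pipeline would resolve them
    from whatever naming convention production uses.
    """
    nk = shot_dir / f"{reel}_comp_v001.nk"
    nk.write_text(
        f"Read {{\n"
        f" file plates/{reel}/{reel}.%04d.exr\n"
        f" first {first}\n last {last}\n"
        f" name Plate\n}}\n"
        f"Write {{\n"
        f" file renders/{reel}/{reel}_comp.%04d.exr\n"
        f" name Out\n}}\n"
    )
    return nk

if __name__ == "__main__":
    out_root = Path("comps")
    for event, reel, first, last in parse_edl(Path("editorial_turnover.edl")):
        shot_dir = out_root / reel
        shot_dir.mkdir(parents=True, exist_ok=True)
        print("wrote", write_nuke_stub(shot_dir, reel, first, last))
```

In the workflow Orloff describes, the same turnover step would also pull the per-take camera-tracking data into a 3D layout scene; that part is omitted from the sketch.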

Bill Desowitz is senior editor of AWN & VFXWorld.

