V for Awe-Inspiring VFX
AO: The main difference is that this is much less action-oriented. There's none of the World War II documentary-cam stuff that's in Battlestar. This is much more about the awe and wonder of an alien race and technology coming to Earth. So what that means is we don't rely on the camera work to carry us through a sequence. The shots are much longer here, and we get a much closer look at the model, and you see a lot more detail. So it's very challenging when you see the kids in the shuttle going up into the mother ship for the first time. These are really long shots, and we linger on the CG elements for a long time in very fluid, almost David Lean-style reveal shots. And it was really important that we sell the emotional sense of: "Oh my God! This is the coolest technological thing you could ever imagine!" The actors are doing their part to help sell it, but we needed to do our part to make it seem all that awe-inspiring.
AO: As we went into series and knew we were going to be doing an even larger volume of virtual set work on an even more compressed schedule, we partnered with a company called Lightcraft to do realtime, on-set previsualization of our sets. With them we've developed a system called ZEUS (Zoic Environmental Unification System). We kind of gave it a tricky name. What we've been doing is optimizing our mental ray sets for realtime playback. The Lightcraft system combines realtime camera tracking, which reads a series of custom tracking markers up on the ceiling of the stage with a lipstick witness cam, with a gyroscopic element that gives the rotation data. It feeds that realtime camera data back to a box that pipes it into the realtime-rendered version of our virtual sets, and then also composites the greenscreen coming through the camera feed in realtime. So we can see a really good approximation of all the mental ray lighting on set in realtime. That helps the DP light to the virtual set with much more fidelity, knowing exactly where the key and fill lights are in a scene, and it helps the actors in their staging, so they can get closer to the elements of the virtual set, like the walls and doors, without actually walking through them. Stuff that just wouldn't have been possible with more traditional virtual set technology. Then we take the data into our proprietary system, and when we get the EDL file from editorial, we're able to sync up with the work we do at Zoic on the back end, generating a 3D layout scene and Nuke comps automatically through scripting. So the artists start with quite a lot of the legwork done, which is essential for getting the huge number of shots finished in the short amount of time that we have.
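To make the last step concrete, here is a minimal sketch of the general idea of EDL-driven comp generation: parse editorial's cut list and script-generate a starter Nuke comp per shot. This is not Zoic's proprietary ZEUS code; the file paths and node naming are hypothetical, and a real pipeline would resolve sources through asset management and also build the 3D layout scene.

```python
import re

# A single CMX3600-style EDL event line looks roughly like:
#   001  TAPE01  V  C  01:00:10:00 01:00:14:12 00:59:58:00 01:00:02:12
# (event number, source reel, track, cut type, source in/out, record in/out)
EDL_EVENT = re.compile(
    r"^(?P<num>\d+)\s+(?P<reel>\S+)\s+V\s+C\s+"
    r"(?P<src_in>[\d:]+)\s+(?P<src_out>[\d:]+)\s+"
    r"(?P<rec_in>[\d:]+)\s+(?P<rec_out>[\d:]+)"
)

def parse_edl(text):
    """Return a list of event dicts parsed from the EDL text."""
    events = []
    for line in text.splitlines():
        m = EDL_EVENT.match(line.strip())
        if m:
            events.append(m.groupdict())
    return events

def nuke_comp_for(event):
    """Emit a skeletal .nk script for one shot: the rendered virtual set,
    then the keyed greenscreen plate merged over it. In a .nk file,
    consecutive nodes chain via Nuke's node stack, so a Merge2 with
    'inputs 2' picks up the two streams above it."""
    reel = event["reel"]
    return "\n".join([
        # Hypothetical paths; a real pipeline would derive them from
        # the reel name and timecodes via asset management.
        'Read { file cg/%s_set.####.exr name VirtualSet }' % reel,
        'Read { file plates/%s.####.exr name Plate }' % reel,
        'Keyer { name Key }',
        'Merge2 { inputs 2 name Over }',
    ])

sample = "001  TAPE01  V  C  01:00:10:00 01:00:14:12 00:59:58:00 01:00:02:12"
events = parse_edl(sample)
print(events[0]["reel"])          # TAPE01
print(nuke_comp_for(events[0]))
```

The point of the sketch is the workflow, not the node graph: once the cut is machine-readable, every shot's plate wiring can be generated before an artist ever opens the comp.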
Bill Desowitz is senior editor of AWN & VFXWorld.