Alain Bielik explores how Zoic Studios made the transition from TV to features with Serenity, including the challenge of creating a new pipeline dubbed Zlogic.
Usually, for a television series to reach the big screen, it requires several successful seasons and many years of intense fan activity. For Firefly, all it needed was a mere nine episodes, a huge success on DVD and the relentless energy of its creator, Joss Whedon. Only three years after its brutal cancellation by FOX Television, Firefly hits the local multiplex Sept. 30 as Serenity, a Universal feature film release directed by Whedon. Fans of the original series won't be disappointed as spaceship Serenity and her eclectic crew are all back for a deadlier-than-ever mission: transporting two passengers who happen to be the most wanted fugitives in the galaxy.
As soon as the movie was greenlit, Whedon turned to longtime collaborator Loni Peristere to supervise the ambitious visual effects effort. A founding partner at Zoic Studios, Peristere had already produced thousands of effects shots for Whedon's TV shows, including Buffy the Vampire Slayer, Angel and, most especially, Firefly, for which he had won an Emmy Award in 2003. This nine-year-long collaboration was about to reach its peak with the feature film version of Firefly.
From TV Work to Feature Film Project
Serenity called for about 400 shots, with Zoic Studios producing 212 of them. The facility led the design for the entire film under Whedon's direct supervision. However, as production progressed, it became clear that Serenity would be better served by splitting up the film among several vendors. At this point, John Swallow, evp of production technology for Universal, and overall visual effects producer Juliette Yeager brought in Rhythm + Hues (R+H), Illusion Arts and Perpetual Motion Picture (PMP) to help on the film. R+H provided 30 shots under Bud Myrick's supervision, while Illusion Arts focused on several sequences - including a miniature crash realized by Grant McCune Design - with Bill Taylor and Syd Dutton overseeing the project. Finally, PMP and Richard Malzahn produced many composites in collaboration with Zoic Studios.
"We were very fortunate to have these very talented companies on our picture," Peristere comments. "Thanks to Joss and John's respect in the community, they were able to do a great deal with very little money. The movie is bigger and better because of their efforts."
R+H's 30 shots included the Mining Camp, the Black Room, the Basement Generator and the Funeral sequences. Illusion Arts and Syd Dutton created about 20 shots, including Mr. Universe's world, the Ion Cloud, Inara's world and Beaumonde. CG models were built in Maya and XSI, while compositing was carried out in After Effects. Illusion Arts also created the 3D introduction to Mr. Universe's world and the 3D environments seen during the crash of the Serenity. This spectacular sequence was realized in miniature by Grant McCune Design and photographed by Bill Taylor of Illusion Arts.
For Zoic Studios, it was obvious from the beginning that Serenity was going to be their largest project ever. Although the company had completed sequences on Van Helsing, The Day After Tomorrow and Spider-Man 2, those were finite sequences awarded in post-production. On Serenity, Zoic was the lead vendor and had to plan accordingly. Part of this plan included acknowledging the company's limitations and planning the film production in its best interests.
"My partner Chris Jones and I wanted to let Joss and the studio know that the movie came first and that we would not try to accomplish more than we could deliver," Peristere remarks. "Ironically, when we bid the show, we bid 212 shots, and that is exactly where we finished, but for less money. This was due to an efficient workflow at the start of the project between Joss, production (Juliette), the studio (John) and Zoic. Joss made a point to be precise with shot design and execution, which gave Juliette the ability to retool the budget and produce more shots. Plus, we had all these years of experience with Joss. The crew knew what he liked. They knew how to build, light and shoot Joss' way. This made for better budgeting for the show, as we were counting on a clean creative."
Upgrading the Whole Company
Producing visual effects for feature films is quite different from creating effects shots for television. While preparing for Serenity, Zoic Studios had to revamp its whole organization and workflow. As Firefly was set up for NTSC, in-house visual effects supervisor Randy Goux retooled the entire pipeline for 2K with help from Chris Jones, Saker Klippsten and 2D supervisor Patti Gannon. This included color space, viewing, rendering, shading, storage and compositing, a process that took upwards of six months.
The management structure also changed. Key artists, who had executed the series literally by hand, were elevated to sequence supervisory positions. After single-handedly producing entire shots on their own, they suddenly found themselves managing new departments: animation, lighting, effects, etc.
Zoic Studios quickly learned that the main advantage of working on a feature film as opposed to television work could also be its main drawback. "On Serenity, we had far more time to create the shots than we ever had on Firefly. This was both good and bad," Peristere notes. "The good thing is, with more time, you can focus on all aspects of a shot. The bad thing is, with more time, you can focus on all aspects of a shot! When you have time, everything can be better, but there is a danger in it, as it allows you to experiment, and for every formula, there are many hypotheses that fail. On television, you are forced to stick to the plan. As newcomers to time, we learned to look for the middle road. Time is a luxury that should be cherished and tuned to the single mission."
Setting Up New Pipelines
At the heart of Serenity is the spaceship of the same name. The model that had been developed for Firefly was rebuilt from scratch for Serenity, as it had no UV maps and the textures were simple NTSC projections. All the vehicles and CG environments too were built in Maya by model supervisor and ESC alumnus Brian Friesinger, using quads and shot-centric UVs, and textured by Lance Powell by hand. Once the animation was complete, the scenes were ported over to LightWave to be lit and rendered. Compositing was then carried out in Combustion.
One sequence required the development of a different pipeline. In a land-based chase involving two high-speed hovercraft, one of the vehicles bleeds a filthy black smoke that was generated using a proprietary particle system. The same fluid system was later used to create the dust wake when Serenity drops out of the sky to pick up the good guys' hovercraft. The chase was shot over a highway with actors maneuvering a hovercraft set piece mounted on the side of a truck. The environment was later replaced with a Maya terrain system written by Brian Goldberg and Marcus Stokes. The bad guys' hovercraft was mostly realized in CG. For this sequence, Peristere decided to switch to a Maya/mental ray pipeline, specifically to take advantage of the fluid and particle simulation tools of the Alias animation package.
This new pipeline marked a major departure from the LightWave/Flame/After Effects platform that Zoic had been using for television work. "LightWave has been a cornerstone here at Zoic, but it just needs some more refinement for high-resolution work," Peristere adds. "Although we had used it to its best on Battlestar Galactica and Firefly, the guys had to do far too much patchwork with it and it stressed them to their limit. They made it work, but it was brute force and they are the best users in the business. We also faced a resource issue. Between Zoic, Eden FX and Digital Domain, the key LightWave talent was booked up. There simply wasn't anyone we felt comfortable with for these shots. Since Randy Goux had a team of super Maya artists from ESC who were available to do this work, we pulled the trigger."
Integrating mental ray
The creation of the new pipeline, quickly named Zlogic, became the greatest technical challenge for Zoic on Serenity. "At the start of the film, we had no tools to support mental ray," Peristere observes. "Randy and his team had to create this support from scratch. We are still refining it today." In the end, it became apparent that the Maya/mental ray platform was more flexible and robust than LightWave, and as a result, Zoic is moving in this direction for its general pipeline for 2K and above.
R&D supervisor Steve Avoujageli, one of the ESC alumni, oversaw the implementation of the new pipeline: "mental ray allowed us to tackle some of the biggest challenges on Serenity by utilizing its exporting abilities and backward integrating them into Maya. For assets that would cross multiple shots or sequences, we would export dependency scene files for materials, geometry and other scene dependencies, and then crosslink them into our lighting scene. This ability helped us assemble complicated scenes while automatically maintaining consistency and updates from the dependencies. Currently, we are expanding this methodology to allow us greater batch script compositing power in shot assembly by writing our own processing and compositing nodes at the lowest level."
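The crosslinking idea Avoujageli describes can be sketched in a few lines. This is purely an illustrative Python sketch, not Zoic's actual tooling: all class and asset names here are hypothetical, and the real pipeline worked through exported mental ray scene files rather than in-memory objects. The point it demonstrates is that shots hold references to shared dependencies, so re-exporting an asset propagates to every scene that links it.

```python
# Hypothetical sketch of dependency-scene crosslinking (not production code).

class DependencyScene:
    """Stands in for one exported dependency file (materials, geometry, etc.)."""
    def __init__(self, name, version):
        self.name = name
        self.version = version

class LightingScene:
    """A per-shot scene that crosslinks shared dependencies by reference."""
    def __init__(self, shot):
        self.shot = shot
        self.links = {}

    def crosslink(self, dep):
        # Store a reference, not a copy: when the dependency is
        # re-exported, every linked scene picks up the update.
        self.links[dep.name] = dep

    def resolve(self):
        # Report which version of each dependency this shot would load.
        return {name: dep.version for name, dep in self.links.items()}

# Shared assets exported once... (names are invented for the example)
serenity_geo = DependencyScene("serenity_geo", version=3)
hull_mats = DependencyScene("hull_materials", version=7)

# ...and crosslinked into multiple shot scenes.
shot_010 = LightingScene("battle_010")
shot_020 = LightingScene("battle_020")
for shot in (shot_010, shot_020):
    shot.crosslink(serenity_geo)
    shot.crosslink(hull_mats)

# Re-exporting a dependency updates every scene that links it.
hull_mats.version = 8
print(shot_010.resolve())  # both shots now resolve hull_materials to v8
```

The design choice being illustrated is consistency by reference: no shot owns a private copy of a shared asset, which is what "automatically maintaining consistency and updates from the dependencies" implies.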
"Rob Nitsch and I also developed a series of tools for creating smoke/dust/clouds in the chase sequence as well as in many other shots. Some of these tools included predetermined velocity mapping, eddy rotation and particle caching tools. Further on in the production, we began to develop expanding fluids and raymarch capabilities between Maya and mental ray, which allowed us to further push the look of the effects in Serenity."
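Two of the tool categories named above, predetermined velocity mapping and particle caching, can be illustrated with a toy simulation. This is a hypothetical sketch, not the production tools: the velocity field and swirl term below are invented stand-ins, and a real system would cache to disk per frame rather than to a dictionary.

```python
# Hypothetical sketch: a predetermined velocity field driving particles,
# with per-frame caching so later passes re-read results without re-simulating.

def velocity_field(x, y, z):
    # A fixed ("predetermined") velocity map: constant drift in +x plus a
    # small swirl in the y/z plane standing in for an eddy-rotation term.
    return (1.0, 0.1 * z, -0.1 * y)

def advance(p, dt=1.0):
    # Simple forward-Euler step through the field.
    vx, vy, vz = velocity_field(*p)
    return (p[0] + vx * dt, p[1] + vy * dt, p[2] + vz * dt)

def simulate(particles, frames):
    # Cache every frame's particle positions as the sim runs.
    cache = {0: list(particles)}
    for f in range(1, frames + 1):
        particles = [advance(p) for p in particles]
        cache[f] = list(particles)
    return cache

cache = simulate([(0.0, 1.0, 0.0), (0.0, 0.0, 1.0)], frames=3)
# Frame 3 can now be read straight from the cache; lighting and render
# passes never need to re-run the simulation.
print(cache[3])
```

Caching is what makes a sim practical across departments: once positions are baked per frame, renders are deterministic and repeatable even if the sim code changes.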
For shading that was done in mental ray, Avoujageli wrote a library of native shaders that pushed the shading graphs further than the artists could before. This included a fully redefined bump and displacement library, a complete math node set for color/vector/scalar math and an extended library of illumination models with frame-buffer support for passes.
A Labor of Love
Serenity features a spectacular space battle sequence that logically became the most render-intensive of the film. Since the shots were completely computer-generated, the number of elements and passes turned out to be quite overwhelming. In order to avoid excessive render time, the artists created a balance between model and texture resolution based on the level of detail required for each camera angle. This allowed the renders to flow through the RUSH-based render farm without compromising image quality.
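The model/texture balance described above is a level-of-detail (LOD) scheme. As a hedged illustration, the selection logic might look like the sketch below; the thresholds, tier names and texture sizes are invented for the example, not Zoic's actual values.

```python
# Hypothetical LOD table: (max fraction of frame covered, geometry tier,
# texture resolution in pixels). Values are illustrative only.
LODS = [
    (0.05, "low",  512),
    (0.25, "med",  2048),
    (1.00, "high", 8192),
]

def pick_lod(screen_coverage):
    """Return (geo_tier, tex_size) for the fraction of frame an asset fills."""
    for max_cov, geo, tex in LODS:
        if screen_coverage <= max_cov:
            return geo, tex
    return "high", 8192  # fallback for anything larger than frame

print(pick_lod(0.02))  # distant ship in a wide battle shot
print(pick_lod(0.60))  # hull-filling close-up
```

The payoff is render throughput: a ship that fills 2% of frame renders with a fraction of the geometry and texture memory of a close-up, which is how all-CG battle shots can keep moving through a farm without visible quality loss.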
In the end, the cult status of the original series and the reputation of its creator contributed to make the production of Serenity's visual effects a real labor of love. This personal investment from all involved allowed the movie to come out with far higher production value than Whedon could ever have hoped for, given his budget. "It is true that Serenity was challenged by budget," Peristere concludes. "It forced us to be challenged by technical and artistic feats. We needed to achieve a high level of quality with small efficient teams. Sometimes, one man would do it all: for example, Rob Nitsch created all the fluid effects in the film, while Kyle Toucher animated 60% of the all-CG shots. In the end, I see Serenity as a complement to the work that we did on Firefly."
Alain Bielik is the founder and special effects editor of renowned effects magazine S.F.X., published in France since 1991. He also contributes to various French publications and occasionally to Cinéfex. He recently organized a major special effects exhibition at the Musée International de la Miniature in Lyon, France.