More than 50 years after its first run as a fad, 3-D finally appears ready to fulfill its potential. Once reserved for B-movie gimmicks and theme-park rides, stereoscopic cinema is poised to become a big deal, with top filmmakers and studios lining up to explore a territory finally refined enough by digital technology to deliver a unique theatrical experience.
The basic technology of 3-D remains as simple as ever: Two images are projected on the movie screen -- one for each eye -- that, when viewed with filtering glasses, create the illusion of depth.
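The depth illusion comes from horizontal parallax between those two images. As a minimal sketch (not any studio's actual pipeline code), the on-screen parallax of a point under the common off-axis stereo camera model can be computed from the interaxial separation and the convergence distance:

```python
def screen_parallax(interaxial, convergence, depth):
    """Horizontal screen parallax (same units as interaxial) for a
    point at `depth`, using the off-axis (shifted-frustum) stereo
    camera model. Zero at the convergence plane; positive values sit
    behind the screen, negative values float in front of it."""
    return interaxial * (1.0 - convergence / depth)

# A point at the convergence plane lands exactly on the screen:
print(screen_parallax(0.065, 5.0, 5.0))   # 0.0
# Points beyond it drift apart, approaching the interaxial at infinity:
print(screen_parallax(0.065, 5.0, 10.0))  # 0.0325
```

The function names and the 65mm interaxial are illustrative; the point is only that the left/right offset, and thus perceived depth, falls out of simple geometry.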
Digital production and exhibition are making it possible to take 3-D to the next level, drawing the attention of filmmakers such as Robert Zemeckis, James Cameron, Steven Spielberg and Peter Jackson, who see in the medium a way to do the same for the blockbuster.
But while CG vfx and animation are uniquely suited to generating stereo images, the relatively simple concept of stereoscopic filmmaking serves up some complex creative and technical challenges in the production pipeline.
"Everything is conceptually doubled," says Buzz Hays, senior vfx producer at Sony Pictures Imageworks, currently readying Zemeckis' performance capture CG adaptation of Beowulf for a Nov. 16 release in three separate 3-D formats -- Real D, Dolby 3D Digital Cinema and IMAX 3-D -- for the largest launch to date in 1,000 theaters.
"Because the movies are similar, you can take advantage of certain properties to make things cheaper, but nevertheless it's two movies."
Incorporating 3-D into production pipelines brings with it a host of issues: organizing and tracking extra data; educating artists on both the technical problems and the creative issues involved in working in 3-D; setting up an accurate review and approval procedure; developing new software tools; gauging how the extra work affects staffing levels and schedules; and preparing a film for exhibition in different formats.
Preparation, though, is the first key to effectively incorporating 3-D into a production pipeline: "You have to give a lot of thought at the beginning of a film as far as how you catalog and store your elements so they can be retained when you come down to the actual 3-D process," says Ed Jones, producer on The Weinstein Co.'s animated Escape from Planet Earth.
While live-action 3-D has historically been created using two cameras, digital techniques allow the second image to be extrapolated and controlled by the computer.
For Avatar, Cameron's upcoming performance capture CG feature that is being shot in 3-D, Rob Legato designed a system that creates extensive metadata files for each shot, capturing not only the performance but also the location of the camera as well. "The 3-D pipeline is already created because as soon as you shoot you have two files and the two files are combined in various different ways to be viewed," Legato adds.
The system, which Legato describes as director- and DP-centric, allows Cameron to freely shoot multiple takes on a stage with actors and cut together sequences immediately in 3-D for approval before turning shots over to the vfx house (Weta Digital) with metadata linking the edit with the correct takes.
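The specifics of Legato's metadata files aren't public, but the idea is easy to picture. As a hypothetical sketch (all field names and filenames here are invented for illustration), each shot record ties a take and its camera state to the stereo pair on disk, so an edit list can be resolved back to the right files automatically:

```python
# Hypothetical per-shot metadata record (names are illustrative only).
shot = {
    "shot_id": "SHOT_0420",
    "take": 7,
    "camera": {"position": [1.2, 1.6, -3.4], "interaxial": 0.06},
    "eyes": {"left": "SHOT_0420_t07_L.exr", "right": "SHOT_0420_t07_R.exr"},
}

def files_for_edit(edit, shots):
    """Resolve an edit (a list of shot ids) to the stereo pairs it
    needs, using the metadata captured at the time of shooting."""
    index = {s["shot_id"]: s for s in shots}
    return [index[shot_id]["eyes"] for shot_id in edit]

print(files_for_edit(["SHOT_0420"], [shot]))
```

This is the sense in which "the 3-D pipeline is already created" at capture time: the linkage between edit decisions and source takes is data, not manual bookkeeping.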
Bret St. Clair, vfx supervisor at Meteor Studios in Montreal, which is handling effects on next year's Journey 3-D (from Walden Media to be released by New Line), agrees that up-front organization is essential for ensuring accurate tracking info for each shot.
Meteor began by tracking one eye of the live-action photography and assuming the metadata from the shot was accurate for the other eye. That turned out not to be the case, and they found alignment and distortion issues that had to be solved before any vfx work could be done. Better software helped as the show progressed, and Meteor was able to use new stereo tracking abilities in 3D Equalizer.
St. Clair says such a learning curve was typical of the process. "The first couple of months expect to spend about 200% of the time you'd normally spend. As you move through the show and people get more adept at dealing with the problems, it goes down to around 10 or 20%."
Frantic Films producer Ken Zorniak says that rendering turned out to be quite different on Journey, and a system was created that rendered both eyes as separate passes within a single job. With no need to load elements more than once, "our renders, instead of taking twice as long, were taking kind of a percentage less," he explains.
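Why that saves time is worth spelling out. In this minimal sketch (the function names are stand-ins, not Frantic's actual system), the expensive scene load happens once and each eye is then rendered as a pass against the already-loaded data, so the stereo job costs one load plus two renders rather than two of each:

```python
cost = {"loads": 0, "renders": 0}

def load_scene():
    """Stand-in for the expensive step: geometry, textures, caches."""
    cost["loads"] += 1
    return "scene-data"

def render_pass(scene, eye):
    """Stand-in for rendering one eye's view of the loaded scene."""
    cost["renders"] += 1
    return f"{eye}-frame"

def render_stereo(eyes=("left", "right")):
    scene = load_scene()  # happens once, not once per eye
    return {eye: render_pass(scene, eye) for eye in eyes}

frames = render_stereo()
print(cost)  # {'loads': 1, 'renders': 2}
```

The savings Zorniak describes come from exactly this: the second eye only pays the incremental render cost, not the full setup cost.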
Zorniak says compositing was another challenge. "We wanted to try to keep it as relatively auto-generated as possible," he says. "If you're going to affect one eye, you want to make sure you're doing the exact same thing to the other eye and you don't want to lose it in a separate file." That led to putting the frames for both eyes into the same compositing file.
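A toy version of that idea (a hedged sketch, not Frantic's compositing software) makes the benefit concrete: when both eyes live in one object, a single operation is applied identically to each, so the eyes cannot silently drift out of sync in separate files:

```python
class StereoComp:
    """Hypothetical stereo comp container: both eyes in one place,
    every operation applied identically to each."""

    def __init__(self, left, right):
        self.eyes = {"L": left, "R": right}

    def apply(self, op):
        # one call mutates both eyes the same way
        self.eyes = {k: op(v) for k, v in self.eyes.items()}
        return self

# Brighten both eyes by the same amount with a single call:
comp = StereoComp([10, 20, 30], [11, 21, 31])
comp.apply(lambda img: [px + 5 for px in img])
print(comp.eyes["L"])  # [15, 25, 35]
print(comp.eyes["R"])  # [16, 26, 36]
```

Forgetting the second eye simply isn't possible in this structure, which is the "relatively auto-generated" property Zorniak is after.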
Setting up a pipeline takes some innovation, as not all the tools you need are readily available yet, says Jim Mainard, Co-Head of Production Development at DreamWorks Animation, which is prepping Monsters vs. Aliens for 2009 as the first in a long line of 3-D features. "One of the largest voids of technology is in the editorial area. That's an area where we've had to do a lot of work to construct tools." Mainard says they came up with a way to cut one eye on the Avid, have the cuts apply to both eyes and output to both eyes without the machine being aware of the second eye.
Jason Clark, Co-Head of Production Development at DreamWorks Animation, says it was essential to create previs tools for the show's artists so that it would be designed with 3-D in mind. "When you design and author the movie for 3-D in the beginning, you get the benefit of not having to re-edit or re-figure the movie as much as you would potentially going from a monoscopic movie to a 3-D movie."
Kevin Tureski, director of product development for Autodesk, says the company is working closely with its clients to add features and hooks to Maya that will assist with 3-D. "They've asked us to extend Maya in ways that will allow them to build in-house tools or plug-ins to Maya that will make it easier for them to preview and block out in 3-D," he adds. Examples include tools for measuring the depth of objects relative to the screen, and adjusting the inter-ocular distance.
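A tool like that reduces to simple geometry. As an illustrative sketch (assuming the off-axis stereo camera model; this is not Autodesk's actual implementation), the interaxial distance that keeps the farthest object inside a parallax comfort budget can be solved for directly:

```python
def interaxial_for_limit(parallax_limit, convergence, far_depth):
    """Largest interaxial that keeps a point at `far_depth` within
    `parallax_limit` of screen parallax, under the off-axis model
    where parallax = interaxial * (1 - convergence / depth).
    Assumes far_depth > convergence."""
    return parallax_limit / (1.0 - convergence / far_depth)

# With the screen plane at 5 units and the deepest object at 50,
# a 0.03-unit parallax budget allows roughly a 0.033-unit interaxial:
print(interaxial_for_limit(0.03, 5.0, 50.0))
```

Blocking out a scene in 3-D with this kind of feedback is what lets artists dial depth per shot instead of discovering problems in dailies.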
Another major issue is review and approval. Artists need to see and work on their shots in 3-D, and be able to see how the final result plays on a large screen. For most houses, that means adding more large-screen solutions for viewing because it's difficult to judge how 3-D plays on a big screen from looking at a computer monitor.
"Size matters," says Mainard. "People get used to looking at a monitor and knowing how it's going to look on film. We have to get them used to looking at depth on a monitor and knowing how it's going to look on a big screen."
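The reason size matters is arithmetic: parallax authored as a fraction of image width scales linearly with the physical screen. In this hedged sketch (the numbers are illustrative, not a studio standard), a separation that is invisible on a monitor exceeds human eye separation on a theater screen, forcing the eyes to diverge:

```python
def physical_parallax_mm(parallax_frac, screen_width_mm):
    """Parallax stored as a fraction of image width, expressed in
    millimeters on a given screen."""
    return parallax_frac * screen_width_mm

EYE_SEPARATION_MM = 65.0  # approximate adult interpupillary distance

# The same 0.4% parallax on a 500mm monitor vs. a 20m theater screen:
monitor = physical_parallax_mm(0.004, 500)      # ~2 mm: imperceptible
theater = physical_parallax_mm(0.004, 20_000)   # ~80 mm: beyond eye separation
print(monitor, theater, theater > EYE_SEPARATION_MM)
```

That divergence is physically impossible for the viewer to fuse, which is why shots must be reviewed at, or calibrated for, final screen size.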
And there are creative considerations to working on a 3-D film that affect the production pipeline. Most important is the factor of audience comfort. The quick cuts that work in a 2-D film would be jarring in 3-D, as they force the audience to refocus on objects in an unnatural way. That requires softer dissolves, longer shots that give the audience time to figure out the geography of a scene, and care to avoid action that breaks the frame and hinders the 3-D effect.
"It's partly technical, but partly an artistic process to figure out what's going to look good to the people in the back row but not kill the people in the front row," says Rob Engle (Open Season, Monster House, The Polar Express), senior computer graphics supervisor, digital production, Sony Pictures Imageworks.
Other 2-D techniques such as using depth of field to focus the audience's attention on a person or thing in the frame are handled instead by the 3-D effect. That means backgrounds that normally would be out of focus in 2-D need to be fully detailed.
"We tend to sharpen up the backgrounds so you see more detail," says Engle. "So if anything we're probably being more careful on 3-D than on 2-D because we don't need to direct your eye anymore because if it's physically closer to you, you're going to look at it."
Also, effects such as smoke, rain or snow can no longer be simulated with 2-D effects because they will look flat in 3-D.
St. Clair also says there was a need to correct convergence issues and ghosting, which occurs with high-contrast images where not all the light is filtered out by the viewer's glasses. "Sometimes, it's the lesser of five evils so it's not a glaring artifact."
Roto and paint also requires a high level of precision. "You can't be sloppy at all with painting a pixel into a stereo film," St. Clair continues. "Put that pixel slightly to the left or right and it's not going to work. That's a huge impact, so a lot of things you could rip through in other places is a much more tedious task and it takes a lot of viewing and a lot of diagnosing."
For the most part, all this extra work doesn't require a huge increase in crews or schedules. Crews generally have to grow anywhere from 10% to 20%. Schedules can remain roughly the same as on 2-D films, though there is less flexibility to solve a last-minute crunch by farming out work to other houses.
Planning also comes into play when considering the final format. For the most part, Real D and Dolby, which are both digital, require little specific attention aside from the occasional depth tweak and color timing. "It's not like we're rendering out two completely different versions of the movie," says Hays. "It's essentially a post-production process."
IMAX is a slightly different story, as the large-format screen has a significant impact on the experience that requires up-front consideration of how to handle the placement and use of 3-D effects.
Hugh Murray, VP of technology for IMAX, says the format is ideal for 3-D because of the geometry of its theaters. "We actually use the whole theater volume," he says. "All the audience has a wide angle view of the screen."
IMAX also has its own pipeline for converting 2-D to 3-D, as it did last year with Superman Returns and this year with Harry Potter and the Order of the Phoenix. Murray says the company uses a suite of proprietary software to take parts of the image and attach them to 3-D models that give them shape and volume. That creates some difficulties, as the 3-D will require some degree of "filling in" objects around the edges to make them work. Most of that work is done manually, though Harry Potter has sparked ideas for new tools that will speed up the process, he says.
Still, from a production standpoint, it's rare that the IMAX version would need special creative or technical adjustment that deviates from the other formats, says Hays.
"The process is identical, it's just going to be perceived differently because of the size of the screen," Legato suggests.
The process is expected to be in flux over the next few years as new technologies are developed and refined and the use and acceptance of 3-D in films grows. "We're constantly finding efficiencies in the process that allow us to make for a better and better 3-D experience without any sort of compromise," says Hays. "If anything's been compromised, we're doing it the wrong way."
Thomas J. McLean is a freelance journalist whose articles have appeared in Variety, Below the Line, Animation Magazine and Publishers Weekly. He writes a comicbook blog for Variety.com called Bags and Boards, and is the author of Mutant Cinema: The X-Men Trilogy from Comics to Screen, forthcoming from Sequart.com Books.