AWN and VFXWorld editors Bill Desowitz, Sarah Baisley and Rick DeMott attended the VES Festival of Visual Effects, reporting back on secrets to the magic that were on display.
For its 2006 Festival of Visual Effects, the Visual Effects Society moved its yearly event to Hollywood's Egyptian Theater. This year's expanded festival included special screenings and exhibit booths set up in front of the venue, giving attendees plenty to fill their time between the various panels.
In Virtual vs. Real Sets: Combining Production Design With Visual Effects, production designers discussed the digital impact today and how it has broadened the collaboration between directors, cinematographers, production designers and vfx supervisors.
Alex McDowell, who is at the forefront of previs and the non-linear, digital workflow, marveled at how visual tools have become financially accessible since 2000 and now represent the full arc of 3D space. It's a much more visceral experience for the production designer, who has worked with David Fincher (Fight Club), Steven Spielberg (Minority Report, The Terminal) and Tim Burton (Corpse Bride and Charlie and the Chocolate Factory). "You get to inhabit space that hasn't been built yet and fix things far in advance of production," he said.
Jim Bissell described how the upcoming 300, based on Frank Miller's graphic novel about the 480 B.C. Battle of Thermopylae, goes even further than Sin City. Seventy percent of the film consists of painted backdrops shot in industrial space. The filmmakers were able to save money by manipulating images: creating abstract terrains and armies of 3,000 Greeks. They took the DI and crushed it, desaturating the look to resemble the graphic style, and painted shadows into the sets to instill an operatic quality.
Jack De Govia discussed how a low-budget crime drama such as the upcoming Anamorph (supervised by Richard Edlund) benefits from higher-level digital work, keeping him on target. Meanwhile, John Myhre, who worked on Memoirs of a Geisha, was able to add all kinds of visual elements because of the digital tools available.
The VES' first international panel, VFX Without Borders, moderated by vfx supervisor Van Ling, featured Tarun R. Agarwal, joint managing director/vfx director, Rajtaru Studios, Mumbai, India & Dubai; Kristijan Danilovski, founder/vfx supervisor, FX3X, Skopje, Macedonia; Franck Malmin, td and vfx supervisor, Def2Shoot, France; Carlos Iturriaga, vfx supervisor, Ollin Studio, Mexico City, Mexico; and Lifeng Wang, founder/president, Eastar, Xing-Xing Studios, Beijing, China.
They each discussed their specialties, local vfx communities and how they are trying to grow their business, emphasizing that expanding the global community is in the interest of all vfx artists.
Iturriaga said Ollin Studio has expanded from Mexico City to Los Angeles, opening a new office as a result of commercial and feature work with Warner Bros. Ollin specializes in vfx, previs, concept design and DI, using Maya, RenderMan, Flame, Inferno and Shake. Iturriaga showed a commercial reel, including a Corona beer spot described as "Lord of the Rings meets Mariachi." Interestingly, he suggested that Mexico is actually losing digital artists to the U.S.
In terms of Macedonia, Danilovski said FX3X handles vfx, roto, paint, 3D tracking, compositing, CG animation and MoCap. The work entails commercials, invisible vfx and pipeline sharing with other studios.
Def2Shoot follows the U.K./U.S. model, specializing in CG, mattes, animation (for music videos) and other work. The studio exchanges techniques and artists with partners, applies game industry techniques to animation and rigging, uses Autodesk and IBM products, and works with students from Gobelins.
In discussing the state of vfx at Rajtaru Studios and throughout India at large, Agarwal confirmed that language is still a barrier that prevents Western studios from hiring them, but that they offer a pool of software developers and are producing better quality 3D work. In particular, Rajtaru Studios offers modeling, lighting, texture maps and greenscreen.
At Eastar in Beijing, the vfx industry is small, but there's a lot of TV work. The studio recently did 16 shots for Fantastic Four, composed of onscreen graphics, and created The Long March documentary for The History Channel, using a classical painting style for landscapes.
Looking Ahead at the Future of Visual Effects Tools, moderated by Darin Grant, who heads production and technologies at DreamWorks, focused on GPU and realtime rendering, new tools for post control, opening up platforms, remote graphics software and the virtual director.
Eric Enderton, senior software engineer at NVIDIA, discussed the convergence of film and games via the GPU as a fast parallel computer. While we've already witnessed breakthroughs in realtime rendering, sim work, rigid body dynamics and collision detection, Enderton suggested that more and more vfx tasks would soon be accelerated by the GPU, eventually pushing into more interactive tools.
Paul Debevec, the esteemed research associate professor at USC and the exec producer of graphics research at the university's Institute for Creative Technologies, presented his ongoing work on integrating live action and 3D CG. Performance Relighting and Reflectance Transformation With Time-Multiplexed Illumination is a technique for capturing an actor's live performance in such a way that the lighting and reflectance can be modified in post. Using the new domed Light Stage 6, Debevec is trying to capture the whole body and is working on virtual viewpoint control with a vertical array of cameras. He still needs to capture arbitrary motion and deal with data flow.
Richard Kerris, who heads technical marketing for Apple's software applications, discussed open standards, in which programmers are becoming artists and vice versa, and extensible tools for post. His examples included Luma artists programming a pipeline and workflow made easier by Apple products, and director David Fincher benefiting from the same on Zodiac.
Terry Brown of HP sales explained how the company is enabling 3D artists to virtualize everything through remote graphics software with compression technology, part of a new, security-protected production paradigm in use at DreamWorks and other studios.
Habib Zargarpour, senior art director at Electronic Arts, dazzled the audience with glimpses into the future of UI, stating that realtime interaction is the Holy Grail. He showed how he could instantly light a shot from Need for Speed: Most Wanted and previewed a realtime CG shot of a fetus, created on a game engine platform, from Trapped Ashes, a horror anthology involving Matrix vfx supervisor John Gaeta and pioneered by Matrix color & lighting td Rudy Poat. Zargarpour said these innovations will lift technical obstacles and enable better artistic decisions.
The second day kicked off with a very visible look at the Invisible Effects: The Da Vinci Code & Casanova. This panel took an in-depth look at two recent films that have recreated eras and locales not accessible to their filmmakers, moderated by Ian Hunter, vfx supervisor at New Deal Studios.
Among the many recreated environments in The Da Vinci Code, it was particularly challenging to build an entire CG environment for the interior of the Saint-Sulpice church, working from a handful of photos taken by overall visual effects supervisor Angus Bickerton. Chris Burns, vfx supervisor at Double Negative, said dimensions were measured and relayed to artists in "Angus feet": since the vfx supervisor's foot measured 11.99 inches long, he walked off the space and recorded the dimensions. They wound up with only about four feet of error when they mapped it out.
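The arithmetic behind that approach can be sketched as follows. This is a hypothetical illustration, not Double Negative's actual tooling: it converts distances paced off in 11.99-inch "Angus feet" into standard feet, and shows how small the per-pace drift is.

```python
# Toy illustration (an assumption, not the studio's real tool): convert a
# span paced off in "Angus feet" -- the supervisor's 11.99-inch foot --
# into standard feet, and track the drift versus a true 12-inch foot.

ANGUS_FOOT_IN = 11.99  # measured length of the supervisor's foot, in inches

def angus_feet_to_feet(paces: float) -> float:
    """Convert a distance paced off in Angus feet to standard feet."""
    return paces * ANGUS_FOOT_IN / 12.0

# e.g. a 400-pace walk across the church interior (hypothetical figure):
paces = 400
span_ft = angus_feet_to_feet(paces)
drift_ft = paces - span_ft  # error if the paces were naively read as feet
print(f"{span_ft:.2f} ft, drift {drift_ft:.2f} ft")
```

Each pace contributes only a hundredth of an inch of drift, which is why a whole church interior could be mapped with just a few feet of accumulated error.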
There weren't nearly enough photos, so artists took more of a multiplane approach rather than full 3D in this growing photogrammetry process. They had 100 photos to work with, but they needed about 14,900 more. Plus, the angles did not match up with how the director planned to shoot it, according to Mark Breakspear, vfx supervisor for Rainmaker.
Bill Taylor, ASC, vfx supervisor/cinematographer, Illusion Arts, recounted what a challenge it was to create the entire environment for Casanova in CG. It was done in a mad scramble on a tiny budget (which could not afford live-action shoots). Actors were captured and then recreated in different costumes to create crowd scenes.
The panel agreed that working in greenscreen is preferable to bluescreen. Green gives the highest luminance and works better in sunlight, said Breakspear. "That, and blue is the noisiest channel with light registration," added Burns.
Next up, the vfx team behind the latest X-Men film discussed the challenges of working on a tight schedule (April 20, 2005 to April 8, 2006) and with various vfx houses in five countries on three continents. Panelists were John Bruno, vfx supervisor; John "DJ" Des Jardin, vfx supervisor; Bryan Hirota, vfx supervisor, CIS; and Edson Williams, vfx producer for Lola Visual Effects, with Hunter moderating.
A main focus was the rejuvenation work Lola did to strip 20 years off the actors. Williams said their digital makeup often involved 200 layers as they stripped out shadows and restructured the faces of Ian McKellen (Magneto) and Patrick Stewart (Prof. Charles Xavier). They collected photos of these well-known actors from their younger days. Then, primarily using 2D solutions, they filled out the loss of bone density and muscle that comes with the aging process. In fact, they took the actors back too young and had to add shadows back in to make them a bit older.
Lots of digital makeup was also used on Jean Grey/Phoenix (Famke Janssen), especially for the finale.
Bruno said all the actors were digitally scanned (a common practice on most films today) in case they had to be replaced at any point, especially with the flying bodies and shattering body parts in a digitized room.
They also lamented having to remove some of the debris used in the stormy final scene (assets were reused from Weta Digital, just coming off King Kong), and played the vfx team's favorite version of the sequence, which higher powers had asked them to simplify.
Bruno said the biggest lesson learned was never to try to do a picture in five countries in one year.
Awash with water effects and groundbreaking work worthy of its own entire day of explanation and examples was the panel Poseidon: A World Turned Upside Down. Vfx supervisor Boyd Shermis (panel moderator) turned to multiple sources on two continents (including ILM, Moving Picture Co., CIS and Giant Killer Robots) to produce more than 500 shots.
Instead of using a model, the 1,200-foot liner was created entirely in CGI, particularly so that the camera could always be in the sweet spot for the action.
ILM relied on and worked with Stanford University on the new generation of its PhysBAM simulation system to create the ocean, complete with turbulence and bubbles. Artists particularly wanted to capture how water scatters light through its volume. ILM's Mohen Leo, associate vfx supervisor, oversaw the water simulation and rendering on the movie. Kim Libreri, vfx supervisor at ILM, said the rendering really taxed the studio, using 70% of ILM's capacity, with rendering demands also farmed out to Stanford and whatever resources they could muster. It was the single greatest demand ever put on ILM, which had topped out at roughly 40% on previous projects.
Steve Moncour, CG supervisor for Moving Picture Co., said his team photographed the living daylights out of the set, as well as the people and costumes to be recreated for dynamic action scenes. MPC used a combination of motion capture and PAPI to create the animation of things breaking and people reacting, rolling and falling.
Bryan Hirota, vfx supervisor for CIS, used a laser scan of the set so they could expand the galley and hallways to give the perspective of a ship that long.
It was impressive to see the various attempts at rolling the ship digitally; the artists followed a real-life example caught in a TV documentary, in which a camera crew had photographed a freighter that rolled and sank in the Pacific, not far from Australia.
The evening culminated with A Look Back at Aliens 20 Years Later. On hand to discuss the 20th anniversary of the release of the science fiction action classic Aliens, were moderator Paul Taglianetti, vfx producer/supervisor; Bob Burns, Aliens archivist; Alec Gillis, creature fabricator, Stan Winston Studio; Pat McClung, vfx miniature supervisor; John Rosengrant, creature supervisor, Stan Winston Studio; Dennis Skotak, vfx co-supervisor and dp; and Robert Skotak, vfx supervisor.
Festival goers were also treated to an opening night party, complete with a big jazz band, as well as a series of showcase screenings running concurrently with the panels: experimental films, animated shorts, new and international shorts and the ACM/SIGGRAPH presentation of The Story of Computer Graphics, featuring CGI pioneers.
The final day of the 2006 VES Fest kicked off with a panel of effects legends, talking about the growth of the industry from stop-motion to CG. Moderating the panel was cinematographer and founding member of the VES, Bill Taylor. On hand to share their experiences were: Chiodo Bros. Prods. founder, Stephen Chiodo; Randy Cook, visual effects supervisor on Lord of the Rings and King Kong; Tim Johnson, director of Over the Hedge; T. Dan Hofstedt, animation supervisor on Monster House; and visual effects legend Dennis Muren, whose credits include Star Wars, Jurassic Park, The Abyss, Terminator 2 and War of the Worlds.
The panel agreed that smooth stop-motion animation was a huge leap forward in movie history. The technique allowed filmmakers to create more fanciful tales on a larger scale without relying on the limitations of what a man in a suit and makeup could convey. The next huge leap forward in film visual effects came with the advent of computer animation, which allowed for smoother and more realistic movement. However, some of the panelists still hold a love for the pops and stutters of handmade stop motion. Muren said that the imperfections created a sense of awe in the viewer, reminding us that there was an artist creating the movement.
Yet Cook expressed his love for the move to CG. The biggest benefit, he admits, was the removal of all the nasty stuff. For him, this nasty stuff included the amount of time it took simply to make a creation jump from one place to another. Between wires and rigs, the time was extensive for a movement that wasn't expressing any emotion, but was just trying to get the character from point A to point B. Additionally, Cook is excited by the freeing qualities of CG technologies, which will give smaller groups of filmmakers the ability to make their own personal tales.
Looking to the future, the panel agreed that the age of stop motion as a special effects method in live-action films is over on any large scale. However, with new technology that allows stop-motion animators to create movement more easily without the use of hidden wires, the art form has the potential to grow in new directions, giving artists an avenue to explore more fanciful tales.
Johnson sees the future of computer animation going in two different directions: there will be more demand for photoreal animation in live-action films, while animated films, whether the production method is CG, stop motion, 2D or even motion capture, will move toward more stylized looks.
When challenged by an audience member on whether Monster House should be considered an animated film, Hofstedt replied that the film presented a new, exciting area to explore. Johnson defended the motion capture techniques as just another style of creating films no different than live-action, stop motion or 2D. Both Hofstedt and Johnson stressed that one should judge a film on whether the method served the story.
In the end, no matter what style one uses, Johnson probably summed up the feelings of all involved in animation best when he said, "the drug of animation is seeing a character move."
As for the next panel, which highlighted Pixar's work on Cars, the animated characters sure moved (emotionally, that is). Animator Travis Hathaway stated that, like all of the previous Pixar films, the focus of the film was on character and story.
Hathaway focused on how director John Lasseter wanted the cars to remain cars. Some exaggerated movement was used to enhance performance, but Lasseter really wanted the world and its characters to look real.
This push toward realism made Cars Pixar's most complex and detailed film to date. The film was 450 times more complex than Toy Story, with an average rendering time of 17 hours per frame.
The Cars panel started off with rendering supervisor Jessica McMackin giving a detailed presentation of just how complex the production was. Some frames took as much as 600 hours to render, with 15% of the shots running out of memory. An average film-grade shot weighed in at 9GB. The 145,000-frame film took three million CPU hours to complete.
The waterfall was the most complex of all the shots. The actual waterfall was composed of four separate components: the falling water, water splashing off the rock walls, water hitting the river below and the mist that was subsequently created. The shot was so detailed that it could have brought down all of the computers at Pixar if the render team wasn't careful.
Two of the chief challenges on the film were the increased use of ray tracing and the increased number of lighting sources, which came not only from outside sources but from the characters' headlights. Ray tracing allowed the shadowing under the cars and in the crevices to appear more realistic, with shadows on the ground fading as they got further from the object. Ray tracing increases rendering time because it calculates more and more points of impact and reflection of light; the biggest increase, however, came from radiance, which bleeds the color of an object onto a nearby surface. With 3,050 rendering processors working on the film, Cars was a constant challenge for the rendering team to keep on track.
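To make that cost growth concrete, here is a minimal, hypothetical ray-count sketch (an assumption for illustration, not Pixar's renderer): if each ray casts one shadow ray per light and continues as one reflection ray at every bounce, then adding lights and bounces multiplies the number of intersection tests per pixel.

```python
# Back-of-the-envelope sketch (assumed model, not production code) of why
# ray tracing inflates render cost: at each bounce depth a ray spawns one
# shadow ray per light, and one reflection ray continues to the next depth.

def rays_per_pixel(lights: int, bounces: int, samples: int = 1) -> int:
    """Count rays traced for one pixel under this simplified model."""
    total = 0
    for depth in range(bounces + 1):
        total += samples           # the camera/reflection ray at this depth
        total += samples * lights  # one shadow ray toward each light
    return total

# One light and no bounces: 2 rays per pixel. Add headlights and a couple
# of reflection bounces and the count balloons:
print(rays_per_pixel(lights=1, bounces=0))  # 2
print(rays_per_pixel(lights=4, bounces=2))  # 15
```

Multiply the per-pixel count by millions of pixels and 145,000 frames, and the jump from scanline rendering to ray-traced shadows and color bleed explains the 17-hour frame averages cited on the panel.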
After racing with Cars, VES Festival guests raced, leapt and flew with the visual effects team from TV's Smallville. Moderator Kymber Lim, exec producer at Entity FX, led the artists, who included: Mat Beck, president of Entity FX; Ken Horton, exec producer; Eli Jarra, visual effects supervisor; and John Wash, on-set supervisor.
The biggest factor on a show with as many effects as Smallville (which included 750 effects shots in season five) is time. Episodes are shot over eight days in Vancouver, followed by two days of second unit work and one day for inserts. Afterward, the director comes to Los Angeles for four days of post before an episode is handed over to the visual effects team.
Entity FX is part of the production process from script to delivery, which on some occasions has come on the day before the show is set to air. Beck and his crew look over the scripts and advise the production team of the realities of producing what the writers have concocted. Because of the tight time constraints, previs is only used on really complex shots.
A great deal of Entitys work is done on the season premiere and finale episodes, with the fifth season premiere requiring more than 100 shots. At certain times during the season, Entity can be in various stages of production on up to six episodes.
Treating the attendees, the artists broke down some of the more daunting effects they have created over their four seasons working on the series. Digital work is combined with practical action filmed on greenscreen sets, and in many instances practical effects are enhanced by digital effects to save time and money. For one particular scene, actor Tom Welling was filmed on a greenscreen stage climbing up the metal fuselage of a rocket, puncturing holes in the metal as he climbed. However, Welling was unable to puncture the holes without injury and needed the aid of metal objects in his hands, which were painted green and digitally removed later.
One of the other concerns the visual effects team must tackle is how to make Clark's powers seem new on an episode-to-episode basis. For the most part, Clark will use super speed and go into "Clark time" (when Clark moves so fast that he seems to be moving at normal speed while the rest of the world is paused) at least once an episode. To keep Clark's powers feeling fresh, the production team and digital artists must come up with new and innovative ways to show off those abilities, finding new angles, perspectives and objects of detail to enhance the scenes.
Particularly difficult is Clark time, which often relies on the actors on set standing completely still. Often green rigs are used to hold the actors paused in unnatural positions, selling the effect even more. As exec producer Horton says, he has the greatest job in the world: he comes up with these great ideas in the shower, and it's up to Beck and his team to make them happen.
Moving from the small-screen representation of the Man of Steel to the new big-screen version in Superman Returns, visual effects supervisor Mark Stetson led a large panel that included: Scott E. Anderson, founder of Digital Sandbox; Christopher Bond, president/visual effects supervisor, Frantic Films; Joyce Cox, visual effects producer; Richard R. Hoover, senior visual effects supervisor, Sony Pictures Imageworks; John Monos, CG supervisor; Jonathan Rothbart, founder/visual effects supervisor, The Orphanage; Derek Spears, visual effects supervisor, Rhythm & Hues; and Gavin Toomey, lead compositor, Framestore CFC, the only London company to work on the movie and the one responsible for the climactic sequence on Lex Luthor's island of Kryptonian crystal.
For such a huge visual effects film with so many houses in the mix, a great deal of the panel was dedicated to how Anderson and Digital Sandbox were able to manage the assets of the entire production. Chief in their efforts was quality control. As Anderson described it, Digital Sandbox served as air traffic control for all the visual effects.
In charge of dailies, Digital Sandbox's job was to make sure that all visual effects from all the different houses were up to the standards set by director Bryan Singer and exec producer Thomas Tull. Before any shot was screened for Singer, it was color graded. With Digital Sandbox's facilities located on the Warner Bros. lot and access to all the project's data, quick changes could be made during screenings if needed.
One of the biggest challenges on the film was the use of the new Panavision Genesis camera. Because Superman Returns was the first major film to use the cameras, the production crew had limited data on how they performed, and testing came virtually on top of the start of filming. The filmmakers were impressed with the camera's range of 10 to 10-1/2 f-stops. However, the crystal-clear images increased touch-up work from around 100 shots on a typical film to 350.
Each of the supervisors from the various houses highlighted their work, including Frantic's crystal growth and exploding sun (done in four weeks), Sony's flying and digital doubles, the Orphanage's bank heist work and Rhythm & Hues' sea rescue and recreation of Marlon Brando in the Fortress of Solitude. A highlight was seeing Stetson present the entire previs for the shuttle sequence. Cox said previs was used extensively on the production, not only to map out scenes, but for early screenings and to help Singer keep the studio interested in his vision of the film.
The festival concluded with a special sneak preview of Sony's Monster House. Visual effects supervisor Jay Redd introduced the film, which he is very proud of, and after the positive reaction at the screening, he should be. His closing point addressed a beef he had with coverage of computer generated animation: Redd stressed that computers are only the tools the artists use and that there are still human beings making these films. If any overall purpose could be taken from the VES Festival, it's just that: to highlight that a lot of truly talented human beings create the magic we see on the screen.
Bill Desowitz is the editor of VFXWorld.
Sarah Baisley is the editor-in-chief of Animation World Network.
Rick DeMott is the managing editor of Animation World Network. In his free time, he works as an animation writer for television. His work on the new series Growing Up Creepie can be seen on Discovery Kids, starting Sept. 9, 2006. Previously, he held various production and management positions in the entertainment industry. He is a contributor to the book Animation Art, as well as the humor, absurdist and surrealist short story website Unloosen.