Next to the French-Italian border and nestled between the Mediterranean and the mountains, Monaco has played host to the Imagina Festival for more than 20 years, making it one of the longest-running 3D events in the world. Originally initiated by the French governmental agency INA (National Audiovisual Institute), Imagina has been a place for artists, technicians, researchers, developers and manufacturers to gather, exchange their knowledge and share their vision of emerging technologies and the art they enable them to produce. A few years ago, INA withdrew from the event due to budget considerations, and the Festival skipped a year before being picked up by various entities, including the Monaco principality.
Imagina 2005, which took place Feb. 2-5, was loosely divided into five general categories: animation, post-production, games, architecture and mobile technologies. However, three trends emerged from the conference that are probably of most interest to the VFXWorld reader: virtual character creation, massive environment and crowd reconstruction, and the ongoing rise of image-based technologies.
Sony Pictures Imageworks was at Imagina for the first time this year, and not only were Spider-Man 2's vfx discussed in three different presentations, but they also embodied the three trends cited above: in the train sequence, the streets of New York were recreated in highly detailed 3D models and textures, and ultra-realistic characters were conceived using a fully image-based approach.
The Virtual Actors session provided a good opportunity for the audience to compare the work done on Matrix, Lord of the Rings and Spider-Man 2. Once again, the making of Neo, Agent Smith and Gollum was discussed in detail, as was that of newcomer Doc Ock. Paul Debevec, executive producer of graphics research at the University of Southern California's Institute for Creative Technologies, explained the principles behind his light stage technology for capturing illumination data of the human face, a technique used to recreate the full CG Doc Ock. It was the first time that this technology was used in feature production, a gamble that paid off handsomely for Sony.
A separate presentation by Framestore CFC's CG supervisor David Lomax centered on the creation, animation and rendering of Harry Potter and the Prisoner of Azkaban's Hippogriff. After fur, feathers were the next barrier to realism, and Framestore's 3D artists were determined to cross it. In his presentation, Lomax explained how every single feather, along with every single barb and barbule of each feather, was modeled and rendered. It was a huge undertaking and the final result is brilliant, but could it have been done more simply? The debate was raging in the corridors after the conference, particularly since this presentation was followed by Buf Compagnie, which, among other things, showcased an ultra-realistic 3D eagle whose feathers seemed just as believable, despite much lighter modeling and overall work.
King Arthur, Troy and Alexander exemplified massive environment and crowd reconstruction. For King Arthur, nearly every outdoor shot had to be touched up by Cinesite (Europe), as the movie was supposed to take place amongst snow-covered mountains in winter but was instead shot in emerald-green Ireland in the middle of summer. Even though on-set effects helped to create a snowy atmosphere, many backgrounds had to be re-rendered using matte paintings and 3D effects.
The most impressive work done by Cinesite in terms of landscape reconstruction was the battle sequence on a frozen lake, originally shot on a green prairie. As Cinesite vfx supervisor Matt Johnson reiterated, Variety's commentary on the film was along the lines of: "At last, a summer blockbuster with no special effects." Cinesite did such a good job that other vfx supervisors attending the conference admitted they wouldn't have guessed the film had been digitally altered so heavily.
In terms of battle sequences, Johnson explained that work on the film began too early to be able to use a commercial application like Massive to create crowd sequences. Cinesite had to develop its own software to drive the armies into battle using artificial intelligence. The Moving Picture Co., which recreated large-scale battles for Troy, also created its own software to do so. MPC additionally created entire cities for Troy, and, interestingly enough, while massive sets had been built for the film, many ended up recreated in 3D for various reasons.
Of all the films presented on massive environments, the work done by Buf on Oliver Stone's Alexander was perhaps the most impressive. Since Stone had chosen to shoot largely freestyle, with handheld cameras and much improvisation, Buf had to deal with shots that had never been planned as vfx shots.
Buf also had to put large 3D armies on shaky backgrounds filled with dust clouds. Since the hectic pace of filming in Morocco did not lend itself to setting up a motion capture facility, Buf chose to go for a simple "video motion capture" system consisting of four small video cameras filming stuntmen fighting, running and doing all kinds of battle moves. Horses and war chariots were also captured. Everything was then rotoscoped to produce 3D animation files, which were used in lieu of standard MoCap recordings. This basic system turned out to be remarkably efficient and was used to produce all the animation for Alexander's armies.
In line with its desire for simple practicality, the Buf team also decided not to use an artificial intelligence-driven crowd system. Instead, it used a series of simple scripts to drive battalions of soldiers on the 3D battlefield, a solution that proved very flexible as it allowed for the hands-on direction of every army move. Combined with the energetic and documentary-like cinematography, it makes for extremely realistic and immersive battle sequences.
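The scripted, non-AI approach described above can be illustrated with a minimal sketch. This is purely a hypothetical reconstruction of the idea, not Buf's actual tool: battalions are rigid formations on a 2D battlefield, and a hand-authored list of move commands, rather than any behavioral simulation, drives every army move.

```python
# Hypothetical sketch of script-driven (non-AI) crowd control in the spirit
# of the approach described: no behavioral simulation, just explicit
# per-battalion commands. All names and the command format are invented.

from dataclasses import dataclass

@dataclass
class Battalion:
    """A block of soldiers laid out on a grid and moved as one unit."""
    origin: tuple          # (x, y) world position of the formation's corner
    rows: int = 10
    cols: int = 20
    spacing: float = 1.5   # metres between soldiers

    def soldier_positions(self):
        ox, oy = self.origin
        return [(ox + c * self.spacing, oy + r * self.spacing)
                for r in range(self.rows) for c in range(self.cols)]

    def advance(self, dx, dy):
        self.origin = (self.origin[0] + dx, self.origin[1] + dy)

def run_script(battalions, script):
    """Apply a list of (frame, battalion_index, dx, dy) move commands in order."""
    for frame, idx, dx, dy in sorted(script):
        battalions[idx].advance(dx, dy)

left = Battalion(origin=(0.0, 0.0))
right = Battalion(origin=(100.0, 0.0))
# The "script": hand-authored moves, giving direct control over every army move.
script = [(1, 0, 5.0, 0.0), (1, 1, -5.0, 0.0), (2, 0, 5.0, 0.0)]
run_script([left, right], script)
print(left.origin, right.origin)  # the two formations have closed the gap
```

The appeal of such a scheme is exactly what the Buf team cited: every move is explicit and hand-directable, with no emergent AI behavior to fight against.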
But Imageworks also had something to say about massive environments. For the first Spider-Man, Sony had opted for a simplistic city reconstruction based on low-res geometry and highly detailed textures. This time, the team decided to go for ultra-detailed high-resolution modeling of each building teamed with impressive texturing. For example, skyscrapers' offices were textured with different environments and rigged with several illumination systems that could be controlled to produce various daylight and nighttime effects. Though not technically groundbreaking, the amount of work and care put into this reconstruction was quite impressive.
Bluescreen Feature Films
The rendition of highly realistic environments and sets is no longer a problem today, and it opens new avenues to filmmaking, so it was surprising not to see an Imagina session dedicated to the subject of "bluescreen filmmaking." It would have been interesting to compare the varying experiences of working on The Polar Express and Sky Captain and the World of Tomorrow (two films represented at Imagina) with the latest works of director Robert Rodriguez (Sin City and The Adventures of Shark Boy & Lava Girl in 3-D) and the French film Immortel, all shot nearly exclusively on bluescreen, in order to understand the stakes this kind of filmmaking represents, the new opportunities it affords and its limitations today.
However, vfx supervisor Darin Hollings uncovered the tip of the iceberg in his presentation of the making of Sky Captain during the HD and vfx conference (hosted by Guerville, the co-author of this article). A completely digital workflow, along with extensive preparation and previs, allowed Hollings to provide his crew members every morning with a call sheet detailing each shot, complete with camera rig and lighting placement. This enabled the entire crew to work extremely autonomously and effectively, achieving an incredible average rate of one shot filmed every 12 minutes. A grid system installed on the set, matched by a grid in the 3D space of the virtual environment, enabled the team to perfectly synchronize real and virtual elements on set with on-the-fly compositing.
Once again this year, the technological and research presentations included many image-based approaches. In his Vision for Animators conference, Steve Seitz, associate professor at the University of Washington, discussed the current limitations of standard capture tools, which are often invasive, labor intensive and can only reproduce reality. He showed how, by extracting movement, texture or geometric data from video footage, it was possible to use live elements to recreate ultra-realistic animated sequences simply and effectively. For example, by analyzing the optical flow of water particles in a very short clip of a waterfall, Seitz's model could make the waterfall loop endlessly and, simply by drawing a line representing a new course for the waterfall, it could recreate a new waterfall just as realistic as the original. From there, one can "paint waterfalls" any which way, without even being limited to water particles; fire, smoke or other animated elements can also be used.
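The endless-loop trick rests on a simple idea from the video-texture family of techniques: find two frames of the clip that look nearly identical and jump between them. The sketch below is a generic illustration of that idea under toy data, not Seitz's actual method.

```python
# Minimal sketch of the "video texture" looping idea: find the pair of
# frames that look most alike, then play the clip in a cycle between them.
# Generic illustration only; the frame data here is synthetic.

import numpy as np

def best_loop(frames, min_gap=2):
    """Return (i, j), i < j, minimising mean squared difference between frames."""
    n = len(frames)
    best, best_pair = float("inf"), None
    for i in range(n):
        for j in range(i + min_gap, n):
            d = float(np.mean((frames[i] - frames[j]) ** 2))
            if d < best:
                best, best_pair = d, (i, j)
    return best_pair

def looped_sequence(frames, length):
    """Frame indices to play: loop the cycle [i, j) found by best_loop."""
    i, j = best_loop(frames)
    cycle = list(range(i, j))
    return [cycle[k % len(cycle)] for k in range(length)]

# Toy "clip": frames 1 and 4 are identical, so the jump there is seamless.
rng = np.random.default_rng(0)
base = rng.random((8, 8))
frames = [base + 0.5, base, base + 0.2, base + 0.3, base, base + 0.6]
print(looped_sequence(frames, 8))
```

A real system would add cross-fading at the jump and motion analysis per region, but the core of "a short clip that loops endlessly" is just this search for a seamless transition.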
Seitz also presented the mapping of flowing video elements on a 3D particle stream. He used a video-clip of water pouring from a garden hose mapped on a Maya generated particle flow to animate the water at will, without the need of any kind of texturing, lighting or rendering. Seitz covered two other projects, one based on animated rotoscoping and the other on video-based facial reconstruction and animation.
At the other end of the research spectrum, Ron Fedkiw, assistant professor at Stanford University, presented ultra-realistic heavy-duty simulations of natural phenomena. With his approach, 3D objects are no longer hollow envelopes but instead behave like solid objects that can be squashed, deformed, melted or broken, just as real world objects would be. In addition to impressive water and smoke simulations that can take into account complex parameters of vorticity, thus recreating rich and subtle turbulences, Fedkiw also presented dynamic simulations of rigid and soft bodies combined together; for example, solid objects floating or sinking in a tank of digital water (and even scooping water from it), a water spout that deformed cloth, as well as cloth burning, ice cubes melting, etc. Every single movement and deformation was completely dynamic and animated by physics, not by manual control. A very impressive demonstration.
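The floats-or-sinks behavior mentioned above comes down to one piece of physics, Archimedes' principle: an object floats when the buoyant force of the displaced fluid exceeds its weight. The toy sketch below illustrates only that principle; it is nothing like Fedkiw's coupled fluid-solid solvers, which compute these forces dynamically per time step.

```python
# Toy illustration of floating vs. sinking via Archimedes' principle alone
# (no fluid solver): net upward force = buoyancy - weight. Purely
# illustrative; not the method presented at the conference.

WATER_DENSITY = 1000.0  # kg/m^3
G = 9.81                # m/s^2

def net_vertical_force(mass, volume, submerged_fraction=1.0):
    """Buoyancy minus weight for a rigid body in water (newtons, up positive)."""
    buoyancy = WATER_DENSITY * volume * submerged_fraction * G
    weight = mass * G
    return buoyancy - weight

def floats(mass, volume):
    """True when the fully submerged body experiences a net upward force."""
    return net_vertical_force(mass, volume) >= 0.0

print(floats(mass=500.0, volume=1.0))   # less dense than water: floats
print(floats(mass=2000.0, volume=1.0))  # denser than water: sinks
```

In a full dynamic simulation this force balance is just one term among pressure, drag and collision forces, all integrated over time, which is what made the demonstrations so striking.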
Imagina Awards, Software, Style and the Future
As is usually the case at Imagina, student films were largely featured in the Imagina competition and competed on equal footing with studio productions by generally displaying a high level of artistic inspiration. The complete list of winners can be found at www.imagina.mc.
Does that mean that 3D film and vfx production will continue to expand beyond dedicated studios? Due to timing and hiring considerations, Shark Tale was produced with virtually no proprietary software, with Doug Cooper (vfx supervisor at DreamWorks) sharing his belief that commercial 3D packages were now sufficiently advanced to enable the full production of CG feature films.
Besides, is realism always the way to go? Animation-wise, it was interesting to note that two of the major presentations, Shark Tale and The Incredibles, marked a deliberate return to a more cartoonish style: strong silhouettes, broad expressions, distorted in-betweens. Seeing the hand-drawn animatics of The Incredibles was compelling proof of the power of 2D animation: 3D animation still has much to learn from it.
In the end, as Imagina Festivals go, 2005 was probably not one of the best years. That leaves room for improvement in 2006, perhaps with more focus on stronger themes, a wider variety of European projects and increased opportunities for informal discussion between participants. Nevertheless, Imagina remains a necessary event for the industry: a unique meeting and exchange of European, American and Asian influences and approaches, with top-notch speakers and participants.
Mireille Frenette and Benoit Guerville have been working together for 10 years. They have a production company for which Mireille produces and Benoit directs. They also contribute to various publications in France, the U.K. and the U.S., including Computer Arts, 3DWorld and French trade magazine Sonovision. Mireille was born in Montreal, Quebec, and Benoit in Paris, France. They now live in the south of France.