The Holy Grail of Previs: Gaming Technology

With the search for faster and more intuitive tools underway, previs practitioners are turning to game engines. Weta Digital used a game engine for flying virtual planes in King Kong. © 2005 Universal Studios.

Previs has come a long way as a collaborative tool for directors and their production teams. However, with so much money at stake, the search continues for faster, more precise and more intuitive previs tools. Many believe that the holy grail of previs lies in the use of videogame production technology, especially the game engine: the core of a game that allows the movement and manipulation of modeled people and creatures within 3D sets, together with lighting and camera moves, a process essentially the same as what previs supervisors do before a major movie starts production.

The benefit of being able to use a game engine is obvious: instead of the complex scripting usually necessary to move characters around a virtual set to act out a proposed scene, previs teams could use game controls such as joysticks to move characters around, simply and in realtime. According to Carey Villegas, a vfx supervisor at Sony Pictures Imageworks (Bewitched), "Using game engines and graphics hardware to create previs would allow for increased speed and performance, better interaction and higher visual quality and realism. Of course, realtime previs is the ultimate goal. At Imageworks, we are always looking into new and innovative processes and techniques, whether it is to create more photoreal or life-like imagery or just make things faster and more efficient. The possible use of game engines to increase performance and take previs to a whole new visual level is no exception."
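
To make the idea concrete, here is a minimal sketch of joystick-driven previs camera control, assuming the Python pygame library and a connected gamepad; it illustrates the general technique, not any studio's actual tool.

```python
import pygame

pygame.init()
pygame.display.set_mode((320, 240))    # tiny window so the event loop has a target
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)      # first connected gamepad
pad.init()

cam_pos = [0.0, 1.7, 10.0]             # previs camera position in scene units
speed = 2.0                            # travel speed, units per second
clock = pygame.time.Clock()

running = True
while running:
    dt = clock.tick(60) / 1000.0       # aim for 60 updates per second, like a game
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    # Left analog stick (axes 0 and 1) slides the camera in X and Z, in realtime.
    cam_pos[0] += pad.get_axis(0) * speed * dt
    cam_pos[2] += pad.get_axis(1) * speed * dt
    # A real previs tool would redraw its 3D viewport here; printing keeps the
    # sketch observable without one.
    print("camera at x=%.2f y=%.2f z=%.2f" % tuple(cam_pos))
```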

According to Sony Imageworks' Carey Villegas, using game engines and graphics hardware to create previs allows for increased speed and performance, better interaction and higher visual quality and realism.

When I interviewed previs experts about gaming engines a year ago, the general consensus was: definitely not ready for prime time. Game technology wasn't considered fast enough or precise enough. Whereas the representations of sets, actors and movements that previs teams come up with don't have to be perfect in every detail (the models of the actors, for instance, can be quite rough, corresponding to the animatics created before an animated film), certain details such as camera position have to be dead-on, together with the characteristics of the camera, the lenses used, even zoom angles and depth of field. Game producers in the past never had to worry about this level of detail; many game characters consisted of just a few hundred polygons, and consequently this type of accuracy has been totally missing from game engines. Daniel Gregoire, previs supervisor for War of the Worlds and Star Wars: Episodes II and III, now suggests that game engines aren't flexible enough for wholesale on-set changes, and that Maya is not fast enough to run in realtime.
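
For a sense of the camera math that has to be dead-on, the short sketch below converts a real lens' focal length into the horizontal field of view a virtual camera would need; the Super 35 film-back width used here is an assumed example value.

```python
import math

def horizontal_fov_degrees(focal_length_mm, film_back_width_mm=24.89):
    """Horizontal field of view for a given lens on a given film back width."""
    return math.degrees(2.0 * math.atan(film_back_width_mm / (2.0 * focal_length_mm)))

# Common prime lenses, in millimeters.
for lens_mm in (18, 35, 50, 85):
    print("%d mm lens -> %.1f degree horizontal FOV" % (lens_mm, horizontal_fov_degrees(lens_mm)))
```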

Another problem is that of compatibility. "The biggest challenge with game engines is that they aren't currently compatible with the 3D software that most effects houses use, and their integration with these products requires extensive software and engineering development," adds Villegas.

Kevin Baillie, a vfx supervisor at The Orphanage who has worked on Harry Potter and the Goblet of Fire, Cursed, Hellboy, Sin City and Spy Kids 3-D, agrees that data incompatibilities can be deadly. "We use a flexible pipeline between Maya and our other finishing tools, 3ds Max (color and lighting) and Houdini (hard-core particle fx work), to assure that every last bit of work done in layout transfers directly to the shots. We're extremely careful to use real-world cameras, build our scenes to an accurate and consistent scale and assure all digital assets have identical counterparts in all necessary applications. It takes a lot of diligence to ensure that level of reusability, but in the end it pays off in spades!"
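
As a hedged illustration of that kind of application-neutral hand-off, the sketch below writes a camera description to a plain JSON file that each package could rebuild identically; the file name and fields are hypothetical, not The Orphanage's actual format.

```python
import json

# Application-neutral camera description for one shot (hypothetical format).
camera = {
    "name": "shot_010_cam",
    "focal_length_mm": 35.0,
    "film_back_mm": [24.89, 18.66],     # horizontal/vertical aperture (Super 35)
    "scene_scale": "meters",            # one consistent real-world scale everywhere
    "translate": [0.0, 1.7, 10.0],
    "rotate_degrees": [0.0, 180.0, 0.0],
}

with open("shot_010_cam.json", "w") as f:
    json.dump(camera, f, indent=2)

# Import scripts in Maya, 3ds Max and Houdini would read this file and rebuild
# an identical camera, so layout work transfers directly to the shots.
```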

Kevin Baillie of The Orphanage finds that data incompatibilities can be deadly. His company is careful to use real-world cameras, build scenes to accurate scale and assure digital assets have exact counterparts in all applications.

Up and Coming Developments

The difficult proposition of using game technology for previs may be about to change, however, for several reasons. One is that games have gotten much more detailed, with the graphics engines in new consoles such as the PlayStation 3 or the Xbox 360 able to render individual, recognizable faces for characters, along with highly detailed lighting and backgrounds. Modern versions of prominent gaming engines such as Unreal 3, Doom 3, Half-Life 2, Far Cry, Gamebryo and Titan 2.0 have gone through huge improvements to keep up. These game engines now not only have to support the characters and sets of the actual game, but are also increasingly used to create the in-game cinematics (also called cut scenes): the short stories used as introductions or between-scenes transitions. Cinematics used to be very short (because of storage space) and crude, but have evolved into highly detailed episodes as long as five to 10 minutes; in effect, they have become short movies within the game itself.

A second reason to take the advent of gaming technology for previs seriously is that many experts are pointing in this direction. "Game engines will become a critical part of previs," according to Scott Ross, the CEO of Digital Domain, who sees future filmmakers sitting at gaming consoles, making choices for characters and scenes and lighting and movements. For instance, Digital Domain used a videogame (a flight simulator) and gaming interface for the movie Stealth, which involves a lot of aerial combat scenes similar to the movie Top Gun. Digital Domain was able to create footage of proposed flight scenes in Stealth for director Rob Cohen this way in previs, before the scenes were committed to and rendered in full resolution.

Digital Domain used a videogame and gaming interface to create previs of proposed flight scenes for director Rob Cohen's Stealth. © Sony Pictures. Courtesy of Digital Domain.

Loni Peristere, co-founder of Zoic Studios and vfx supervisor of both films (Serenity, Zathura, Pathfinder) and TV shows (CSI, Angel, Firefly, Buffy the Vampire Slayer, Battlestar Galactica), suggests, "I can't wait to incorporate a game engine into our pipeline, as a tool for greater efficiency and designing shots. It would be great for getting the dynamics of a scene: telling the game you have an exploding building, for instance, and then being able to plan camera moves around that." Zoic employs previs extensively for entire scenes, including camera motion control, using both Maya and LightWave. Peristere notes, "The engines will have to incorporate more cinematic features, such as the limitations of a dolly track or crane. However, using controls such as those for a PlayStation can really help with camera moves in free space." Weta Digital was able to use a game engine for flying virtual planes around the Empire State Building in King Kong, and was able to record and then recreate the camera moves very effectively. Peristere believes that use of game engines such as the Unreal engine (which at present seems to be the engine of choice for previs) will be part of a growing trend of moving previs more and more into the hands of the director. "While prices have to come down, and data commonality has to be increased, the use of game engines is only a matter of time," he offers.
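
One way to picture the record-and-recreate workflow described here is sketched below: sample the interactively flown camera once per frame and bake the result to a file that an animation package can turn into keyframes. The function names and file format are illustrative assumptions, not Weta's or Zoic's pipeline.

```python
import json

def record_take(sample_camera, frame_count, fps=24):
    """sample_camera(frame) returns (position, rotation) for that frame."""
    take = {"fps": fps, "frames": []}
    for frame in range(frame_count):
        position, rotation = sample_camera(frame)
        take["frames"].append({"frame": frame,
                               "translate": list(position),
                               "rotate": list(rotation)})
    return take

def save_take(take, path):
    with open(path, "w") as f:
        json.dump(take, f, indent=2)

# Demo with a dummy sampler that dollies the camera forward one unit per frame;
# an import script downstream would set one keyframe per recorded entry.
demo = record_take(lambda f: ([0.0, 1.7, 10.0 - f], [0.0, 0.0, 0.0]), frame_count=5)
save_take(demo, "flyby_take_01.json")
```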

A third reason is that games and movies are becoming closer to each other in many ways. Movies are increasingly loaded with game-like effects. Games are increasingly based on movies. Videogame revenues were actually higher last year than those of in-theater movies. Why not create the two at the same time, with some of the same tools? Up till now, games and films have been produced distinctly apart (with a very few exceptions, such as The Matrix trilogy). Much of the problem has been cultural as well as technological. Efforts at creating common file-sharing formats between games and film vfx toolsets look like they will soon be successful; the technology can eventually be resolved. But the suits of Hollywood have never really gotten along well with the much looser culture of game producers. Efforts to bring game production in-house at major studios have often been spectacular failures. As a result, videogames based on movies are generally almost afterthoughts for the Hollywood moguls: the games are started well after the film design is complete, and are then given insanely short amounts of time to be completed, often as little as six to eight months (in order to appear on shelves in time with the film's release), instead of the 18 months of production time that a videogame typically needs.

By Episode III, George Lucas tried out scenes in previs on a daily basis. He did 23 revisions of the first minute of Sith. Photo credit: Sarah Baisley (left). © & ™ Lucasfilm Ltd. All rights reserved. Digital work by ILM.

This is a great pity, because so many of the new game releases are branded: increasingly, games are tied in with movies or TV shows. If a feature and its attendant game could really be partners in the production process, it could result in a vastly better game and a better bottom line for such a joint project.

Enter George Lucas, who has major credentials on both the gaming and filmmaking sides of the fence. His Star Wars series of games have been best sellers and have consistently ranked in the top 10 games within the years of their release. His films have been among the most popular of all time, and he has pioneered many vfx techniques, including those of previs. Lucas realized long ago that ordinary storyboard techniques were insufficient to get his ideas across to his pre-production team or to help keep the hundreds of creatures, characters and environments organized and moving down a timely pipeline. For the first Star Wars film in 1977, he cut together World War II footage of fighter planes dogfighting as a moving storyboard for the attack on the Death Star. That approach evolved into using miniatures of the snow speeders, as well as hand-drawn animations, for The Empire Strikes Back. By 1994, for the beginning of the second trilogy, Lucas hired vfx artist David Dozoretz and a team that used 3D animation toolsets to create rough film shots, similar to animatics, that could be used both to guide the production teams on location and the post-production teams adding virtual creatures and scene elements. For Episode III, the previs was handled by a dozen artists under Gregoire's supervision, using 64-bit AMD Opteron-powered computers running Maya and Adobe After Effects to create an immersive environment. Lucas would come in to try out scenes on a daily basis, pre-shooting scenes to his heart's content. He did 23 revisions of the first minute, at a small fraction of what it would have cost in full post-production. Lucas has stated that the use of previs trimmed at least $10 million off the film's budget.

Given Lucas' vision and vast resources at the new Letterman Digital Arts Center in San Francisco, it's not surprising that ILM is developing previs with the LucasArts game engine and testing it on one of their upcoming releases. "What we're saying is, 'Let's make this like photography; do it in realtime,'" suggests Lucasfilm CTO Cliff Plumer. "This is something we've been developing in conjunction with LucasArts to hand the previs to the director. It's almost like a game. The director can plan how to shoot a live-action scene or block a CG scene. Contained in the application are libraries of lenses and so forth. But we can also record the camera moves, create basic animations and block in camera angles. And instead of handing rendered animatics to the CG pipeline, we have actual files: camera files, scene layout files, actual assets that can feed into the pipeline. It gives the crew input into what the director is thinking."
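
The sketch below suggests what such "actual files" might look like in practice: a scene layout manifest that references assets and a recorded camera take rather than a flattened rendered movie. The paths and fields are hypothetical, not ILM's format.

```python
import json

# One shot's layout manifest (hypothetical paths and fields).
layout = {
    "shot": "seq_020_shot_010",
    "camera_take": "takes/shot_010_cam.json",    # recorded in the previs session
    "lens": {"focal_length_mm": 18.0},           # picked from the lens library
    "assets": [
        {"name": "hero_building", "file": "assets/hero_building_v003.obj",
         "translate": [12.0, 0.0, -40.0]},
        {"name": "stunt_double", "file": "assets/stunt_double_rig_v001.fbx",
         "translate": [0.0, 0.0, 0.0]},
    ],
}

with open("seq_020_shot_010_layout.json", "w") as f:
    json.dump(layout, f, indent=2)
```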

Red vs. Blue is one of the most popular series of machinima shows. © Rooster Teeth.

Game-Based Previs for the Rest of Us

Even though much of the hot development in previs technology may be going on behind closed doors at high-end vfx shops such as ILM, the Pixel Liberation Front, The Orphanage and Zoic Studios, there are also plenty of previs tools and methods available for low-budget filmmakers and even beginning students. A recent development being used for previs is machinima, the use of game engines to create story sequences. Although the original purpose of machinima (mah-shee-nee-mah), the marriage of machine and cinema, was to produce finished filmic stories rather than previsualizations of larger projects, the process is basically the same: create sets and characters, move characters around in a certain way, try a different approach, repeat until satisfied. Game engines for machinima productions can be very low priced, and a few are even shareware. The director of a machinima project can either use a game interface to move the characters around, or use the simple scripting of the game engine to plan character moves. Lip sync editors are available to enable the characters to speak in response to typed text. Although the output of machinima may not be up to Lucas' standards, a startling number of creative and even breathtaking short movies are being shown in machinima festivals around the world. The Academy of Machinima is a great resource for information on such festivals. One of the most popular series of machinima shows has been Red vs. Blue, from Rooster Teeth Prods.
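
For a taste of how such engine scripting can block out a scene, here is a toy, engine-agnostic version: a list of timed commands that a playback loop steps through. The command names are invented for illustration; every real engine has its own syntax.

```python
# A scripted blocking pass: each entry is (time in seconds, command, arguments).
script = [
    (0.0, "spawn",   {"actor": "red_soldier", "at": [0, 0, 0]}),
    (1.0, "walk_to", {"actor": "red_soldier", "to": [5, 0, 2]}),
    (4.0, "say",     {"actor": "red_soldier", "line": "Why are we here?"}),
    (6.5, "cut_to",  {"camera": "close_up_cam"}),
]

def play(script):
    for time_sec, command, args in script:
        # A real engine would wait until its clock reaches time_sec before
        # executing the command; here we just log the blocking for review.
        print("%5.1fs  %-8s %s" % (time_sec, command, args))

play(script)
```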

The Art of Machinima, by Paul Marino, is another great resource; it not only instructs the fledgling previs student or filmmaker in how to use a game engine to set up story sequences, but also includes a CD-ROM with a game engine for trying out projects. Machinima, by Dave Morris, Matt Kelland and Dave Lloyd, is another great introduction to this new cinematic art form that can be used for previs. It is also possible to simply play a game and capture the motion of the characters. The Sims 2, from Electronic Arts, has a simple interface: just hold down the V key to capture video footage from game sequences. The virtual environment (though not technically a game) of Second Life has a similar in-world tool for capturing video clips that can then be edited. One popular tool for video capture of game sequences is FRAPS, an inexpensive way to generate .AVIs to edit for final productions.
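
As a rough illustration of the same capture-and-edit idea without FRAPS, the sketch below assumes a capture tool has already written numbered PNG frames and that ffmpeg is installed, and simply assembles them into an .AVI for editing; this is a generic approach, not how FRAPS itself works.

```python
import subprocess

# Turn numbered frames (capture_0001.png, capture_0002.png, ...) into a clip.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",           # playback rate of the captured frames
    "-i", "capture_%04d.png",     # hypothetical frame naming pattern
    "capture_for_edit.avi",       # output clip to bring into the editor
], check=True)
```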

The Sims 2 has a simple interface to capture video footage from game sequences. © 2005 Electronic Arts Inc. All rights reserved.

The Sims 2 has a simple interface to capture video footage from game sequences. © 2005 Electronic Arts Inc. All rights reserved.

And for the real novice, there is a hugely entertaining game from Lionhead called The Movies that lets aspiring filmmakers learn many of the fundamentals of previs, including set design, costuming, character movement, camera placement and story development.

Summary

The use of game engines for previs has been very limited up to now. Previs experts have warned, with good reason, of the unacceptable resolution levels of existing game technology, as well as data incompatibility and high prices (for a single project) for some popular gaming engines. Nevertheless, the boundaries between high-end 3D feature vfx and game tools continue to get fuzzier, and the economic drive to merge movies and gaming into common pipelines may be unstoppable. "We are interested in the use of hardware for animation in previs and then production," offers Imageworks' Villegas. "Eventually, we hope the physics engines will allow a wide range of objects to move and interact with the same sense of mass and motion as they do in real life, allowing non-character-driven animation to be more procedural."
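
The snippet below hints at what that means in the simplest possible terms: instead of hand-keyframing, an object's motion comes from integrating gravity frame by frame. It is plain explicit Euler integration, not any particular engine's physics.

```python
GRAVITY = -9.81          # meters per second squared
FPS = 24
DT = 1.0 / FPS           # one film frame of simulated time

def simulate_drop(height_m, bounciness=0.6, frames=48):
    """Return the object's height, frame by frame, as it falls and bounces."""
    y, vy = height_m, 0.0
    path = []
    for _ in range(frames):
        vy += GRAVITY * DT          # gravity changes the velocity
        y += vy * DT                # velocity moves the object
        if y < 0.0:                 # ground contact: reflect and lose energy
            y, vy = 0.0, -vy * bounciness
        path.append(round(y, 3))
    return path

print(simulate_drop(3.0))           # heights an animator would otherwise keyframe
```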

Finally, prominent pioneers of previs and fans of videogames such as Lucas and Peter Jackson seem determined to meld the two, making the emergence of common film and game toolsets almost a certainty. Easy, low-cost learning tools are already available, so students can ready themselves for when these new toolsets arrive in the near future.

Christopher Harz is an executive consultant for new media. He has produced videogames for films such as Spawn, The Fifth Element, Titanic and Lost in Space. As Perceptronics' SVP of program development, Harz helped build the first massively multiplayer online game worlds, including the $240 million 3-D SIMNET. He worked on C3I, combat robots and war gaming at the RAND Corp., the military think tank.
