Janet Hetherington talks to vfx experts about how videogame technologies are influencing movies and TV -- in production, presentation and marketing.
Call it the not-so-little engine that could.
The engine in question is the Unreal Engine, developed by Epic Games, one of various game engines being licensed out to videogame producers. But when Ed Ulbrich, president, commercial division, and evp of Digital Domain, saw Unreal Engine 3 in action in Epic's own Gears of War videogame, he also saw the future.
"Convergence is something that we've been talking about for some time, but with Gears of War it came together," Ulbrich contends. Digital Domain worked on the E3 presentation and the television commercial for Gears of War, and Ulbrich says that the presentation and commercial used "realtime, on engine" game assets. In fact, the commercial was rendered entirely through the game engine on Xbox 360.
The results, Ulbrich says, look better than out-of-game cinematics. "In the past," he adds, "pretty cinematics tended to over-embellish the product. That level of quality set up expectations that the game couldn't meet. For Gears of War, this is not the case." The Gears of War advertising provides gamers with an accurate depiction of game visuals and game play, because actual game assets have been utilized.
While Ulbrich admits that Gears of War represents the early days of convergence, he notes that "very robust videogame engines could be used to produce parts of movies and commercials as well as games."
That newfound versatility is attracting established film and TV directors to the videogame scene. "Game engines are striving to replicate the real world and the fantasy world as realistically and as believably as possible," comments Gary Roberts, vp, Vicon House of Moves/Vicon Feature Unit, whose company also worked on the Gears of War commercial. "To do this, the game engines often find ingenious and very creative ways of cheating to create atmospheric and visual cues that fool the viewer and player into a false sense of reality, and to suspend their disbelief. This technology has many feature film directors excited, since it can be used for previsualization, and for visualizing a feature film shot -- which may be completely live-action in the end -- using game engine technology."
But it's not only engine power that's drawing the interest of directors such as Peter Jackson and Steven Spielberg. Developments in other areas, such as motion capture -- technologies that are equally useful on a movie set -- are also showcased in videogames.
"We are now seeing many game developers hiring directors from film, broadcast and theater to help them direct their story within their cinematic and game sequences," Roberts says. "These -- along with feature film, commercials and TV clients -- are really pushing the envelope, and we have been developing more and more tools and technology to bring the artistry of filmmaking to the motion capture set."
Roberts adds that previs is an aspect of motion capture that is being used more and more. "Often, shoot days for feature film and some games are expensive due to sound stage requirements, audio and video crews and, of course, big-name talent," advises Roberts. "It is important to be as prepared as possible for what is going to be shot. Typically, scripts and storyboards are developed to aid in this process. Now we are able to use motion capture to provide an entire 3D previsualization of shots quickly and efficiently. Simple body motions and virtual camera tracking are used on set to allow directors and the 'creatives' to essentially shoot the scenes in simple 3D to provide blocking and continuity. This also allows agencies and the creative directors to sign off on shots prior to the actual film or game shoot itself."
Roberts explains that virtual camera tracking (allowing the director and DP to frame their shots in the virtual world in real time), real-time previs of characters and sets during actual capture, immediate playback of captured scenes, and direct feeds into editing suites -- so the director can begin to cut scenes together on set -- are all new tools that many feature film and game directors find especially exciting.
"We are also advancing our tool sets in post-production to be able to process the large data sets now being captured (250 markers per actor) in a more efficient and automated manner," he says. "This allows us to reduce costs and manpower in post-production, and more importantly pass on these cost savings to our clients."
Vicon House of Moves has brought its motion capture expertise to such videogames as Tom Clancy's Splinter Cell Double Agent, Rise of Nations: Rise of Legends, Tiger Woods PGA Golf 2007 (in association with Electronic Arts and its motion capture team) and Guitar Hero II, by Harmonix (all motion captured for CG characters, band members, singers and the crowd). "For this project, we engineered and developed a custom beat box that derived the beat of any music track, and captured this as a sequence of flashing lights, delivered as a 3D element, to ensure all motions are synchronized within the game engine," Roberts says.
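The article does not describe how the custom beat box actually derived the beat, but the general idea -- flagging moments where a track's energy spikes so that motion can be synchronized to them -- can be sketched. The following is a hypothetical illustration of simple energy-based beat detection, not House of Moves' implementation; the function name, window size and threshold are all assumptions.

```python
# Hypothetical sketch of energy-based beat detection. A window whose energy
# jumps well above the recent average is treated as a beat -- a crude stand-in
# for whatever the actual beat box did.

def detect_beats(samples, sample_rate, window=1024, threshold=1.5):
    """Return times (in seconds) of windows whose energy exceeds
    `threshold` times the running average energy."""
    beats = []
    history = []                         # recent per-window energies
    for start in range(0, len(samples) - window, window):
        frame = samples[start:start + window]
        energy = sum(s * s for s in frame) / window
        avg = sum(history) / len(history) if history else energy
        if energy > threshold * avg:
            beats.append(start / sample_rate)
        history.append(energy)
        if len(history) > 43:            # keep roughly 1 second of history
            history.pop(0)
    return beats
```

Fed a track, the resulting beat times could drive a flashing light (or any other 3D element) so every captured motion shares one clock.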
"Now that House of Moves is the production arm of Vicon, we also have the capability to capture the type of full-performance content that was used in Monster House," Roberts continues. "Many game companies are demanding this type of capture and animation service for upcoming next-generation consoles, such as the PS3 and Xbox 360. The power of these consoles is allowing games to display multiple high-resolution characters, which are more believable than ever."
In November 2006, Vicon announced the sale and installation of a 48-camera Vicon MX motion capture system to Activision Inc. The system, which provides Activision, its developers and co-developers with dedicated in-house motion capture capabilities, has already been used to complete major Activision titles, including Call of Duty -- with the Activision team more than doubling its capacity for producing character animations for cinematics and game play.
"One of the biggest examples of how the gaming industry has driven 3D technology such as motion capture is with economy and efficiencies," Roberts suggests. "Most games these days require more motion capture and 3D animation than any film. This has driven our technology and pipelines, not to mention service, to be as efficient and cost-effective as possible. This in turn has enabled productions like Monster House, Barnyard and Happy Feet to take place within reasonable budgets. Without this, we would not have been able to get through millions of seconds of motion in a year for all our clients."
Chicken or Egg
For videogame producers, necessity is often the mother of invention. However, sometimes the technology already exists and finds the need, and other times the need demands the creation of the technology.
"Creativity can come from anywhere," notes Rudy Poat, creative director, EA Vancouver. "It could be triggered by a new tech idea, a sound, an image, etc. I find that the best ideas seem to be completely accidental and random in their initial finding. Happy accidents that are formed when connecting boxes that shouldn't be connected."
Industrial Light & Magic (ILM) developed its portable, versatile Imocap motion capture system and software to bring soundstage quality to the motion picture set. ILM compositing supervisor Eddie Pasquarello explained to attendees at the ADAPT 06 conference in Montreal that Imocap was developed to solve problems during the shooting of Pirates of the Caribbean: Dead Man's Chest. The flexible system allows actors to wear specially designed caps and suits for tracking movement, right on set. "With Imocap, there are no camera or lighting restraints," Pasquarello says.
For the Pirates sequel, actor Bill Nighy, who played Davy Jones, acted freely on set wearing the Imocap technology. Once the motion was captured during filming, digital characters such as Jones could be built, with the aim of achieving the most realistic effect. Because of the computer-generated character's photorealism, many of the movie's reviewers mistakenly identified Nighy as wearing prosthetic make-up.
And then there's universal capture, or UCap -- a new facial imaging technology that combines head scanning, motion capture and video capture to deliver truer-to-life characters for videogames. Electronic Arts used UCap in Tiger Woods PGA Tour 07 to reflect Woods' smiling, scowling and other "game faces."
House of Moves' Roberts agrees that motion capture with facial animation is a hot topic in 3D animation. "More precisely," Roberts says, "the technique that describes how to apply motion capture data that has been captured to a 3D character face that may be different to that of the actor -- oh, and to make it believable as a character. This is no easy task, especially to make it as automated as possible, yet still give animators the ability to tweak each shot to get the final performance of the CG character that they need."
Roberts says that Vicon House of Moves is developing such technology, which is already looking promising and is being used on several game productions.
"In essence, there are two sides to encouraging the growth of creative breakthroughs: that defined by productions in the real world, and that defined outside of real productions -- just plain good ideas," Roberts adds.
"Since House of Moves is now part of Vicon, which in turn is owned by Oxford Metrics Group (OMG), which also owns 2D3 and Peak Performance Technologies, we have no end of creative brainiacs in the company who are all coming up with ideas and solutions to problems or demands found in production," Roberts says.
"House of Moves is a great test bed for all of Vicon technology and 2D3 technology," Roberts continues. "2D3 develops high-end camera tracking software which is used in practically every feature film made. At House of Moves, we often identify the latest production capability demands, and we solve them as quickly as we can, given specific productions and time frames. Once these technologies are proved, we turn them into actual products for Vicon systems and software."
Roberts notes that House of Moves also has a working environment that nurtures creative ideas and creative application of technology. "We have a lot of technology within the company and partner companies. Our code base is always open architecture which allows us to share code and hardware between our groups and companies," he says.
When Sony Imageworks wanted to capture an entire film using motion capture, but still give all actors and the director complete freedom, Sony commissioned Vicon to develop the world's first truly scalable motion capture system. That system is now being used in many productions, the latest of which is Robert Zemeckis' Beowulf. "We also developed custom hardware and software to meet the technical and creative needs of the production wrapped around their projects," Roberts says. "This type of work ultimately benefits all of our clients and users."
Another amazing development in videogames is for the game play to be "intuitive" -- to produce non-scripted, non-programmed results during game play. At the ADAPT 06 conference, Chris Williams, project lead, LucasArts, gave a striking presentation on the use of the euphoria technology -- utilizing NaturalMotion's DMS system -- in its Indiana Jones videogame.
"Dynamic Motion Synthesis, or DMS, is our technology to procedurally generate character animation in realtime," explains Torsten Reil, ceo of NaturalMotion. "It is based on biology and robot control theory. DMS simulates the 3D character's body, muscles and -- most importantly -- its motor nervous system. As a result, all DMS characters are fully interactive and never show the same animation twice."
"euphoria uses DMS in realtime on PS3, Xbox 360 and PC," says Reil. "That is, the CPU (such as the Cell chip) simulates the 3D character while the game is running. This means that every time you play the game it looks different. No fighting scene, no football tackle is ever going to be the same. This, of course, is very different from the old approach of just playing back canned animations.
"We have also found that players empathize a lot more with euphoria characters than with canned animation," Reil comments. "When LucasArts showed its Star Wars demo behind closed doors at E3, people actually felt sorry for the Stormtroopers desperately clinging on to each other and the environment!"
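Reil's description -- simulating a body and its "motor nervous system" so no two motions repeat -- is, at its core, physics-based character control. A common building block for that kind of system is a proportional-derivative (PD) controller acting as a muscle. The sketch below is a generic, single-joint illustration of the idea under that assumption; it is not NaturalMotion's DMS, and every name and constant in it is hypothetical.

```python
# Illustrative sketch of physics-based character control: a PD "muscle"
# torques one joint toward a goal angle. Because the motion is simulated
# rather than played back, a different disturbance produces a different
# recovery every time -- the property Reil describes.

def simulate_joint(target, angle=0.0, velocity=0.0,
                   kp=40.0, kd=8.0, dt=0.01, steps=400):
    """Integrate one joint (unit inertia) under a PD controller.
    Returns the angle trajectory over `steps` timesteps of length `dt`."""
    trajectory = []
    for _ in range(steps):
        torque = kp * (target - angle) - kd * velocity   # PD control law
        velocity += torque * dt                          # semi-implicit Euler
        angle += velocity * dt
        trajectory.append(angle)
    return trajectory
```

In a full system, hundreds of such controllers would be coordinated by higher-level behaviors (balance, reach, brace), which is where the apparent intelligence of the motion comes from.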
Another of NaturalMotion's innovations is a technology called morpheme. "morpheme is not based on DMS," Reil notes. "Instead, it's a contemporary animation engine that is backed up by a powerful tool (called morpheme:connect). We developed morpheme to give animators creative control over the look of their animations in game (such as the blending, transitions, compression, etc.). Usually, this last bit of the pipeline is controlled by programmers, but animators have told us they really want to do this themselves -- ideally in a graphical user interface. So that's why we built morpheme. The upshot is that in-game animations look a lot better, and meet the standards that the source animation clips set."
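The blending and transitions Reil mentions are the bread and butter of a runtime animation engine. As a generic illustration only -- the pose format and easing curve here are assumptions, not morpheme's actual data structures -- a crossfade between two poses might look like this:

```python
# Minimal sketch of runtime animation blending: interpolate between two
# poses (dicts of joint name -> angle), easing the blend weight with a
# smoothstep curve so the transition has no visible "pop" at either end.

def blend_poses(pose_a, pose_b, weight):
    """Linear blend of two poses; weight 0.0 gives pose_a, 1.0 gives pose_b."""
    return {joint: (1.0 - weight) * pose_a[joint] + weight * pose_b[joint]
            for joint in pose_a}

def crossfade(pose_a, pose_b, frames):
    """Yield the frames of a transition from pose_a to pose_b."""
    for f in range(frames + 1):
        t = f / frames
        w = t * t * (3.0 - 2.0 * t)      # smoothstep easing
        yield blend_poses(pose_a, pose_b, w)
```

Exposing parameters like the easing curve and transition length in a graphical tool, rather than in programmer-owned code, is precisely the control Reil says animators asked for.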
The development of DMS and morpheme did not come overnight. "With DMS, we tried to look five years ahead to meet the anticipated need for believable interactive characters, and have sufficient computing power," Reil says. "With morpheme, on the other hand, it was much more about talking to a lot of animators and developers, and understanding how they want to work, and what's annoying them about the old ways.
"With regard to funding, this comes from revenues from our products, as well as private equity (from Benchmark Capital, the venture capitalist behind eBay)," Reil advises.
While NaturalMotion's products are primarily aimed at the videogame industry, Reil sees the potential of its applications in film and TV. "Our goal with our runtime technologies is to achieve post-production quality, at realtime speed. As such, both euphoria and morpheme's tech are likely to find their way into vfx in one way or another," he says. "For example, by combining DMS with morpheme's authoring capabilities, you could imagine a vfx tool that allows directors to create scenes at previs speed, but at such high quality that the data can be carried all the way through the pipeline to post-production."
Over at EA Vancouver, Poat says that he is working on next-generation projects. "I'm part of a small highly creative team putting together new game ideas. It will be very rich, story-driven content married with innovative game play." Artificial intelligence is one area being explored. "The same logic that drives a lot of game tech drives the initial system," Poat adds. "The standard simple AI, like path-finding and state machine logic... we've been talking with partners to expand the logic, adding neural net technology, etc.
"On the artistic end, John Gaeta [and I] have been developing a new interactive experience, taking our realtime project a step further than the material we created for Trapped Ashes -- hopefully pushing the artistic boundaries of what can be visually stimulating and completely immersive."
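The "standard simple AI" Poat refers to -- path-finding and state-machine logic -- is well understood. As a generic illustration (not EA's code; the grid and costs are invented), grid path-finding is commonly done with A* search:

```python
# Generic A* path-finding over a 2D grid of 0 (open) / 1 (wall) cells,
# using a Manhattan-distance heuristic -- the textbook form of the
# "simple AI" that moves game characters around a level.
import heapq

def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]   # (priority, cost, cell, path)
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                heapq.heappush(frontier,
                               (cost + 1 + h((nr, nc)), cost + 1,
                                (nr, nc), path + [(nr, nc)]))
    return None
```

The neural-net extensions Poat alludes to would layer learned decision-making on top of deterministic building blocks like this one.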
Poat and Gaeta both worked on The Matrix, and Gaeta won an Academy Award for his effects, including "Bullet Time." Trapped Ashes is the duo's first experiment with interactive realtime cinema; they built a way to deliver HD frames that were film-ready the minute they came out of the box, in realtime, with compositing done in-engine.
The shots also run on a server, so, on a network, a cameraman could log in and film in realtime, while another person could log in as a lighter and move the lighting around while the cameraman takes pictures. In effect, several people can log in and work on the film simultaneously.
"We want to create a shared experience, where there's pure collaboration and artistic expression," Poat offers. "The Trapped Ashes stuff was the very beginning, a baby step, where artists can converge and create a visual shared experience."
Today's videogames also benefit from advanced delivery systems. "PS3 and Xbox 360 are the first generation of platforms that have enough power for procedural content generation, whether it is motion, textures, sound etc., so content doesn't have to be canned but can be created on the fly," comments NaturalMotion's Reil. "Secondly, online distribution is going to allow more innovative developers to take risks without betting the farm. I think we'll see a lot of new ideas in the next few years."
This November, one year after its launch, Microsoft's Xbox 360 has begun offering an initial line-up of TV shows and movies to gamers in the U.S. via Xbox Live, Microsoft's online games and entertainment network. Gamers will have access to hundreds of full-length TV shows for download to own, and movies for download to rent, from CBS, MTV Networks, Paramount Pictures, Turner Broadcasting System Inc., Ultimate Fighting Championship and Warner Bros. Home Ent., with more content rolled out through Xbox Live Marketplace every week. Xbox 360 is the first gaming console to offer standard and high-definition TV shows and movies via digital distribution.
November 2006 also saw the release of both Nintendo's new Wii videogame system (which, in addition to innovative game play, promises a gateway for players and non-players alike through a collection of interactive channels), and Sony's PlayStation 3. Videogame developers must constantly keep abreast of new platforms to make best use of their attributes.
"I see both hardware and software complementing each other to meet expectations," says House of Moves' Roberts. "Not only do new game consoles provide a platform that can deliver amazing visuals and sound treats, hardware is becoming more important for controllers and ways of interacting with games. We are already considering using game engines for visualizing character animations and environments in real time as we capture on set.
"Projects are becoming more emotionally driven. Storytelling, visual fidelity, atmospheric audio and great animation from motion capture are creating more emotive pieces for players to get involved in. Drawing the player into the characters and story enables you as a game designer to begin to control the emotion of a player -- something that films have been doing for decades. It's tough to do with games, because the player is physically and mentally interacting.
"Interestingly, it is this interaction that is being embraced to enable a higher draw into a game -- interactive technologies such as the EyeToy and the Wii controller are good examples. Motion capture will be mainstream one day; it already is in a sense with the EyeToy, but it will become more so in the next few years, as games strive to immerse the player completely. We are seeing plot and storyline becoming as important to the game developers as the actual game play itself. The advent of game developers hiring directors and DPs is a good sign of the future ambitions that game developers and game players have."
Videogames are often used to support new film releases and franchises, but they are also a recognizable force in their own right. "The game industry is worth about $20 billion globally (not counting hardware sales), and keeps growing very fast," says Reil. "A typical AAA next-gen production is around $10 million, but this can go up to $20 million or even $30 million."
As videogame technology sees convergence with film and TV, so does related marketing strategy. In the past, there was a lapse between the release of a film and any related videogame. Now, with convergence, products can be marketed concurrently.
"Imagine you could do it all at once," muses Digital Domain's Ulbrich. "You could have the theatrical release, the DVD and the game come out separately -- or all at the same time. There are choices in the way that things can be marketed."
Ulbrich contends that convergence -- using the same common digital assets or engine -- means that a game and a movie can look substantially the same, and the products could even be marketed before the game or movie was finished.
Ulbrich says that Digital Domain is already using the convergence lessons learned with Gears of War. "This is a fundamental business strategy at Digital Domain," he remarks. "We're already looking at using videogame engines for creating movies and games simultaneously."
Other experts agree that videogames are no longer the poor cousins to the powerhouse media of television and film. "In the past, videogames based on film intellectual property were rarely good, but today that has changed," says Reil. "Equally, game and film technology are growing closer as the former catches up with the fidelity requirements of the latter. And lastly, you now get game-based films (like Tomb Raider), or even film styles borrowing from games (like The Fast and The Furious)."
"Both videogames and film are providing mediums to entertain. Both are beginning to have similar production values," comments House of Moves' Roberts. "Games are beginning to benefit from the production mentality and efficiencies found in the film industry and vice versa. Games are benefiting from storytelling tools and practices found in traditional filmmaking.
"I cannot imagine what the drawbacks would be with such a convergence. I used to work in the games industry back in the '90s with Electronic Arts. I remember a similar question coming from the press at the time of the first Wing Commander being released. The answer back then was different, because gaming technology was not as mature or as capable as it is now. The right elements of filmmaking are coming through into games, due to the power of the consoles. Visual and audio fidelity can finally be created with the impact and believability seen in films. This has led to cinematics being more reactive to players' actions, and thus delivering what we all wanted to deliver with Wing Commander, but at the time, didn't quite hit the mark.
"Our attitude is to embrace such convergence," Roberts says. "We can only learn from a parallel industry."
Janet Hetherington is a freelance writer and cartoonist. She shares a studio in Ottawa, Canada, with artist Ronn Sutton and a ginger cat, Heidi.