In this month's edition of "The Digital Eye," Autodesk's Maurice Patel explores how gaming is driving VFX and Design Visualization through performance, interaction and immersion.
The games industry has exploded to become one of the fastest-growing sectors in entertainment, with $18.85 billion in U.S. sales alone in 2007. Games software sales grew by 28% last year, and it’s remarkable that Nintendo’s market capitalization is now comparable to Disney’s, at approximately $60 billion.
In addition to this explosive growth, gaming is now challenging film and television as a key driver of innovation in computer hardware and software, mainly because of the significant challenges of creating the next generation of high-resolution, realtime, interactive games. As a result, we’re starting to see gaming technology applied to other content creation sectors, most notably visual effects for television and film, and architectural visualization.
The requirements of the games industry are driving innovation in three key areas: performance, interaction and immersion. But what exactly are these requirements and how are they driving innovation across the visual effects and visualization industries?
Any game worth its salt needs to run in realtime, and today, with the Xbox 360 and PlayStation 3, it has to do so in high definition! This requires rendering large, complex 3D scenes at very high resolution, at 30-60 frames per second, and that in turn requires a whole lot more hardware horsepower.
The games industry is becoming the beachhead for advances in computer technology. Take the Sony PlayStation 3 -- it is a remarkable example of a low-cost, high-performance graphics system. The heart of the system is the Cell processor, a multi-core chip that pairs a Power processor core with eight Synergistic Processing Elements (SPEs) and helps the system deliver a claimed 2 teraFLOPS -- roughly 20 times the performance of a dual-core workstation processor. With transistor gates approaching atomic dimensions, clock speeds can’t increase much further.
Consequently, multi-core parallel processing will be the future of computer performance in all graphics applications. Similarly, the needs of the games industry are driving rapid innovation in graphics processing units (GPUs) and architectures. The need to render 3D in realtime and at high quality is a key driver of the current advances in both consumer and professional solutions for digital content creation. Recent acquisitions such as AMD’s purchase of ATI, Intel’s purchase of Havok, NVIDIA’s purchase of Ageia and Autodesk’s announcement of its intent to acquire Kynogon further illustrate the strategic significance of games technology and graphics performance to the future of computing.
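The multi-core idea rests on the fact that rendering decomposes naturally into independent chunks of work. As a minimal sketch (not any real renderer's API -- `shade_tile` is a hypothetical stand-in for per-pixel shading work), a frame can be split into tiles that workers shade concurrently:

```python
# Sketch of tile-parallel rendering: split the frame into independent
# tiles and shade them concurrently. shade_tile is a placeholder for
# real per-pixel work; a production renderer would keep one worker
# busy per hardware core.
from concurrent.futures import ThreadPoolExecutor
import os

WIDTH, HEIGHT, TILE = 64, 64, 16

def shade_tile(origin):
    """Shade one TILE x TILE block; here just a placeholder gradient."""
    x0, y0 = origin
    return [(x + y) % 256 for y in range(y0, y0 + TILE)
                          for x in range(x0, x0 + TILE)]

def render_frame():
    tiles = [(x, y) for y in range(0, HEIGHT, TILE)
                    for x in range(0, WIDTH, TILE)]
    # Tiles share no state, so they can be shaded in any order, in parallel
    with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(shade_tile, tiles))
```

Because the tiles are independent, throughput scales with the number of workers -- which is exactly why eight SPEs, or many GPU shader units, pay off for this kind of workload.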
Interaction is another key requirement for games and a significant driver of innovation. Products such as the Nintendo Wii and Harmonix’s Rock Band are significantly changing the demographics of the average game-player, as well as the way we think of games as an entertainment experience. These and other changes are challenging classic computing paradigms. While their applications are only marginal in the visual effects business today, they are part of a broader shift to more interactive solutions. Even today, at the bleeding edge of visual effects, we are seeing innovative uses of technologies such as Autodesk MotionBuilder to provide on-set interaction to directors such as James Cameron, who want to direct CG characters in real and synthetic environments (as on Avatar). In fact, the holy grail of effective direction of CG characters is on a convergent path with gaming technology.
Finally, the need to create a truly immersive game experience is another key driver of innovation. Immersive experiences require suspension of disbelief. The participant, whether playing a game or watching a movie, needs to engage with the content in a meaningful way. And there are many technical obstacles to immersion, including the difficulties of photorealistic rendering, the uncanny valley phenomenon (believable digital characters are hard to create) and the inherent complexity of even simple dynamic systems (fluids, for example).
Today’s technology is far from being able to solve these problems easily. Significant innovation is still happening in this area, both in film visual effects production and in games authoring. And it is not obvious that these problems will ever be fully solved algorithmically.
One key area of innovation has been games middleware, which is also starting to find broader applications. Middleware simplifies complex problems by providing a high-level architecture that can solve specific simulations, from physics to path-finding and animation. The advantage is that the solution is ready-made: the user does not have to program complex behaviors from scratch. Instead, the solution can be applied directly to the elements of the game, so that the behavior the artist creates in the authoring environment executes identically at runtime.
Ubisoft recently won the Academy of Interactive Arts and Sciences award for Outstanding Achievement in Animation for its game Assassin’s Creed. Ubisoft created more than 12,000 programmed animations for the hero character, Altair, to make him more realistic. Using Autodesk’s HumanIK middleware, they were able to ensure that his hands and feet behaved realistically when he runs, fights, climbs and even rides a horse. Middleware can also solve challenging artificial intelligence (AI) problems when applied to characters, enabling them to behave in more complex and believable ways.
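At the core of keeping hands and feet planted is inverse kinematics (IK): given a target point, work out the joint angles that put the end of the limb there. As a toy illustration of the principle -- a 2D two-bone solver using the law of cosines, far simpler than what full-body middleware such as HumanIK actually does -- consider:

```python
# Toy 2D two-bone IK: find shoulder and elbow angles that place the
# "hand" on a target while bone lengths stay fixed. Real middleware
# works in 3D with full-body constraints; this only shows the core idea.
import math

def solve_two_bone_ik(l1, l2, tx, ty):
    """Return (shoulder, elbow) angles in radians reaching (tx, ty)."""
    d2 = tx * tx + ty * ty
    # Elbow angle from the law of cosines; clamp handles unreachable targets
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: aim at the target, corrected for the elbow bend
    shoulder = math.atan2(ty, tx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def hand_position(l1, l2, shoulder, elbow):
    """Forward kinematics: where the hand ends up for the given angles."""
    hx = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    hy = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return hx, hy
```

With both bones of length 1, solving for the target (1, 1) and running the angles back through `hand_position` lands the hand on (1, 1) -- the same closed loop that lets an engine pin Altair’s feet to a ledge no matter how his body moves.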
Middleware is making significant progress in both capability and realism. As it continues to improve, we will see it used more and more in other areas, such as visual effects. Already, many film visual effects facilities (such as ILM) are starting to experiment with games pipelines and to develop the core expertise that future production pipelines will require.
Gaming and Filmmaking
Innovators such as Cameron and Lightstorm are using game-like concepts and technologies in filmmaking to enable new forms of creativity. There is strong interest in the potential of this technology to make virtual cinematography a reality for the director on set, and not just the post-production process it has typically been since John Gaeta’s work on The Matrix. Cameron’s upcoming movie, Avatar, is a benchmark in its innovative use of live action and virtual production techniques, including significant interaction between live and CG actors.
Traditionally, it has been impossible to direct the acting of a CG character on set. This direction typically took place during the visual effects phase, causing inefficiencies and creative bottlenecks and giving the director less creative control over a CG actor’s behavior. As a result, a lot of expensive VFX shots can end up on the cutting room floor. In the case of Avatar, Lightstorm is revolutionizing this process with a unique workflow developed on MotionBuilder, which allows Cameron to simultaneously direct both the human and the digital actors on set and review their interaction in realtime.
The ways in which we create media are changing not only in film, but also in design visualization, and gaming technology is central to that shift. It is affecting how architects and designers explore, validate and communicate their ideas.
Gaming and Architectural Visualization
In many ways, architects and designers are becoming digital filmmakers too. As 3D graphics technology becomes more accessible and prevalent, more and more designers and architects are using it to express their ideas with greater impact. It’s far easier to communicate a building design to a client when it is in context -- e.g., within a landscape or cityscape, complete with people and cars -- just as it would appear in the real world. And the use of animation to communicate the passing of time is an even more powerful communication and analysis tool.
Using animation techniques, designers and architects can walk their clients through buildings, explore how light varies in a structure as the day passes and even show how people will interact with the building. However, while rendered animations are a passive experience for the client, games technology can make the whole experience interactive. Clients can “walk” through, experience and interact with designs freely, the same way they would interact with a videogame.
HKS Architects licensed Epic’s Unreal Engine for exactly this purpose. They were able to walk their clients through buildings created in Autodesk Revit. Clients can control the point of view and navigate their way around the building in realtime using a joystick. HKS used the Unreal Engine to create a fully explorable, reconfigurable 3D model of the new Dallas Cowboys Stadium.
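The essence of that realtime navigation loop is simple: every frame, the joystick axes turn and translate a first-person camera. A minimal sketch (illustrative names and a fixed timestep, not any particular engine’s API):

```python
# Minimal first-person walkthrough camera: each frame, joystick axes
# (values in -1..1) turn the view and move it along the current heading.
# The fixed 1/60 s timestep stands in for a real engine's frame clock.
import math

class WalkthroughCamera:
    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y, self.heading = x, y, heading

    def update(self, forward_axis, turn_axis, dt,
               move_speed=2.0, turn_speed=1.5):
        """Advance one frame of navigation from joystick input."""
        self.heading += turn_axis * turn_speed * dt
        self.x += math.cos(self.heading) * forward_axis * move_speed * dt
        self.y += math.sin(self.heading) * forward_axis * move_speed * dt

cam = WalkthroughCamera()
for _ in range(60):                  # one second at 60 frames per second
    cam.update(forward_axis=1.0, turn_axis=0.0, dt=1 / 60)
```

Pushing the stick forward for one second at a walk speed of 2 m/s moves the camera 2 m along its heading -- the client steers, and the engine re-renders the Revit model from the new viewpoint every frame.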
The games industry is a significant driver of innovation in computer hardware, software and 3D animation. Autodesk is in the unique position of being able to blend the best practices from game development, film production and design visualization, putting it at the center of 3D innovation across entertainment and design sectors.
As the lines across content creation disciplines are further blurred, it will be very exciting to see the growing role that realtime engines, gaming middleware and character tools play in delivering efficiencies and creative flexibility to filmmakers and architects alike.
Maurice Patel is senior industry manager for Autodesk’s Media & Entertainment division. He is responsible for the company’s presence across the film, television, games and design visualization segments.