The Game Developers Conference (GDC) this year took place in San Francisco, at the Moscone Convention Center; there are now so many people attending this event that it has probably outgrown the smaller San Jose Convention Center forever. There were two show floors, a huge area devoted to recruiting, and special segments for Serious Games, mobile games, casual games, women's games, independently produced games and God knows what else, with energetic streams of people moving nonstop between them all.
The explosive growth of the GDC is in stark contrast to the shrinkage of the E3 (Electronic Entertainment Expo), which is imploding this year. We can only hope that the GDC does not absorb the culture of the E3 -- an event all about the company presidents, a big-money slugfest between major gaming companies that emphasized flagship franchises and profit centers and seldom mentioned the members of the development teams who worked so hard on those games.
As of right now, the GDC still retains much of the charm of the days when it was called the Computer Game Developers Conference. This is evident in the awards shows, where the audience may cheer an indie game that's self-distributed more loudly than a blockbuster for a major game platform.
Although it's almost impossible to give a comprehensive review of the whole GDC, with its many different tracks and special interest groups, here are some snapshots from the show to give you an idea of status and trends in the industry.
The two main keynotes, with crowds waiting outside for blocks to fill the 5,000-seat auditorium, were given by Phil Harrison of Sony Computer Entertainment (SCE) and Shigeru Miyamoto of Nintendo. They were emblematic of the state of the gaming industry, and of where two of its major players see it heading.
The Sony presentation was the more colorful one, starting out with giant soccer balls released upon the audience, which promptly formed teams and played a game with them, scores displayed on monitors. The point of all this soon became apparent: introduce a game into a social setting, and communities form to play it.
The purpose of the presentation was to introduce what Harrison calls the "Game 3.0" concept, a sequel to Wall Street's "Web 2.0" moniker (which refers to social phenomena such as YouTube, MySpace, Amazon and Wikipedia). Game 3.0 stands in contrast to "Game 1.0," which consisted of individual, static, disconnected games -- games played by players who had no influence on the shape or format of gameplay. What Game 3.0 is about is audience participation and "emergent" gameplay, according to Harrison. "This is about the connected device," he said. "Such devices are built on open standards and powered by active communities. This is about community, about collaboration."
A slide entitled "Game 3.0" appeared on a screen, surrounded by terms that sound like they belong in a college sociology class more than at a game developers conference, terms such as community, social content creation, localization, customization, service. Harrison went on to explain how game players would interact with a game environment, customize it and make it their own.
Emergent entertainment is becoming a buzzword of the game community, along with SN (Social Network) references. What enables all of this is Sony's new PlayStation Home online system, which is clearly meant to compete with Nintendo's and Microsoft Xbox's online services, and which bears a remarkable similarity to Second Life, a persistent world that allows members to create, trade and sell 3D virtual items such as clothing, cars, buildings and customized avatar bodies, faces and movements. Sony's world is intended to be for more than PlayStations -- the intent is to make it accessible to other platforms such as the PSP as well as PCs, and even to competitors' offerings such as the Nintendo Wii.
Harrison demonstrated the many options for customizing a player's avatar, including clothing and an apparently limitless number of faces. Advertising will play a large part in Home. There is an option for voice communication, though the voices sound more computer generated than natural. Home also includes a games lounge, for players to meet and get to know each other while playing casual games together, such as pool, bowling and arcade games. Harrison pointed out that the arcade games are user definable, meaning that the player can choose preferred games and download them into the space. Users can also have private spaces of their own -- every user can have an apartment that he/she can customize by changing wallpaper and adding furniture (some of it is free, some is linked to particular games, and some is labeled as "premium" items which can be purchased from the Sony store).
Content can move between the real world and the Sony Home world. A photograph, for instance, can be transferred via memory stick to a PlayStation 3, and then into a Home apartment, to appear inside a picture frame on the wall. As is the case in Second Life, real estate is a booming business: larger apartments can be bought, with features such as pool tables and television sets that can display either prerecorded video or content that the user has created. There are also in-world tools for creating games, and sharing those games with others. Just as YouTube has an area that shows each day's most popular video clips, Sony Home has an area that shows the currently highest rated user-created games.
Modding a game (changing the game's characters and environment) has been around for quite a while -- many expert players are so used to modding games, especially First Person Shooters, that they refer to un-modded games as "vanilla" versions, or "V" for short, such as VQ3 (Vanilla Quake 3). However, modding in the past has required a goodly amount of skill. What Sony is proposing with its Home is a new possibility: modifying games, or even creating brand new ones, by communities of ordinary players, to express their creativity -- and have an excuse for getting together. Planned future features include pets and music/singing areas. This should be a fun development to watch.
Nintendo also had a flashy presentation. Shigeru Miyamoto's keynote address heavily referenced the company's wildly successful Wii, with its unique controller that captures player movements and translates them into in-game movements, such as the swing of a golf club. Miyamoto stressed that designers need to get to know their audiences better, and that they keep making the same mistakes because they are too wrapped up in their own vision of what a game should be like. He noted that one common assumption is that users only like violent games. Nintendo has been trying to broaden its audience recently, to make games that women also enjoy. Miyamoto spoke of Nintendo's toolset for customizing avatar faces and other features, what is called the "Mii" experience; the company plans on creating a Wii channel that lets people share and compare their creations. He explained that he has tried out all of his recent game designs on his wife, to determine what women might like, and proudly noted that his "Wife-O-Meter" has scored very highly with recent Nintendo releases such as Ocarina of Time and Brain Age.
Toolsets for Game Producers
With ever greater levels of detail in each game, but no more time to produce them, more and more attention is being paid to improving the production pipeline, according to Rob Hoffman of Autodesk, the home of Maya (version 8.5), 3ds Max (version 9) and other game toolsets. One major improvement in Maya is support for the popular Python scripting language, which is widely used for vfx in the film industry. While Maya retains its own scripting language, MEL (Maya Embedded Language), the software now works equally well with either MEL or Python. "Having Python support available in Maya means studios can use their existing tools directly within Maya, rather than having to write glue code to bind Maya to their pipelines," Hoffman noted. "This allows them to develop new node and command plug-ins in a fraction of the time they might have needed otherwise."
A major focus of Autodesk this year is interoperability -- the ability to move assets created in one software package into a workflow built around another. Although the industry has promised "seamless" interchange between different formats in the past, this has never really been the case in actual production environments. Autodesk is working very hard to turn this around.
One approach the company suggests for interoperability is to use FBX, its universal file exchange format, which offers easy import and export not only between different Autodesk products, but also with other toolsets, such as XSI and LightWave, that game producers might prefer. "This is really important for today's workplace, which often involves company teams spread around the globe," said Hoffman. Another improvement in the company's toolsets is support for Japanese in addition to English, for the many game development houses in Japan that prefer to work in their native language.
Best Games of the Show
So, which game was the best one at this year's show? Each year the IGDA (International Game Developers Association) holds an open nomination program, letting game development teams be judged by their peers, and gives out its Game Developers Choice Awards. The game chosen as best this year was Gears of War (GOW for short), developed by Epic Games for Microsoft Game Studios under the team of Cliff Bleszinski, Michael Capps and Rod Fergusson. GOW also took the awards for best technology and best visual arts. It was emblematic of the GDC that the game designers were honored specifically, rather than just the game publisher -- and that one of those designers used all of his time on stage praising the other four nominated games (Okami, The Elder Scrolls IV: Oblivion, The Legend of Zelda: Twilight Princess and Wii Sports), something you are not likely to see at the Oscars.
The best audio award went to Guitar Hero again, this time for its version II incarnation, produced by Harmonix Music Systems for RedOctane. The best game design award went (with wild cheers from the audience) to Wii Sports, by Nintendo.
The Independent Games Festival awarded prizes for games created with more modest budgets but received just as enthusiastically by the audience. Bit Blot's dreamlike 2D underwater adventure game Aquaria won top honors, as well as a $20,000 prize -- big bucks for a small indie team. The awards for design innovation and for audio both went to Everyday Shooter, by Queasy Games, and the prize for technical excellence went to Bang! Howdy, from Three Rings. The audience award went to Castle Crashers, while the best student game prize went to Toblo, from the DigiPen Institute of Technology.
A relatively new award in the industry, but reflecting a growing area of interest, was the best mod game prize, won by Cut Corner Company Productions for its corporate office adventure game, Weekday Warrior, a modded adaptation of the popular Half-Life 2 game.
It was hard to tell the winners from the losers at the after-hours parties, which appeared to have lower budgets than previous years, but the same high spirits. Partygoers from studios large and small sat side by side as equals, consuming both of their major food groups (Pepsis and pizza) and discussing every conceivable topic related to game creation as if their lives depended on it.
Cool New Stuff
Location Based Games, or Location Based Services (LBGs or LBSs), are an interesting new category. These games use actual locations as the gaming environment, and the player has to physically move his/her body, along with the mobile gaming device, to get around the game area. Could this be the end of the couch potato? "Location based games are a great way to get out and explore a town or environment, and to get some exercise," said Alex Tikhman, the founder of Tik Games, which has two such games, Jewel Chaser and Geo Universe, on the market. Autodesk is a major supporter of location based applications, with a corporate group, Autodesk Location Services, dedicated to games and other applications in this area. The company offers its developers a wide range of services to help them get into the market, including locating players and points of interest (via triangulation from cell phone towers), downloads of maps, aerial photos and other geographical information, and middleware for applications.

One application offered, for instance, is Chaperone, which allows a subscriber such as a parent to draw a circle, called a "fence," around an area of interest such as a school. When a child crosses the fence, the parent is notified via an SMS message that the kid is now approaching the school; when the child crosses the fence leaving school, the parent gets another message. The same basic system could obviously also be used for potential customers approaching a restaurant, for visitors approaching a tourist attraction, or for game players approaching an objective in a gaming environment.
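The fence mechanism described above can be sketched in a few lines of code. To be clear, this is a hypothetical illustration and not Autodesk's actual API: a circular "fence" plus a stream of position fixes, with the standard haversine great-circle formula deciding inside/outside (the function names `haversine_m` and `fence_events` are my own inventions).

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    R = 6371000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def fence_events(fence_lat, fence_lon, radius_m, fixes):
    """Given a circular fence and a sequence of (lat, lon) position fixes,
    return 'enter'/'exit' events whenever the boundary is crossed."""
    events = []
    inside = None  # unknown until the first fix arrives
    for lat, lon in fixes:
        now_inside = haversine_m(fence_lat, fence_lon, lat, lon) <= radius_m
        if inside is not None and now_inside != inside:
            events.append("enter" if now_inside else "exit")
        inside = now_inside
    return events
```

In a real service, each "enter" or "exit" event would trigger the SMS notification to the subscriber; the same loop works unchanged for a restaurant, a tourist attraction or a game objective.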
Another interesting development is Organic Motion's new motion capture system, an unusual MoCap setup that uses no markers (no magnetic points, white puffballs or lights) and no body suits on the person whose motion is being recorded. Instead, 10 specially equipped video cameras track the entire body in three dimensions, and the resulting complete data can be displayed in realtime. The advantage of tracking a whole form, instead of points on its surface, is that there is no complicated calculation needed to digitally interpolate what the actual body was doing -- a painful process that often takes a long time, is fraught with error and inevitably leads to re-takes.
"You don't have to have an engineer to run this system, you can have your artist run it," said Andrew Tschesnok, Organic Motion's CEO. Indeed, during the demo, held next to a staging area with a white background, with cameras connected to a desktop workstation and display, an animator was both capturing the action and directing the model through different and subtle changes in movement. The model was in ordinary street clothes -- no need for the black mime-like suits and precise reflective marker positioning we have all grown familiar with. The system is first calibrated to recognize the form (e.g., a human); it then defines natural joints and edges on the body and tracks those, rather than dots or other markers -- all of which takes a couple of minutes. Once the person started moving, the system streamed the data directly into Autodesk's MotionBuilder, and the display of the 3D moving digital form was essentially realtime. In addition to capturing pure motion, the system can also capture meshes and surface textures.
"The ability for an artist to direct the session, and make changes on the spot instead of hours or days later, will not only save time and money for the production pipeline, but also allow greater artistic freedom," noted Jonathan Rand, Organic Motion's president. "The ability to do on-location 'What if we did this?' explorations in realtime will enable the artist to come up with creative avenues that would not be possible when he or she is physically removed from the MoCap process."
Watching the 3D capture of a normal-looking person with this system, one can visualize a future application like that in Neal Stephenson's novel The Diamond Age, in which a person is mocapped in real-life situations and the mocap data is immediately transferred into a virtual world, allowing the person's avatar to interact in a very personal way with ongoing games or machinima-type movies. Organic Motion's system should be a real boon to entertainment and scientific MoCap applications. It will be available for around $80K in September. The present system MoCaps only one body; multi-body capability is planned for next year.
Major Trends in the Game Industry
Last year's article on the GDC quoted Mitchell Davis, CEO of in-game advertising company Massive, a pioneer in this rapidly growing field. Davis is now a happy camper -- his young 80-person company was bought by Microsoft for hundreds of millions, and you can now see his technology being applied throughout the Microsoft gaming universe. In-game advertising, which started with simple billboard ads in car driving games, is evolving into a "fluid" or "seamless" mode, in which advertising material can be applied to almost any surface in the game. For instance, a Coca-Cola ad buy could result in Coke ads appearing in a game space on billboards, on the fronts of vending machines, on characters' T-shirts and on cans of refreshment used in the game. Since such games are online, the ad-covered surfaces can be refreshed as needed, and even customized for certain types of players or their home locations.
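In its simplest form, a "fluid" ad server along these lines maps one campaign buy onto many surface types and player regions, falling back to a default creative when nothing targeted matches. The sketch below is purely illustrative -- the data layout and the names (`pick_creative`, the sample campaign) are my assumptions, not Massive's actual system.

```python
def pick_creative(campaign, surface_type, player_region):
    """Choose an ad creative for a given in-game surface and player region.

    Walks the campaign's targeted creatives in priority order and falls
    back to the campaign-wide default asset when nothing matches.
    """
    for creative in campaign["creatives"]:
        if (surface_type in creative["surfaces"]
                and player_region in creative["regions"]):
            return creative["asset"]
    return campaign["default_asset"]

# A hypothetical soft-drink campaign: one buy, several surface-specific creatives.
cola_campaign = {
    "default_asset": "cola_classic_banner.png",
    "creatives": [
        {"asset": "cola_vending_us.png",
         "surfaces": {"vending_machine"}, "regions": {"US"}},
        {"asset": "cola_billboard_jp.png",
         "surfaces": {"billboard"}, "regions": {"JP"}},
    ],
}
```

Because the game is online, re-running this selection (or pushing a new campaign dictionary) refreshes every ad-covered surface without patching the game itself -- which is exactly the appeal of the fluid model.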
Resolution of game environments and characters keeps getting better, of course, and in some cases leads to unrealistic expectations, especially for facial features, which the human brain is hardwired to recognize in very fine detail. The term "Uncanny Valley," which has been applied to CGI movies and TV shows, is now being heard in gaming as well. The term, introduced by Japanese roboticist Masahiro Mori, refers to the emotional response of humans to computer-created characters such as robots or CG characters in movies. Humans tend to react favorably to somewhat realistic characters, with the subconscious understanding that these are, after all, artificial. But when characters get "almost human," people become very critical, and instead of appreciating the features that have been skillfully created, they tend to focus on anything that is still non-human. Remember films such as The Polar Express: most reviews did not rave about the parts that were almost photorealistic, or about how skillfully facial performance capture had been used, but focused on the "empty eyes" or "unrealistic smiles" of the characters. This zone of repulsive response, aroused by CG characters lying between "barely human" and "almost human," is the Uncanny Valley. It puts an extra strain on game producers, who must either stay with fantasy-driven creatures such as monsters (for which people do not have such high expectations) or try to cross the chasm and produce very lifelike human characters.
One sign of how realistic games are becoming was a standing-room-only lecture entitled "The Imago Effect: The Psychology of Avatars."
Serious Games, games used for learning, continue to grow in importance. Once limited to low-cost productions generated by universities for a few hundred thousand dollars, some Serious Games now get serious budgets of $5-10 million. Two days at the GDC were dedicated to Serious Games -- this will be covered in a future article.
A continuing trend is that strategic partnerships are becoming more and more crucial to staying in touch with emerging game developer needs. Companies such as Intel and NVIDIA, which used to merely supply hardware for gamers, now have staffs that coordinate with major game producers as well as workstation manufacturers. Autodesk, in addition to partnering with hardware makers, has very active liaison teams with EA, THQ, Sony and other game producers.
One sure sign that the lines between films and games keep blurring is the increased poaching of film industry experts by game producers. There is a growing practice of game houses hiring film studio producers, directors, artists and animators. Several sources cited this as a problem for moviemaking in Los Angeles, since so much of the talent is being grabbed by the many game producers in the area.
Another sign of the blurring between games and films is the generation of cinematics (also called "cut scenes") by in-game software. Cinematics are in essence short movies that introduce different scenes or levels in a game; until recently, these were created outside the game with film vfx toolsets and approaches, and were added into the game as pre-rendered scenes. Because of the much higher quality of the cinematics, clips used to promote a game were always drawn from the cinematic sequences. The fact that AAA games such as Gears of War (whose feature film rights have just been snared by New Line) can now create cinematics in-game with game creation tools says volumes about how far gaming software and hardware have come in the last couple of years.
Not that long ago, many games were produced via a single 3D toolset, such as 3ds Max. Nowadays it is common for one production house to use many toolsets, depending on artists' tastes or on what works best for a certain part of the production pipeline. Because translating from the format of one toolset into that of another has invariably resulted in glitches, there is a high-priority effort by major software producers such as Autodesk and Avid to produce seamless means of porting files from one format to another -- it is universally agreed that when this is finally achieved, it will have a major effect on the ability to meet production deadlines.
Gaming continues to expand and evolve in many different directions, with dramatic growth in online, mobile, independent and serious games. As the lines between games and other media such as feature films become increasingly blurred, there is a growing effort to merge the two pipelines -- an effort that still appears problematic to most studio chiefs.
New revenue models are emerging, such as the increasing sale within gaming environments of digital assets such as clothing, furniture and real estate. There is an increasing emphasis on allowing game players to modify or even create their own customized content, in the hope that becoming a stakeholder will increase player retention. Finally, as the gaming field grows, there are more spinoffs from the GDC -- look for an increasing number of other GDC conferences in the U.S. (such as the very dynamic one in Austin, Texas), Europe and Asia.
The heart of the original GDC appears to continue beating strongly in this much larger venue, and the creative spirit of the videogame industry -- and of the GDC -- is alive and well.
Christopher Harz is an executive consultant for new media. He has produced videogames for films such as Spawn, The Fifth Element, Titanic and Lost in Space. As Perceptronics' SVP of program development, Harz helped build the first massively multiplayer online game worlds, including the $240 million 3D SIMNET. He worked on C3I, combat robots and war gaming at the RAND Corp., the military think tank.