Adrian Pennington explores the challenges and possibilities in developing more photoreal games in the U.K. with the new HD platforms.
Once the preserve of PC gamers, high-resolution graphics are a key weapon in the console wars being fought by marketeers at Sony and Microsoft. For artists and players, HD means a generational leap from robotic characters and blocky sets toward cinematic photorealism. For publishers and developers, though, next-generation hardware offers immense possibilities at considerable cost.
"The biggest challenge is one of expectation," admits David Braben, co-author of the legendary PC game Elite and founder of U.K.-based developer Frontier Developments. "The HD display is a magic magnifying glass on what we do. The higher the resolution, the more perfect the graphics have to be, because you can see where the cracks are. HD opens up whole new sets of tools for making games look real. It's an exciting time to be in game development, especially if you're an artist."
David Amor, creative director at Brighton's Relentless Software, agrees: "You could get away with lower polygon counts since CRTs inherently blur lower-res graphics. LCD and plasma screens mean you have to pay greater attention to the underlying models. Where we had skin, we now have to think about pores."
Microsoft expects 160 HD titles for the Xbox 360 to be on release by Christmas. Some, like Gran Turismo HD (for the PS3) and Call of Duty 2, are essentially up-res'ed ports of existing games. Others, like Call of Duty 3 and The Elder Scrolls IV: Oblivion, have been overhauled to take account of the hardware's greater processing power. Epic's forthcoming Gears of War for the 360 is widely anticipated to be the first title to really push the potential of the next generation.
All next-gen games will be produced to 720p, although developers for the PS3 could feasibly build to 1080p -- a true HD capability much hyped by Sony in the run-up to the 360's launch. Progressive scan would produce a smoother image during horizontal pans and movements, but most developers have dropped support for the spec -- initially at least -- because the additional processing time would make all but the most premium projects unmanageable.
According to Rick Gibson of analyst firm Games Investor, "Even with native 1080p, developers can't render complex animations across the entire screen without the whole processor slowing down. If you switch a crowd scene from 720p to 1080p, you'd have to significantly reduce the amount of animation in the shot." Scaling to 1080i is less problematic, although developers might consider adding anti-aliasing (an effect that smooths the appearance of jagged, pixelated edges).
"The biggest factor for any developer is how many pixels you can put on screen and process at any time," says Amor. "The current spec equates to around 160,000 pixels on screen. Moving to 720p (1280x720) means nearly a million pixels, and at 1080p (1920x1080) that more than doubles -- a massive step up."
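The jump Amor describes is simple arithmetic; a quick sketch of the pixel counts involved (resolution only, ignoring color depth and frame rate):

```python
# Pixel-count arithmetic for the resolutions discussed in the article.
def pixel_count(width, height):
    return width * height

hd720 = pixel_count(1280, 720)    # 720p
hd1080 = pixel_count(1920, 1080)  # 1080p

print(hd720)            # 921600  -- roughly "a million pixels"
print(hd1080)           # 2073600
print(hd1080 / hd720)   # 2.25 -- 1080p is slightly more than double 720p
```

Every one of those extra pixels must be shaded every frame, which is why the 720p-to-1080p step is so costly for animation-heavy scenes.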
For Epic's Unreal Tournament 2004, each character required 3,000 polygons, each vehicle 2,000. For UT 2007, a character represents 4 million polygons, a vehicle 3.5 million. This requires an exponential leap in the amount of artwork -- double the amount of time and personnel over current production.
"Our biggest quandary is how much detail you can create while maintaining an effective workflow," says Steve Thompson, art direction manager at Blitz Games, a Warwickshire-based indie that has already completed three Xbox 360 games for Burger King. "You could compromise quality if you produce lots of assets. There's no point spending a week finessing the finest virtual chair in the world if you don't have time to build out the rest of the room to 720p."
Karl Hilton, art director at Nottingham's Free Radical Design, adds: "We have to do more concept art and previsualisation because assets take much longer to build. We need to be much more convinced before we start about a concept and how we're going to build it, because the consequences for workflow are huge."
The increase in artwork is the main factor in development costs doubling from the current $3-6 million per title. Costs in excess of $20 million for premium games won't be uncommon. "The cost effectively means a reduction in the number of games made," adds Gibson. "Artwork previously handled in house will be outsourced to lower-cost centers in India, China and Eastern Europe, and elements of larger games, such as different levels, will be shared among several houses across the U.S., Japan and Europe."
Nonetheless, next-gen excites artists, who can feature more characters on screen, work with higher-resolution textures and worlds, and deploy visual effects for realism that until now have been limited.
"We'd use 2D bitmaps or wallpaper to texture shapes and wrap around 3D models," says Tim Jones, design chief at Rebellion, the creator of PSP games James Bond: From Russia with Love and Miami Vice. "HD means you're more reliant on original source material."
Jones recently visited California's giant redwoods, taking dozens of 12-megapixel stills in preparation for an undisclosed next-gen project. "We're beyond the point now where we can rely on standard texture material or drawn artwork," he says. "At these resolutions we need to generate artwork that means you can put your nose up to the bark and see all the bumps and textures. That increases your artwork load significantly."
Free Radical Design has been selected by Ubisoft to create its first-person shooter Haze for the Xbox 360 and PS3, and by LucasArts for a secretive next-gen title. Hilton describes Haze as "a gritty representation of how bad war can be." It's set in South America, and he's taken high-res close-up photography of exotic plants at a local botanical garden to replicate an Amazonian environment.
Digital stills rather than video provide the reference source for constructing environments -- in this case the position, color and shape of weeds and shrubs, plant height, and how light refracts through the canopy and dust. Stills also provide textural information, which feeds into a process called normal mapping. This technique involves the creation of a high-resolution texture map overlaid on a lower-resolution 3D object to give the impression of greater quality without increasing the polygon count.
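A minimal sketch of the idea behind normal mapping, assuming tangent-space normals and a single directional light; the texel values below are invented for illustration:

```python
import math

# A normal map stores a perturbed surface normal per texel. Lighting is
# computed against that stored normal rather than the flat geometric
# normal, so a low-polygon surface picks up fine bump detail for free.

def decode_normal(rgb):
    """Map an RGB texel in [0, 255] to a unit normal in [-1, 1]."""
    n = [c / 127.5 - 1.0 for c in rgb]
    length = math.sqrt(sum(c * c for c in n))
    return [c / length for c in n]

def lambert(normal, light_dir):
    """Diffuse intensity: clamped dot product of normal and light."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

flat = [0.0, 0.0, 1.0]       # unperturbed geometric normal
texel = (158, 127, 240)      # an invented "bumpy" texel from the map
light = [0.0, 0.0, 1.0]      # light shining straight along the z-axis

print(lambert(flat, light))                   # 1.0 -- flat, fully lit
print(lambert(decode_normal(texel), light))   # < 1.0 -- the bump tilts away
```

The same flat wall thus shades differently from texel to texel, which is what makes brickwork and bark read as three-dimensional at HD resolutions.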
"We can use real-world lighting models with higher dynamic range, shadows and soft light to try and emulate solidity and depth, whereas before we only had a crude approximation," explains Jones.
Most of today's games support only point light sources and completely ignore reflected light as a source of illumination. A recently developed method known as spherical harmonics lighting -- which calculates the lighting on 3D models from any type of source in real time -- is now finding its way into next-gen games.
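A rough sketch of how spherical harmonics lighting achieves that real-time speed, using only the first two SH bands (four coefficients); the light setup here is invented for illustration:

```python
# Light arriving from all directions is projected into a few spherical-
# harmonic coefficients offline; at runtime, the diffuse lighting for any
# surface normal is reconstructed with a cheap dot product -- which is
# what makes the technique fast enough for games.

SH_C0 = 0.282095   # constant for the band-0 basis function
SH_C1 = 0.488603   # constant for the three band-1 basis functions

def sh_basis(n):
    x, y, z = n
    return [SH_C0, SH_C1 * y, SH_C1 * z, SH_C1 * x]

def project_directional(light_dir, intensity):
    """Project a single directional light into 4 SH coefficients."""
    return [b * intensity for b in sh_basis(light_dir)]

def eval_lighting(coeffs, normal):
    """Reconstruct light arriving along `normal` from the coefficients."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(normal)))

coeffs = project_directional([0.0, 1.0, 0.0], 1.0)   # light from above
print(eval_lighting(coeffs, [0.0, 1.0, 0.0]))  # brightest facing the light
print(eval_lighting(coeffs, [1.0, 0.0, 0.0]))  # dimmer facing sideways
```

Real implementations use more bands and project full environment maps rather than one light, but the runtime cost stays the same small dot product per vertex or pixel.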
"We can now mimic the feeling of moving into a darkened room where your eyes take time to adjust to the new environment," reports Thompson. "Or deploy volumetric lighting to create the effect of light scattering in mist, fog or dust."
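The eye-adjustment effect Thompson mentions can be approximated by easing a virtual exposure value toward the scene's brightness each frame, a common approach in HDR tone mapping; the constants below are illustrative only:

```python
# Exposure adaptation: each frame, nudge the current exposure a fraction
# of the way toward the target, so a dark room brightens gradually rather
# than instantly -- mimicking pupils dilating.

def adapt(current_exposure, target_exposure, dt, speed=1.5):
    """Move exposure a fraction of the way toward the target each frame."""
    step = min(1.0, speed * dt)
    return current_exposure + (target_exposure - current_exposure) * step

exposure = 1.0   # tuned for a bright exterior
target = 4.0     # a dark room needs the virtual "pupil" opened wider
for frame in range(5):
    exposure = adapt(exposure, target, dt=1 / 30)
    print(round(exposure, 3))
# exposure creeps toward 4.0 over successive frames
```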
Lighting Takes Center Stage
Developers are increasingly adopting filmmaking techniques such as real-world lighting to bring games closer to cinematic photorealism. These include the creation of a stylized look generated in post-processing.
"By looking at the way negatives are developed, we could simulate different film grains and add onscreen vignettes," says Braben. "In theory, we could create a film noir with high granularity and subtleties of contrast," claims Thompson.
"It's something we tried on our PS2 Miami Vice game by introducing stronger blue tones," adds Jones. "It provides a unique feel to the game."
Braben's film influences include Ridley Scott's Thelma and Louise, in which the back-lit effects were performed in-camera, and Spielberg's Saving Private Ryan, notable for its desaturated color applied in post and now the de facto style for the WWII film -- and game -- genre.
"As we render a scene we can use motion blur or camera shake -- effects which we could create before, but now in a much more sophisticated way," Braben explains. "We could potentially track the mood of a game so that if tension is racked up, the camera becomes handheld and furtive. Other effects might mimic the high shutter speed and fractured image of fight scenes in Gladiator. People are subconsciously attuned to this visual style from seeing it in the cinema."
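One crude way to get the motion blur Braben describes is to blend the last few rendered frames, with newer frames weighted more heavily; the weights and pixel values here are invented for illustration:

```python
# Frame-accumulation motion blur: average recent frames so that fast
# movement leaves a smear, as in long-exposure film photography.

def motion_blur(frames, decay=0.5):
    """Weighted average of recent frames (newest first, weighing more)."""
    weights = [decay ** i for i in range(len(frames))]
    total = sum(weights)
    return sum(f * w for f, w in zip(frames, weights)) / total

# A bright object streaks through one pixel over three frames
# (values are that pixel's brightness, newest frame first):
print(motion_blur([1.0, 0.2, 0.0]))   # a blurred value between 0 and 1
```

Modern engines instead compute per-pixel velocity and blur along it, but the accumulation version conveys the idea in a few lines.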
Traditionally, the lead animator oversees rendering, but Braben sees the role expanding and perhaps splitting, with another animator adopting responsibilities similar to those of a cinematographer. "In the film world, camera work is part of the director's remit, but with games the camera is rule-based," he says. "You can't say you want a scene to be filmed this way, because players want some control over the camera to pan around or zoom into an environment in ways you hadn't envisaged. This makes realistic lighting more complex."
Thompson has even appointed cinematographer David Knight (The Hitch, and steadicam operator on Stealth) to retrain the 130-strong Blitz crew. "If you're an artist you need to have knowledge of light," says Thompson. "There's so much more we can do with pixel shaders and light sculpting to create mood. In terms of cutscenes, we need to better understand camera work, framing and composition."
Monitoring HD output is equally important if developers aren't to follow the unfortunate lead of Ubisoft, which failed to adequately test the lighting for its Xbox 360 version of Peter Jackson's King Kong. The production was criticized for several dark, almost unplayable scenes when played on standard TVs, and the developer had to issue an apology. "Just as make-up artists have to rethink for HDTV, there are probably a host of things we haven't considered but which will become apparent on HD playback," remarks Amor.
The Uncanny Valley
True photorealism remains some way off, with a chief hurdle being the depiction of believable humans. Human characters such as those featured in Final Fantasy and The Polar Express often risk emotional detachment or even appearing sinister. This is why Brad Bird successfully adopted a more graphically stylized approach to The Incredibles at Pixar.
In game circles this is known as the Uncanny Valley, a term coined by Japanese roboticist Masahiro Mori in the 1970s to refer to the idea that the closer you get to realistic expression, the scarier a face can seem.
"It's not so much errors but a lack of subtlety that is noticeable," says Braben. "With a face at 20 pixels across, you're only getting a vaudeville expression. But at 100 pixels, you have to generate much more subtlety. How do you show fear or surprise when the response of facial muscles is so similar?"
Frontier, he suggests, may have cracked the problem for its next-gen thriller The Outsider, which is being produced simultaneously as a feature film. "It's a combination of underlying structure, facial muscles and the conveyance of the right emotion to produce a layered subtlety. When you're talking to someone there's a wealth of hidden responses, and it's the combination of those that helps us read the face as human. It's very complex, but we're getting close."
Parallels with film production abound. Current games feature multilayered audio tracks, but future gameplay is likely to be enhanced with features familiar from any film soundtrack. "We can create effects falling off as an object moves into the distance, or add reverb, echo or chorusing," insists Jones.
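The distance falloff Jones mentions is commonly modelled with inverse-distance attenuation; a minimal sketch (the parameter names are illustrative, not from any engine named in the article):

```python
# Inverse-distance attenuation: with rolloff=1 and ref_distance=1, the
# gain halves each time the distance from the listener doubles.

def attenuate(gain, distance, ref_distance=1.0, rolloff=1.0):
    """Scale a sound's gain by its distance from the listener."""
    return gain * ref_distance / (ref_distance + rolloff * (distance - ref_distance))

print(attenuate(1.0, 1.0))    # 1.0 -- at the reference distance, full volume
print(attenuate(1.0, 2.0))    # 0.5 -- twice as far, half the gain
print(attenuate(1.0, 10.0))   # 0.1 -- a distant explosion fades into the mix
```

Reverb, echo and chorusing are then layered on top of the attenuated signal, which is where the extra hardware headroom Jones alludes to gets spent.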
"Sound is 50% of the game and can play a much more powerful role now the hardware allows us to add more layers," observes Jeremy Luyties, lead designer for Activision's Xbox 360 game Call of Duty 3. The level of detail goes down so far in COD3, he suggests, as to include crickets in tall grass during quieter moments. "Typically when there are explosions it's difficult for an individual player, particularly in multi-player mode, to differentiate sound. We now have a battle action system that will trigger certain audio information if the player is pointed in the right direction. They could, for instance, overhear enemy conversations or be forewarned by an ally of an attack. We'll use all the tricks of filmmaking to mix dynamic soundscapes."
Like vfx facilities, games developers routinely write their own patches for base animation software such as Maya and 3ds Max. But if HD platforms are to be stretched anywhere near their potential, greater resources must be spent on bespoke code for environments (particles, fluids, shaders) and character animation (fur, walk cycles, facial movement). Developers wanting to work cross-platform need to design workflows that enable such assets to be swapped around efficiently. Blitz has even designed its own middleware platform so it is not tied to a manufacturer's spec.
"We want to achieve a sensory overload," Luyties declares of COD3. "When you walk on grass, you'll trample it and hear it move. There's soft cover like hay or wood, which will be destroyed and force you to a new location." Activision developer Treyarch used laser scanning to take high-resolution 360-degree images of actors' faces and mannequins wearing authentic WWII uniforms, with the information -- down to fabric crinkles -- imported into 3D models as texture maps.
"Previously you'd make one pass for texture and that would be good enough. Now you need to multipass: a normal-mapping pass, a specular pass, a pass for collisions, diffuse, reflection, ambient or shadow information. If you go up to a wall in COD3 you can see the grooves and indentations in the brick."
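The passes in that list are ultimately combined per pixel; a toy sketch of one common composite (ambient plus shadowed diffuse and specular), with invented values:

```python
# Per-pixel combination of separate render passes. In a real renderer
# each argument is a full-screen buffer; here a single pixel stands in.

def composite(ambient, diffuse, specular, shadow):
    """shadow ranges from 0 (fully shadowed) to 1 (fully lit)."""
    return ambient + shadow * (diffuse + specular)

# The same pixel on a brick wall, in sunlight and then in shadow:
print(composite(ambient=0.1, diffuse=0.6, specular=0.2, shadow=1.0))  # ~0.9
print(composite(ambient=0.1, diffuse=0.6, specular=0.2, shadow=0.0))  # 0.1
```

Splitting shading into passes like this is what lets artists tune, say, the specular highlight on wet brick without re-rendering everything else.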
While there's no doubt next-gen graphics can be visually stunning, it may be that the greater revolution is being provided by Nintendo. Its DS and Wii platforms deliberately forsake HD to concentrate on new forms of gameplay, built around motion sensing rather than a joystick and buttons.
"HD graphics will deliver small effects in immersion and realism, but it won't deliver vast audiences," argues Gibson. "Growth in the videogames market is in fact slowing because it has reached the natural boundaries of its constituency, which is early adopter males playing violent games. If developers want to attract new audiences, what's more important are intuitive interfaces and a fundamentally different approach to gameplay."
Adrian Pennington is a U.K.-based freelance writer and editor of animation magazine Imagine.