Henry Turner investigates the increasingly ambitious and detailed production of gaming cinematics.
Game technology has advanced incredibly over the last decade. Today's games incorporate cinematic sequences both to entice the viewer with glimpses of the game and to develop the characters, situations and stakes of play. It is a fascinating world that is becoming more open-ended as time progresses, and gaming cinematics play an increasingly vital role.
What are Cinematics?
"We define cinematics as our bookends to a game level that help the storyline along," says Dev Madan of Sucker Punch, creators of Sly Cooper. "We call them intros and outros. With the intros, we set the boss up, give the player an idea of who he or she is, what their back story is, and paint a location to give the sense that the character is traveling somewhere exotic. We create acts one and three of a three-part story, and the player controls act two. Act one is the setup, the who, what, where, when and why, and act three is the resolve."
Madan goes on to say that cinematic sequences allow gaming artists more latitude in creating characters. "The cinematic sequence is pure storytelling: you get to control all the things that you have to surrender in a gameplay experience, things like camera moves, the pace, how a character delivers a line, and timing. Cut scenes are an opportunity for us to portray the character in a number of sequences to build his or her character for the player."
Sony's Brian Johnson (of the Cinematics Solutions Group) has to his credit the photorealistic stadium intros for NFL games and the cinematics for the Navy Seals titles. On creating a realistic cinematics experience, Johnson says, "We shoot for realism, especially in the sports line. The producers all say, hands down, give us reality. They want to see the player move and breathe and have character. If you watch one of our football games and you look down the hallway from the kitchen into the living room and see the game, I think you'd be hard-pressed to see the differences from reality. And once we move to PlayStation 3 it will be very difficult to discern any differences; it's going to get just incredible with the next incarnation."
However, Sean Cushing of Pixel Liberation Front recognizes different objectives. "We've focused mainly on game trailers, which are used at E3 as part of the marketing package for new games." Rather than striving for total realism, Cushing points out that it is often better to use cinematics that are closer in quality to the actual gameplay. "Some trailers in the past have gone too far with realism, and people who buy the games and play them are disappointed because the graphics aren't the same quality. So we're very conscious when we do a trailer to increase the quality about twenty percent, but not so much that we mislead the player regarding the graphics they will see during gameplay."
An example of realism is Onimusha 3, a PlayStation 2 game with an opening cinematic created by ROBOT Communications in Tokyo, Japan, using 3ds max, a rare choice for cinematics work. The Onimusha 3 cinematic portrays battles between samurai spacemen and skeletal warriors that play like a cross between Crouching Tiger, Hidden Dragon and The Lord of the Rings.
"In the final episode of the Onimusha series, our intention was to produce the best 3D cinematic opening entertainment ever, and captivate the game players," said Ikuo Nishii of ROBOT. Takashi Yamazaki, director of the film Returner, and Donnie Yen, the legendary Hollywood and Hong Kong martial arts choreographer, co-directed the production with Shirogumi and Kaihei Hayano, who both created CG work for Onimusha 2 and now Onimusha 3.
A year in the making, the opening sequence also required 10 days of motion capture sessions. Using 3ds max and Character Studio software, the ROBOT team created a visually rich CG universe, right down to the realistic highlights on the black hair of the characters, courtesy of the Shag Hair plug-in that worked so well with 3ds max.
Certainly the ROBOT team has created an opening sequence that arguably rivals effects-laden Hollywood blockbusters. But as Dev Madan points out, it is usually best to suit the style to the subject. "I've worked in a variety of styles over the years, in comics, illustration and animation. And while they all have their pluses and minuses, I would have to say that while I don't have a particular favorite style, I've always been drawn to good clean shapes and design, the more graphic and stylized, with a good sense of personality. The biggest drawback I can think of for any particular style would be one that overwhelms or gets in the way of telling a story."
Gaming and Movies
Sony's Brian Johnson observes a change in ambition among game creators. "A few years ago everybody in the games industry wanted to get into movies, and now it's starting to swing back the other way. All these high-end people in the visual effects industry are looking at what's coming down the line, especially in PlayStation 3. They see there's something brand new and creative there." Still, Johnson acknowledges that many gaming cinematics techniques derive from motion picture visual effects. "The technology a lot of times comes from the movie industry, and then we have to really dumb it down and fake it for the games, because the processing power isn't there. So innovations on a lot of things happen in the game world, where you are forced to be creative with your outlets to get a certain look. You have to cheat things. For example, you can't use real shadows; you must use fake shadows. While looking for shortcuts we come up with innovations."
Johnson maintains that the game designers and the cinematics team must stay in constant contact to ensure unity between the cinematic and the game itself. "One of our biggest-selling games here is SOCOM II. It's a Navy Seals game, and we were brought on to do four movies to depict the evil characters that the Navy Seals are going to take out in the course of the game. One of the challenges we're faced with in doing a pre-rendered cinematic is that it is a time-consuming process and the game is being developed simultaneously, and they're constantly changing things in the game. Say they want a character (take Lara Croft, to make things easy) to have her hair swing around behind her head. Well, that's decided on very early in the course of developing the game. So the cinematic team would then take that model and start producing the movie with her hair moving as described. Then, if halfway through production the game designers realize they can't get the technology to work right, they will remove the ponytail and make the hair a tied-up bun on the back of her head to solve the problem. And we have to go back in and re-render all of our scenes to accommodate the changes. So with the workflow going back and forth, from an information standpoint you really have to stay right with your team."
As a producer at Pixel Liberation Front, Sean Cushing has focused mainly on creating trailers for games such as Medal of Honor Frontline, Medal of Honor Allied Assault and Sim City 4, as well as an internal presentation at EA for Goldeneye 2. The march of technology toward faster computers has been enormously helpful, he says. "More memory and better graphics cards allow us to do a lot of iterations faster, so we're able to do previs of the trailers really rapidly and get a cut approved, so we can focus on the rendering and the compositing to add the details that really separate the trailer from other cinematics and trailers that are out there."
Looking toward the future, Cushing sees the industry aiming at several objectives. "We want to think of new ways to incorporate cinematics into the gameplay experience, so it's not just sit and watch and then continue playing. That was one of the most successful things about our work on Enter the Matrix: how the cinematics and footage that they shot of the characters were incorporated. That was definitely interesting, the convergence of cinematics and gameplay. But what we want to do is be able to program in real time different story elements without having to make players stop in a certain position and watch or wait for the movie to load. This way we can create innovative and unique ways to incorporate story elements that enrich the experience without making you feel like you're watching a movie, because you're not; you're playing a game."
The Game of Love
Cushing relates some startling developments at the recent Game Developers Conference in San Jose. "There is a push to incorporate much more of a visual effects approach in terms of cinematics, adding believable effects to the gaming experience. Effects in movies are hard, but to do that in games in real time is a really hard computational puzzle that I think will take years to solve. But I know that people are looking for it. They want to see realistic explosions; they want to see real particle effects."
Most amazing are efforts to include emotional content in games. "It's interesting to see how they are talking about story, and even love stories in games, and emotion. There was a session on creating an emotional impact called 'Game Design Challenge, the Love Story: What Would Happen If Commercial Restraints Were Removed from the Game Design Process.' It's hard to define exactly what this emotional content would be, because emotions change over time, and programming for that is what they are trying to address as part of the process."
Henry Turner is a writer and award-winning filmmaker whose Lovecraft-inspired horror feature, Wilbur Whateley, won top awards at the Chicago International Film Festival. His writing on film has appeared in the Los Angeles Times, L'Écran Fantastique, Variety and many other publications. A longtime film festival executive, he has programmed for the Slamdance Film Festival and currently heads FilmTraffick L.A.