
It Can’t Be Done? — Let Previs Do It

Christopher Harz looks at how previs is helping rid the f/x industry of the old adage, "it can't be done."


If previs and visual effects had been around in Howard Hughes' day, three people wouldn't have lost their lives filming this sequence in Hell's Angels, which Martin Scorsese recreated for The Aviator. All The Aviator images courtesy of Oliver Hotz. © 2004 Miramax Films.

Originally used as a crude general design tool or an animatic translator of storyboards, previs has become so comprehensive that it can lay out the precise motions of the actors and the camera moves down to the details of lenses and lighting. But previs has gone way beyond just a few key scenes, and now figures prominently in the look of the entire film. Small wonder that it has had a major impact on the movie industry, enabling major directors to plunge into risky scenes and still stay within shooting schedules. Experienced previs supervisors remain in short supply, given the required mix of technical and artistic talent; and with the need to mix it up with the other teams involved in a feature film, it is not a job for the reticent.

When It Works, It's Really Great

"Previs works best if it's well integrated with production," notes Sean Cushing, the exec producer of Pixel Liberation Front, which is on the forefront of the craft. "If the previs team is standalone, it won't be very productive: you'll be spending the money, but not getting the creative results. The relationship with the director, production designer and art department is critical. The trend right now is that the relationship with the art department is both growing and getting more seamless, as they create detailed digital blueprints in tools such as Rhino for us to use."

Pixel Liberation Front, or PLF, was founded in Venice, California, in 1995, and pioneered much of the early work with previs, in films and music videos. Its credits include many of the most effects-laden features ever made, including The Matrix franchise, Van Helsing, Pirates of the Caribbean, The Last Samurai, Minority Report, Elf, Godzilla, 2 Fast 2 Furious, Blade II and I, Robot. PLF recently worked on Zathura and The Brothers Grimm, and is currently prevising Superman Returns and Logan's Run for director Bryan Singer.

Obviously some films are more challenging than others. "Matrix Reloaded and Matrix Revolutions involved the most comprehensive technical planning I'd ever seen," recalls Cushing. "It worked because we had a lot of access to all the different department heads. There was lots of information sharing going on constantly. The previs work was very complicated, but it saved time, and the directors had to compromise less. It was a great tool to enable the storytelling: by doing the sequences in previs first, the director was able to home in on the exact shot he wanted."

Cushing uses SOFTIMAGE|XSI for his previs, and is a strong believer in using the hottest NVIDIA graphics cards available in his machines for precise rendering. He adds that he is creating previs for ever-larger segments of a film. "Previs has evolved from a rudimentary short form for helping overall design to creating mini 3D movies, complete with storytelling," Cushing explains. "Having gone through much of the film in previs with the director adds greatly to the cooperation and trust factor. In Van Helsing, Stephen Sommers was really open to creative ideas from us, and trusted us with implementing them, even during production. We would never have been able to try new approaches and stay on schedule if we hadn't walked through them in previs first, before the scene started shooting."

Previs experts largely agree on how important a close relationship is with all the different teams working on the set. In fact, visual effects supervisor Kim Libreri (The Matrix franchise) cautions that if the previs team and camera crew are not in sync, it can later lead to frustration and disappointment on the set, when they realize that the shots they've prevised are impractical, or worse, resulting in added vfx work in post. Thus, Ron Frankel (The Terminal, Minority Report, Panic Room), one of the real experts in this industry, refers to the previs supervisor as the "nexus" of the production, where all the many teams (from production designers on through to the post-production groups) get to interchange and integrate their input into a cohesive visual design.


However, since previs is still relatively new, it can take an amazing amount of optimism and self-confidence to travel where no man has gone before. "An important decision we made early on in the project was to not get sidetracked by potential technical limitations. We would find the necessary skills when the time came," says Oliver Hotz, who worked with visual effects supervisor Rob Legato on The Aviator. "I had a real epiphany in Japan," Hotz says. "I saw a motion capture team tell the creative team on a film what could not be done, and the creative team nodded their heads and accepted it. I was able to show them how to work around the problems, and became the creative team's hero. Since then my guiding rule is not to tell the creatives up front what they cannot do, but instead to focus on how to make their vision come to reality."

Hotz shot four major visual effects sequences for The Aviator, using Maya (a popular choice for previs), Alias | Kaydara MOCAP (for motion capture), LightWave (for modeling, lighting and rendering) and Digital Fusion (for compositing). A Maya plug-in named Beaver Project was used to move files between Maya and LightWave, and the Iridas FrameCycler was used to check each rendered frame for errors right away. The previs was used not only to plan for vfx shots, but also as a bidding template to get quotes from vendors such as vfx houses and model builders once actual production started. Hotz and Legato decided to previs every shot, not just the visual effects shots, which helped them, director Martin Scorsese and editor Thelma Schoonmaker get a much better feel for the flow of each sequence.

For The Aviator, extensive previs was used to help set up preplanned camera moves like the flight of the Spruce Goose.

When the feedback loop for preplanned camera moves was too long to accommodate the demanding schedule, Hotz came up with a clever (and low-cost) technical solution. Rather than plan each camera move himself and then get it reviewed and approved by those responsible for filming, he resorted to a type of "roll your own" solution, wherein the camera crew, DP and director could guide the virtual camera through the virtual set themselves. He did this by using a Libra pan-tilt-roll head interface to control the virtual camera in realtime, not unlike a videogame roll-pitch-yaw controller. This was connected to a Kaydara system, which captured the position of the virtual camera (it motion captured the camera, as opposed to motion capturing actors, as it normally does) and then rendered the appropriate camera view of the virtual set in realtime. The director or camera crew was thus able to try different moves through the 3D set (with its digital actors and objects), and when they found a move sequence they liked, that sequence (which had been digitally stored) was then replayed on the actual set. The realtime rendering of the precise camera moves (with choice of lenses) through the virtual set was accomplished on a laptop. "What really made the difference was Kaydara and its very fast rendering engine," adds Hotz. "You can see that Kaydara's roots trace back to game engine technology, resulting in realtime movement through a 3D set with what is almost a game controller. It was very precise; we had very few compromises, and Martin Scorsese really appreciated the immediate feedback and control."
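The capture-and-replay loop described above can be sketched in a few lines. This is an illustrative sketch only, not the actual Kaydara/Libra software: the hardware input and rendering are left out, and the names (VirtualCameraRecorder, pan_tilt_roll_to_direction) are hypothetical.

```python
import math

def pan_tilt_roll_to_direction(pan_deg, tilt_deg):
    """Convert head pan/tilt angles (degrees) into a unit view direction,
    the kind of value a virtual camera would be aimed along each frame."""
    pan = math.radians(pan_deg)
    tilt = math.radians(tilt_deg)
    return (math.cos(tilt) * math.sin(pan),
            math.sin(tilt),
            math.cos(tilt) * math.cos(pan))

class VirtualCameraRecorder:
    """Stores the operator's move, frame by frame, so the same move can
    later be replayed on the actual set."""

    def __init__(self):
        self.frames = []  # one (pan, tilt, roll, lens_mm) tuple per frame

    def capture_frame(self, pan, tilt, roll, lens_mm):
        self.frames.append((pan, tilt, roll, lens_mm))

    def replay(self):
        """Yield the stored move in order for playback."""
        for frame in self.frames:
            yield frame
```

The essential idea is simply that every frame of the operator's live move is logged with its lens choice, turning an improvised camera move into repeatable data.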

Many of the shots for The Aviator involved interaction between full-scale cockpits on a motion base (like the seats on a theme park ride) and a motion-controlled camera rig. This proved to be very time consuming, since it often required a lot of trial and error to get the right look and motion. Hotz solved this problem by splitting up the previs animation into two parts, one to drive the motion base and the other to drive the motion control camera. The net effect was to maintain the moves set out in the previs.
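The split can be illustrated with simplified math. Assuming, purely for illustration, that each track is reduced to per-frame 3D positions, the camera's world-space move can be re-expressed relative to the cockpit, so that playing the motion-base track and the relative camera track together reproduces the original previs move:

```python
def split_animation(cockpit_world, camera_world):
    """Given per-frame world positions from the previs, return the
    motion-base track and the camera track relative to the cockpit."""
    base_track = list(cockpit_world)
    camera_relative = [tuple(c - k for c, k in zip(cam, cockpit))
                       for cam, cockpit in zip(camera_world, cockpit_world)]
    return base_track, camera_relative

def recombine(base_track, camera_relative):
    """Playing both tracks together reproduces the original camera move."""
    return [tuple(k + r for k, r in zip(cockpit, rel))
            for cockpit, rel in zip(base_track, camera_relative)]
```

A real pipeline would split full transforms (rotation as well as translation), but the invariant is the same: base motion plus camera-relative motion equals the previs move.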

Effective previs starts early in pre-production, right along with the script rewrites and storyboard creation. "Before the previs, you should have a good storyboard," notes Dave Hare, co-founder of Tigar Hare Studios, which has worked on Electronic Arts' 007 action game GoldenEye: Rogue Agent, as well as on Microsoft/Bungie Studios' blockbuster Halo 2. "You should make sure that everyone is agreed upon the 2D storyboards first; this helps with the timing, the animatics, the design of the characters and so on. You may waste a lot of time creating specific 3D previs for clients who are not sure of what they want. I've heard of clients with an attitude of: 'We'll know what we want when we see it.' For that type of client, it may be better to stick with 2D previs until they agree on the vision and buy off on it. It's easy for a beginning previs team to generate an animatic with too much detail right up front; in some cases, less is more."

Even games like GoldenEye: Rogue Agent are using previs as a way to test sequences before going into expensive production. © Tigar Hare Studios.

"Fortunately, with GoldenEye and Electronic Arts, we had a client who knew exactly what was wanted, and who did a great job at providing clear direction," Hare adds. "Because of that we were able to create over nine minutes of game cinematics involving 137 [highly diverse] shots in just 10 weeks."

Realtime Previs (RTPV)

Previs is increasingly being used to plan for and interact with motion control cameras. One of the experts in this field is Engine Room, which creates previs that includes both virtual motion control before shooting and rendered data integrated with camera motion data during the actual shoot. The description of the paths that the camera will eventually take is so detailed that it even includes a precise model of the crane holding the camera (to help visualize complex moves that a camera head may make around an object while extended out from a crane) and models the dolly tracks that the crane may be moving on.

The primary measurement that describes the motion of the camera is Kuper data, named after Kuper Controls of Albuquerque, New Mexico, which has been a dominant force in camera control hardware and software since the 1980s, when motion control was first used on films such as Big Business. Last year, Kuper founder Bill Tondreau was awarded an Oscar for his motion-control wizardry, used to great effect, for instance, in The Lord of the Rings trilogy. The system that he invented records the slightest motion of a camera on an initial shot in three-dimensional (x,y,z) space, so that subsequent shots controlled by the system can repeat those precise movements.

"There are limits to what motion control can do," suggests Dan Schmit, founder of Engine Room. "By using previs to do a virtual run-through first, you can see ahead of time if you break the rules, and make appropriate adjustments." Schmit used RTPV on Sky Captain and the World of Tomorrow, where the location and time of each frame shot by the camera were precisely captured and then married with the Maya files describing the 3D virtual elements (characters, objects and backgrounds). "The composited and rendered scene can be shown in almost exact realtime," he notes. "It's actually only three frames (about 1/8 second) behind. In fact, it's so fast that we put monitors in the eyeline of the actors, so they could react better to virtual characters on the set. You can be looking at the actors on a bluescreen, and flip a switch and see them on a 3D set with CG characters."
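The three-frame lag Schmit mentions amounts to delaying the live camera feed by the same number of frames the CG render takes, so the two streams line up when composited. A minimal sketch of such a delay line (hypothetical names; a real system does this in video hardware, not Python objects):

```python
from collections import deque

class DelayLine:
    """Holds live frames in a short buffer and hands back the frame from
    `delay` pushes ago, matching it to a CG render that takes `delay`
    frames to arrive."""

    def __init__(self, delay=3):
        # Room for the current frame plus `delay` older ones.
        self.buffer = deque(maxlen=delay + 1)

    def push(self, frame):
        """Add the newest live frame; return the frame from `delay`
        pushes ago, or None while the buffer is still filling."""
        self.buffer.append(frame)
        if len(self.buffer) == self.buffer.maxlen:
            return self.buffer[0]
        return None
```

At 24 fps a three-frame delay is 1/8 of a second, which is why, as Schmit notes, actors can treat the delayed composite on the eyeline monitors as effectively live.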


With visual effects extravaganzas like Sky Captain and the World of Tomorrow getting more and more complex, previs is becoming a must in determining if a shot can or cannot be accomplished. Images courtesy of Engine Room & © 2004 Paramount Pictures. All rights reserved.

However, Schmit cautions against using RTPV in every case. "Motion control is really cool," he says. "But it costs money, so you have to weigh the value you get out of it: if there's no CG in the scene, or interaction with virtual characters, you may want to save the extra costs." He also warns against depending on previs too completely. "Something will always come up that you hadn't planned for when you created the previs. It can be the weather, or perhaps someone mis-measured the set, or perhaps the performance of one of the actors doesn't come through. You should always be ready to adapt to local circumstances, and be ready to make necessary changes."

Previs Pros and Cons

Visual effects pioneer Doug Trumbull (2001: A Space Odyssey, Close Encounters of the Third Kind, Star Trek: The Motion Picture and Silent Running) agrees that you have to use caution with previs. "The previs can be really useless if you run into something totally unexpected," he says. "Like the Back to the Future ride at Universal Studios. We had planned that in great detail, but when we tried it out it didn't look right, and we had to quickly create a small working model of the ride with an overhead dome. Much of the camera work turned out to be counter-intuitive in conjunction with the motion platform: for instance, to give the feeling of acceleration, we wound up tilting the seat back, and the camera up, to level the horizon."

Still, Trumbull is a fan of realtime previs. "For Disney's Book of Pooh, we shot all 52 episodes with puppets on virtual sets, all composited in realtime. The finished result was in the viewfinder. With only a three-frame lag, it was possible to make much higher quality decisions about each scene." He also believes that previs should not be limited to using digital characters. "Sometimes stand-ins or stunt doubles make more sense than CG 3D characters. In certain situations there's a limit to what you can learn when you use CG characters to block out camera moves. In Close Encounters of the Third Kind, for instance, the lighting was an important element, which was easier to understand with real characters."


The Future of Previs

One clear trend for previs is that directors now expect more detail and faster turnaround times for changes or what-if explorations, a demand that calls for either some good computers on location or a fat pipeline to a render farm, so that scenes can be rendered and presented more quickly.

And look for more previs integration with motion control cameras: for planning the specific motion of cameras and cranes before the shoot, then for combining and compositing real and virtual elements during the shoot for immediate playback to the director, and finally for tight integration with post-production effects work. Since motion control is still very labor intensive, major improvements in tracking-related hardware and software should lead to quantum gains in this area.

Finally, we can expect previs to be used more and more by producers and others as a sales tool to get projects greenlit. For example, vfx supervisor Hoyt Yeatman recently prevised a mock trailer that helped land his first directing gig on G-Force. "Presentation details in a previs, with textures and lighting as well as aural content to 'punch up' certain scenes, are invaluable," notes PLF's Cushing. This demand for whole segments of a film long before it is shot will also reinforce another trend: that previs is increasingly being used for whole sequences and even for the entire film.

Christopher Harz is an executive consultant for new media. He has produced video games for films such as Spawn, The Fifth Element, Titanic and Lost in Space. As Perceptronics' svp of program development, Harz helped build the first massively multiplayer online game worlds, including the $240 million 3-D SIMNET. He worked on C3I, combat robots and war gaming at the RAND Corp., the military think tank.