
'Flyboys': Motion Capturing CG Planes in Flight

Flying high with Alain Bielik and the Double Negative crew on Flyboys.


Double Negative created the exciting dogfight sequences in Flyboys, which depict the first American fighter-pilot squadron to confront German aviation during World War I. All images © 2006 Electric Holdings (Flyboys) Inc.

There used to be a time when fighter pilots didn't shoot enemy planes from miles away. In the early days of aerial dogfights, during World War I, they had to get so close to their foes that they could actually see their eyes. Plus, there was a real sense of chivalry between pilots. You didn't shoot an enemy who was handicapped by a faulty weapon or engine. It was just not fair. Flyboys (opening Sept. 22 from MGM) takes us back to those days when good manners were still the norm. The movie tells the story of the Lafayette Escadrille, the first American fighter-pilot squadron to confront German aviation during World War I.

Needless to say, it was out of the question to try to film dogfights with vintage planes that were almost a century old. To create the exciting confrontations, director Tony Bill turned to visual effects supervisor Peter Chiang and vfx producer Fay McConkey of Double Negative in London. "We did more than 740 shots in total," Chiang recalls. "We started by previsualizing the battle sequences (there were six of them). At the same time, we adapted our pipeline to the fact that Flyboys was shot using the new Genesis camera from Panavision, a high-resolution 1920x1080, full 10-bit log color depth digital camera designed to take Panavision film camera lenses. All our images to date have been originated on film, so the new camera system meant that we needed to create a completely new pipeline. First, the Sony digital tapes were input through a Sony SRW-1 tape deck, retaining the digital quality that was captured. Second, a new look-up table was needed to view the images the correct way. Response curves from Panavision were given to us so we could work out the correct viewing system." The rest of the pipeline was also adapted to the new format.
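For the curious, the viewing step amounts to a look-up table mapping each 10-bit log code value to linear light. Panavision's actual response curves were proprietary, so the sketch below substitutes a Cineon-style log curve; the constants and function names are illustrative assumptions only.

```python
import numpy as np

# Hypothetical sketch of a viewing LUT for 10-bit log footage.
# The actual Panavision Genesis response curves were proprietary;
# this assumes a Cineon-style log encoding purely for illustration.

REF_WHITE = 685.0         # assumed 10-bit code value mapped to 1.0 linear
REF_BLACK = 95.0          # assumed 10-bit code value mapped to 0.0 linear
DENSITY_PER_CODE = 0.002  # assumed density step per code value

def log_to_linear(code_values: np.ndarray) -> np.ndarray:
    """Convert 10-bit log code values (0-1023) to scene-linear light."""
    black_offset = 10 ** ((REF_BLACK - REF_WHITE) * DENSITY_PER_CODE / 0.6)
    gain = 1.0 / (1.0 - black_offset)
    linear = gain * (10 ** ((code_values - REF_WHITE) * DENSITY_PER_CODE / 0.6)
                     - black_offset)
    return np.clip(linear, 0.0, None)

def build_viewing_lut(size: int = 1024) -> np.ndarray:
    """Precompute a 1D LUT so compositing tools can display log footage."""
    return log_to_linear(np.arange(size, dtype=np.float64))

lut = build_viewing_lut()
print(lut[int(REF_WHITE)])  # ~1.0: reference white maps to full exposure
```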

Motion Capturing a Plane

To create the shots, Double Negative used a Linux- and Mac-based pipeline that included Shake, Photoshop, and Maya 3D with RenderMan. Final Cut Pro was used for editorial, and Baselight was the tool of choice for color grading.

Chiang and his team found that their main technical challenge was developing a procedural system for the animation of the planes. Their solution was to create the first motion capture system for a plane in flight. "We encoded a Jungmann that was flown by Nigel Lamb during a three-hour session. Lyndon Yorke of Aerofilm and Helmut Kohlhaas of IGI Systems worked with Panavision's Peter Swarbrick to lock the digital camera to the encoded data through an Inertial Measurement Unit (IMU) and GPS system. Fitted to the plane, the IMU recorded the exact position of the aircraft 128 times a second. The unit was synced with the GPS system linked to the world time clock. The Genesis camera was also linked to a GPS system that was time-stamping the world clock onto the digital tape. This allowed us to sync the data to the pictures we were filming. The pitch, yaw and roll of the plane were all read by the IMU, and then translated, with proprietary code written by Oliver James, to the CG planes. This meant that our CG planes had all the flight characteristics of a real plane. We compiled a library of various maneuvers and applied the right data when it was needed. We were able to study the translation curves and build up procedural animation for the more generic flight characteristics." Animation supervisor Mattias Lindahl headed up the previs and animation team.
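In essence, the retargeting step resamples the 128-samples-per-second IMU stream at film frame times, so that every frame receives a matching position and orientation. Here is a simplified sketch of that idea, with invented data and plain linear interpolation standing in for the studio's proprietary code:

```python
import numpy as np

# Illustrative sketch of retargeting motion-captured flight data onto a CG
# plane: the IMU samples (128 Hz, GPS time-stamped, as described above) are
# resampled at frame times so each film frame gets a matching orientation.
# The field names and the simple linear interpolation are assumptions; a
# production rig would interpolate rotations with quaternion slerp.

IMU_RATE = 128.0   # IMU samples per second
FPS = 24.0         # film frame rate

def resample_to_frames(t_imu, channel, n_frames, t0=0.0):
    """Linearly resample one IMU channel (e.g. pitch) at film frame times."""
    t_frames = t0 + np.arange(n_frames) / FPS
    return np.interp(t_frames, t_imu, channel)

# Fake one second of captured data for demonstration: a gentle roll.
t = np.arange(0, 1.0, 1.0 / IMU_RATE)
roll = 15.0 * np.sin(2 * np.pi * 0.5 * t)      # degrees
pitch = np.full_like(t, 2.0)
yaw = np.cumsum(np.full_like(t, 0.05))

frames = int(FPS)
keys = {name: resample_to_frames(t, ch, frames)
        for name, ch in [("roll", roll), ("pitch", pitch), ("yaw", yaw)]}

# Each entry in `keys` now holds one value per frame, ready to be set as
# rotation keyframes on the CG plane's transform in Maya.
print(keys["roll"][:4])
```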

Even though the system provided highly realistic maneuvers, animators often embellished the action to heighten the drama. In order to keep the animation grounded in reality, senior R&D developer Jeff Clifford wrote a piece of software that alerted the animators when they were exceeding the flight capabilities of any given aircraft.
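The tool itself is not publicly documented, but the underlying check can be sketched: differentiate the animated curves and flag any frame whose speed or roll rate exceeds a per-aircraft envelope. The structure and the limits below are rough, invented numbers for illustration:

```python
import numpy as np

# Hedged sketch of the kind of "envelope check" described above: compare an
# animated path against per-aircraft limits and flag frames that exceed
# them. The limits and structure are invented for illustration, not Double
# Negative's actual tool.

LIMITS = {"nieuport17": {"max_speed_ms": 49.0, "max_roll_deg_s": 90.0}}

def check_flight_limits(positions, roll_deg, fps, aircraft):
    """Return frame indices where speed or roll rate exceed the envelope."""
    lim = LIMITS[aircraft]
    velocity = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps
    roll_rate = np.abs(np.diff(roll_deg)) * fps
    bad_speed = np.where(velocity > lim["max_speed_ms"])[0]
    bad_roll = np.where(roll_rate > lim["max_roll_deg_s"])[0]
    return sorted(set(bad_speed) | set(bad_roll))

# Example: a path that accelerates beyond a plausible WWI-fighter top speed.
pos = np.cumsum(np.linspace(1.0, 4.0, 48)[:, None] * [1.0, 0.0, 0.0], axis=0)
roll = np.linspace(0.0, 30.0, 48)
print(check_flight_limits(pos, roll, 24.0, "nieuport17"))
```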

CG planes were built in Maya. For the destruction scenes, rigid body dynamics were used, with cloth dynamics for the fragile wing surfaces. For air-to-air collisions, quarter-scale models were built, flown on wires and detonated.

Building and Destroying Vintage Airplanes

The CG planes were built in Maya by a team headed by co-CG supervisor Rick Leary. Since all of the planes were available as full-size replicas, the team could photograph and measure them in extreme detail. High Dynamic Range Imaging (HDRI) maps were taken for all lighting environments, and the Bidirectional Reflectance Distribution Function (BRDF) was captured for all surfaces. This technique involved obtaining a piece of the doped fabric from a plane, wrapping it around a cylinder and capturing its reflectance properties using calibrated flash photography from various angles. These images were then run through proprietary software that output data to surface shaders.
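The cylinder works because its curvature presents a continuous sweep of surface normals, so one calibrated photograph samples reflectance at many angles simultaneously. A toy version of the subsequent fitting step might look like the following, with an assumed Lambert-plus-Phong model standing in for whatever the proprietary software actually fit:

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch of fitting measured fabric reflectance to shader
# constants. The Lambert-plus-Phong parameterization is an assumption made
# for illustration; the real tool and its model were proprietary.

def lambert_phong(cos_theta, kd, ks, n):
    """Reflectance vs. cosine of the view angle, co-located flash/camera."""
    return kd * cos_theta + ks * cos_theta ** n

# Pretend measurements: reflectance sampled across the cylinder's width.
cos_theta = np.linspace(0.05, 1.0, 40)
measured = 0.3 * cos_theta + 0.5 * cos_theta ** 20
measured += np.random.default_rng(0).normal(0, 0.005, cos_theta.shape)

(kd, ks, n), _ = curve_fit(lambert_phong, cos_theta, measured, p0=[0.2, 0.4, 10])
print(f"diffuse={kd:.3f} specular={ks:.3f} exponent={n:.1f}")
# These fitted constants are what would be baked out for the surface shader.
```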

"For the destruction scenes, we used rigid body dynamics, with cloth dynamics for the fragile wing surfaces," Chiang explains. "But for the closer air-to-air collisions, Mike Joyce's Cinema Production Services built several quarter-scale models that were flown on wires and detonated. This gave us instant random destruction for a good price. The Black Falcon crashing into the ground was also achieved this way. In a few shots, the models were merged with CG planes. For the Zeppelin demise, CPS made intricate brass girder work in 1/20th scale, based on the original construction drawings of the airship. The model was hung on wires and detonated while four cameras covered the explosion, shooting at 120 frames per second. A separate quarter-scale section of Zeppelin surface was built to serve as a background for the escaping gunner. The stuntman was shot running through a series of flashing orange and yellow lights before leaping to a crash mat. For the digital version of the Zeppelin, we took our texture reference from the scale model, which gave us continuity and consistency. CPS also created the ammunition dump in 1/8th scale, a huge 40-foot tabletop model." At the render stage, global illumination was used on all models. Tighter shots on the pilots were filmed with actors sitting in replicas mounted on gimbals in front of a greenscreen.

A key aspect of Flyboys was the creation of the 360° environments in which the action would take place. The vfx team had to develop the different perspectives pilots would see from various altitudes.

360° Environments

Parallel to the airplanes and their animation, the other key aspect of the project was the creation of the 360° environments in which the action would take place. Co-CG supervisor Alex Wuttke oversaw the effort. Chiang and his team first filmed a series of tests at various altitudes to get a sense of what one would actually see from the air. "We realized that from below 3,000' (1,000 meters), we would need to see parallax in the trees. We decided that below 1,000' (300 meters), we would shoot plates of the real environment. For between 2,000'-3,000', we developed a procedural way to populate the landscape with CG trees." The vegetation was created by a proprietary tool called Sprouts: volumetric sprites that are able to render millions of trees. It was written by R&D developers Oliver James and Ian Masters.
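What makes millions of trees tractable is that none of them need to be stored: each one can be derived deterministically from the terrain tile it stands on and generated lazily at render time. A sketch of that scattering idea follows (the volumetric sprite rendering itself is omitted, and every name here is invented for illustration):

```python
import numpy as np

# Hedged sketch of procedural scattering in the spirit of Sprouts: trees
# are never stored, they are derived deterministically from the tile index,
# so millions can be generated lazily at render time.

def trees_for_tile(tile_x, tile_y, tile_size_m=1000.0, density_per_m2=0.01):
    """Return (x, y, height) for every tree in one terrain tile."""
    rng = np.random.default_rng(hash((tile_x, tile_y)) & 0xFFFFFFFF)
    count = rng.poisson(density_per_m2 * tile_size_m ** 2)
    xy = rng.uniform(0.0, tile_size_m, size=(count, 2))
    xy += np.array([tile_x, tile_y]) * tile_size_m
    heights = rng.uniform(8.0, 18.0, size=count)  # metres, typical trees
    return np.column_stack([xy, heights])

# Only tiles near the camera ever get evaluated; re-querying a tile always
# reproduces the same forest because the RNG is seeded from its coordinates.
print(trees_for_tile(3, 7)[:3])
```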

"For the terrain itself, we procured a dataset of high-resolution aerial photographs covering a 10-square-mile area of the U.K. We chose a very rural area, very pastoral, with very little modern architecture and roads. Any anachronistic element was painted out. In addition to the maps, we were supplied with digital terrain models to match. It was important to purchase maps that were shot in flat light, so that we could light them as required. The goal was to build a global environment that extended to infinity and in which we had the flexibility to move the sun wherever we needed."

Introducing Tecto

The R&D team was soon asked to develop a way to efficiently manage and render large amounts of landscape data. The result was Tecto, the brainchild of senior R&D developer Jonathan Stroud. "Tecto is the name of a suite of tools and plug-ins for Maya and PhotoRealistic RenderMan," Stroud says. "The pipeline starts by importing digital elevation data and aerial photographs into the Tecto database. A standalone Tecto application using wxWidgets was developed to view and manage this database. Most elevation data can be read in from a number of simple ASCII formats. The image data normally comes as a set of TIFF images and files that describe the location and scale of the data."

"Tecto splits the landscape data into tiles of a regular world size, e.g., 1 km square. Elevation data is turned into triangle meshes, and image data is stored as EXR images. A background process scans the images for any change and generates tiled and MIP-mapped versions suitable for rendering. The database can store multiple layers of data, the landscape mesh and the aerial photo textures being two examples. Each layer can have its own resolution. Typically, elevation data had one sample per 10 meters (30') and image data was 25 cm (10") per pixel. Once this database is under Tecto's control, new image layers can be added that can be referenced in shaders or procedurals to generate things like trees and hedges, or to control shading properties."
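As a rough illustration of that tiling step, the following sketch turns one square kilometer of elevation samples, at the 10-meter spacing Stroud mentions, into a triangle mesh. The data layout and function names are assumptions:

```python
import numpy as np

# Sketch of one tiling step as described: a square kilometre of elevation
# samples (10 m spacing) turned into a triangle mesh. Layout and names are
# assumptions for illustration only.

ELEV_SPACING_M = 10.0  # one sample per 10 meters

def tile_to_mesh(elevations, origin_x, origin_y):
    """Turn a (101, 101) grid of heights into vertices and triangle indices."""
    n = elevations.shape[0]
    ys, xs = np.mgrid[0:n, 0:n] * ELEV_SPACING_M
    verts = np.column_stack([(origin_x + xs).ravel(),
                             (origin_y + ys).ravel(),
                             elevations.ravel()])
    # Two triangles per grid cell, indexed into the flattened vertex array.
    tris = []
    for r in range(n - 1):
        for c in range(n - 1):
            i = r * n + c
            tris.append((i, i + 1, i + n))
            tris.append((i + 1, i + n + 1, i + n))
    return verts, np.array(tris)

grid = np.zeros((101, 101))  # a flat placeholder tile
verts, tris = tile_to_mesh(grid, 0.0, 0.0)
print(len(verts), "vertices,", len(tris), "triangles")  # 10201, 20000
```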

Vfx supervisor Peter Chiang (left) was excited by Panavision's Genesis camera and the data capture of a plane's flight characteristics. Jonathan Stroud developed Tecto, which managed and rendered large amounts of landscape data.

The Tecto application allowed artists to select rectangular regions of the database and export them to images for modification in an application such as Photoshop. Mesh regions could also be exported for editing in Maya, and re-sampled to whatever resolution the artist required, allowing them to see a low-res proxy of the real landscape. Low-res versions of the landscape textures were applied through normal Maya shading nodes.

"When launching a render, our proprietary PRMan renderer interface and pass management system, Rex, looks at the information defined on the Tecto nodes in the Maya scene, and generates a PRMan procedural for each tile in the output RIB," Stroud continues. "At render time, this procedural looks into the Tecto database and generates the necessary mesh data for the tile. A custom PRMan shade-op was also written to extract any texture layer from the Tecto database; the equivalent of a texture call for the EXR files, it gave the surface shaders full access to the database when rendering the landscape."
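Emitting a bounded procedural per tile is what lets the renderer defer and cull geometry: tiles that never enter the frame are never expanded. The sketch below shows the general shape of such RIB output, with "tectoProc.so" and its argument string as hypothetical stand-ins for the real plug-in:

```python
# Illustrative RIB generation in the spirit described above: one PRMan
# procedural per tile, each carrying a bounding box so the renderer only
# expands tiles that are actually visible. "tectoProc.so" and its argument
# string are hypothetical stand-ins for Double Negative's plug-in.

def rib_for_tiles(tiles, min_z, max_z):
    """Emit a Procedural call per (tile_x, tile_y, size_m) tuple."""
    lines = []
    for tx, ty, size in tiles:
        x0, y0 = tx * size, ty * size
        bound = f"{x0} {x0 + size} {y0} {y0 + size} {min_z} {max_z}"
        lines.append(f'Procedural "DynamicLoad" '
                     f'["tectoProc.so" "--tile {tx} {ty}"] [{bound}]')
    return "\n".join(lines)

print(rib_for_tiles([(0, 0, 1000.0), (1, 0, 1000.0)], 0.0, 400.0))
```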

In the final shots, the main landscape was surrounded by a 360° environment, including an infinity horizon, painted by digital matte painter Diccon Alexander.

Cloudy Sky Ahead

Once the global environment was completed, Wuttke and his team built CG clouds to populate it. Clouds were deemed of paramount importance in helping the audience maintain a sense of up and down, and a sense of speed too. They were thus treated as static anchors in the shots. "R&D developer Ian Masters created a tool, dnCloud, that was used to model 3D clouds in Maya's viewport," Chiang says. "It employed implicit spheres and noise functions to generate exactly the types of clouds that were needed. R&D also created custom volumetric shaders to make the clouds react to lighting in a believable way, incorporating such effects as multiple forward scattering and self-shadowing. Our proprietary voxel renderer DNB (originally developed by Jeff Clifford for Batman Begins) was used to render them. HDRI maps were obtained from high points in England to light the shots. Once set up, the clouds could be pulled into position within the master battle arenas and rendered on a shot-by-shot basis."
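The implicit-spheres-plus-noise recipe is compact enough to sketch: density at any point is a soft union of sphere falloffs, eroded by fractal noise. The noise basis below is a cheap stand-in for whatever dnCloud actually used:

```python
import numpy as np

# Minimal sketch of the implicit-spheres-plus-noise idea behind dnCloud:
# cloud density at a point is the soft union of a few sphere falloffs,
# eroded by fractal noise. The value-noise below is a stand-in for
# whatever noise basis the real tool used.

def value_noise(p, seed=0):
    """Cheap deterministic pseudo-noise in [0, 1] from a 3D point."""
    q = np.floor(p * 1.7).astype(int)
    rng = np.random.default_rng(abs(hash((q[0], q[1], q[2], seed))) & 0xFFFFFFFF)
    return rng.random()

def fbm(p, octaves=4):
    """Fractal sum of noise octaves for cloud-like detail."""
    total, amp = 0.0, 0.5
    for o in range(octaves):
        total += amp * value_noise(p * (2.0 ** o), seed=o)
        amp *= 0.5
    return total

def cloud_density(p, spheres):
    """Soft union of sphere falloffs, eroded by noise; clamped to [0, 1]."""
    d = 0.0
    for center, radius in spheres:
        fall = 1.0 - np.linalg.norm(p - center) / radius
        d = max(d, fall)
    return float(np.clip(d - 0.4 * fbm(p), 0.0, 1.0))

puffs = [(np.array([0.0, 0.0, 0.0]), 2.0), (np.array([1.5, 0.3, 0.0]), 1.4)]
print(cloud_density(np.array([0.5, 0.0, 0.0]), puffs))
```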

All the elements (planes, environment, clouds) were finally combined by a team of compositors headed by 2D supervisors Charlie Noble and Jody Johnson.

CG clouds were built to populate the global environment. Clouds were an important element in helping the audience maintain a sense of up and down, and a sense of speed. 

Old and New

For Chiang, using high-end technology to recreate good old vintage airplanes wasn't the most intriguing aspect of Flyboys: "We used a lot of traditional techniques mixed in with the new, and these were all seamlessly blended together. At the animatic stage, producer Dean Devlin and Tony Bill welcomed our input to shape the battle sequences. We would assemble the shots in Final Cut Pro, and dub music and sound effects onto a sequence for presentation. This allowed us to explore camera angles and shots for best effect. Most of all, I think we had two features that were unique to our work: the first use of Panavision's Genesis camera, and secondly, the data capture of a plane's flight characteristics employed for all animation. It was the first time anyone had ever been able to motion capture a plane in flight and gather real data."

Alain Bielik is the founder and editor of renowned effects magazine S.F.X, published in France since 1991. He also contributes to various French publications and occasionally to Cinefex. Last year, he organized a major special effects exhibition at the Musée International de la Miniature in Lyon, France.