Craig Hayes tells us what it was like working in China on Red Cliff, which benefitted greatly from contributions by Prime Focus and CafeFX.
John Woo's epic Red Cliff (now playing from Magnolia Pictures) was made in China and based on the Battle of Red Cliff, which took place during the Three Kingdoms period in ancient China, during the final days of the Han Dynasty.
Craig Hayes, overall visual effects supervisor, formerly with the now defunct Orphanage, discusses overseeing the 856 shots on a $10 million-plus budget, and work from other vendors such as Prime Focus, CafeFX, Digital Dimension, Hatch, Kerner Optical, Pixel Magic, Tippett Studio, Anibrain, Xing-Xing and Crystal CG.
"We [The Orphanage] were the lead vendor and we knew the workload was probably more than we could handle, so we functioned as the visual effects production wing of the overall production. In addition to having artists working on shots, we were managing all the other vendors, and also we had our own editorial department in-house, so we were able to kind of run interference with the production, which had around four editors working at any given time. So keeping track of what were visual effects and what weren't was a daily task in this post-production management role."
Adjusting to the way films are made in China took some doing, according to Hayes. "It was an interesting thing because I had no experience with Asian filmmaking per se, and so when I showed up there, there was no big board or AD, so I asked for the big board so we could start tackling the big problems and figure out how we were going to achieve this stuff. There was about a week of that and we went through every storyboard that we had and [figured out who was doing what], with everyone being hypersensitive about their budget or lack thereof. We left China not knowing where we stood, but we came back to Los Angeles and found out we were hired a while later."
So Hayes and his Orphanage staff dived into production and broke down the key action and vfx sequences, prevised them and then strategized the best approach. "It turns out that the previs was very helpful because I don't speak Mandarin, and at the end of the day it didn't serve as a template but helped with layout and other purposes. From there we moved to Beijing and began the process of finding a local data wrangler and figuring out how we were going to work with the crew, which was 700 people, and on top of that we would have 1,000 soldiers from The People's Liberation Army, and [the production] would be responsible for feeding, housing and clothing [them]. It's like having a miniature city there and was unique to me. At The Orphanage, I had a small crew of primarily comp, so we were able to push through 200 shots for [the first part] and I'm particularly proud of some of the battle stuff we pulled off. We actually had some vendors working in China: Xing-Xing and Crystal CG, which was culturally important."
"CafeFX and I were tasked with some epic shots of General Cao's fleet mobilizing and traveling down the Yangtze River to Red Cliff," explains Kevin Rafferty, visual effects supervisor for CafeFX. "The production had built a full-scale replica of the general's flagship, as well as a few other ships of various sizes. The plates that were provided to us usually contained at least one of these ships for lighting and scale reference.
"When these plates were shot, the river's water level was quite low, and the surrounding terrain was not as visually compelling as John Woo would have liked. In fact, many of the plates focusing on General Cao's ship sailing down the river were actually shot with that ship tied to the dock.
"As a result, our team had multiple herculean tasks ahead of them to achieve these epic shots:
- A fleet of hundreds (sometimes thousands) of various-sized ships needed to be generated on the computer.
- These ships needed to have crews generated, animation cycles created, and crowds simulated.
- The crews needed to have variation in size, uniform and weaponry.
- The ships' flags and sails needed cloth simulation.
- The environment needed to be replaced.
- The terrain needed to be replaced with 2.5D matte paintings.
- The river water needed to be simulated for both river current and interaction with the ships and their oars.
"On many shots, General Cao's ship was the only thing in the shot that was live-action. The rest was computer-generated."
CafeFX was asked to take on some additional epic shots that involved a land battle, with cavalry and foot soldiers. "The Allied Forces created a trap for General Cao's army," Rafferty continues. "They called it the Tortoise Shell Formation. The foot soldiers created a maze that Cao's army entered. Once inside, the Allied Forces closed the 'doors' of the maze, trapping Cao's army. With the upper hand, the Allied Forces won the battle.
"The plates were shot with a small portion of actual cavalry and foot soldiers. Our task was to complete the Allied formation and the attacking Cao army.
"The tasks involved in bringing these shots to life included:
- Generating an inventory of horses and soldiers with multiple types of uniforms, weapons and colors.
- Creating animation cycles for crowd simulation.
- Hero animation for key elements per shot.
- Cloth simulation for flags and clothing.
- Dust simulation for foot falls and hoof falls.
- Overall dust cloud simulation.
- 2.5D matte painting for set extension/environment creation on wide and aerial shots."
As for the pipeline, it included Maya for modeling and animation; ZBrush for sculpting and textures; crowd simulation in Behavior; water simulation in RealFlow and Houdini; dust simulation in Houdini; rendering in mental ray; and compositing in Nuke and Fusion.
For the final naval battle, Prime Focus, under the supervision of Jason Crosby, was tasked with creating the entire environment surrounding a massively destructive attack waged on an enormous fleet of ships -- all within a compressed eight-week production schedule.
"With shots that called for a fleet of 2,500 of the same 26-meter boats, giving each of the boats a unique 'hand-crafted' feel was both a creative and technical challenge," Crosby explains. "We really wanted to avoid the repetitive look that comes from using the same CG model. To do this, we broke the boats down into components that were randomly assembled using a rule-based particle system called Thinking Particles, which also propagated and animated the fleet."
Static components such as crow's nests and masts were modeled and textured with a few variations that the particle system would choose from and then add scale variation in x, y and z during assembly, so no two components were the exact same shape. This gave the artists a lot of variation in each boat while keeping with the general design. This system also allowed for the combining of both animated and static components on each boat, and having them react to each other. For example, Thinking Particles would vary the rocking of the boats, which would affect the swaying of the beam and sail while keeping them attached to the mast. Using a particle system made it easy to randomize and control the large number of objects.
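The rule-based assembly Crosby describes can be sketched in Python (a hypothetical stand-in for the actual Thinking Particles setup; the component names, variant counts and scale ranges here are illustrative, not production values):

```python
import random

# Illustrative component slots and pre-modeled variants; the real system
# chose among textured 3ds Max assets via Thinking Particles rules.
COMPONENT_VARIANTS = {
    "hull": ["hull_a", "hull_b", "hull_c"],
    "mast": ["mast_a", "mast_b"],
    "crows_nest": ["nest_a", "nest_b", "nest_c"],
    "sail": ["sail_a", "sail_b"],
}

def assemble_boat(seed):
    """Rule-based assembly: pick a variant per slot, then jitter its
    scale independently in x, y and z so no two components are the
    exact same shape."""
    rng = random.Random(seed)  # one deterministic stream per boat
    boat = {}
    for slot, variants in COMPONENT_VARIANTS.items():
        boat[slot] = {
            "variant": rng.choice(variants),
            "scale_xyz": tuple(rng.uniform(0.9, 1.1) for _ in range(3)),
        }
    return boat

# Propagate a fleet of 2,500 boats; seeding per boat keeps every
# render re-run reproducible while still looking hand-crafted.
fleet = [assemble_boat(seed) for seed in range(2500)]
```

Seeding each boat from its index is one simple way to get the "unique but repeatable" behavior the article implies; any per-boat deterministic seed would serve the same purpose.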
In addition, the entire CG sequence of 2,500 boats was populated with soldiers -- roughly 70,000 in all -- who needed to be seen performing on the ships. The scene also featured fire and smoke elements, CG explosions and boat damage from impact.
Prime Focus also created a Massive pipeline specifically for the film. The artists were able to integrate and utilize the AI-driven crowd simulation software in a unique way that sped up the production process without sacrificing quality. For the characters, Prime Focus had about five weeks to set up a Massive pipeline and get the 70,000 soldiers animated over 2,500 boats. To do this, the team used Massive to animate various boat crews. Thinking Particles was then used to modify each crew's animation and then propagate these crews throughout the fleet.
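The crew-propagation idea — animate a handful of base crews in Massive, then vary and distribute them across the fleet — can be sketched as follows. This is a hypothetical illustration; the crew names, offset range and speed range are invented for the example, and the real variation was driven by Thinking Particles, not Python:

```python
import random

# A few base crews animated in Massive (placeholder names); the particle
# system then varied and propagated copies across all 2,500 boats.
BASE_CREWS = ["crew_rowing", "crew_archers", "crew_fighting"]

def assign_crew(boat_id, max_offset=48):
    """Pick a base crew plus a per-boat frame offset and playback speed,
    so repeated crews don't move in lockstep."""
    rng = random.Random(boat_id)  # deterministic per boat
    return {
        "boat": boat_id,
        "crew": rng.choice(BASE_CREWS),
        "frame_offset": rng.randrange(max_offset),
        "speed": rng.uniform(0.9, 1.1),
    }

assignments = [assign_crew(i) for i in range(2500)]
# Roughly 70,000 soldiers over 2,500 boats averages ~28 soldiers per boat.
```

Time-offsetting and retiming shared animation cycles is a standard way to hide repetition in large crowds; the article indicates Prime Focus achieved this by having Thinking Particles modify each propagated crew.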
CG models were imported from Maya into 3ds Max, in which the bulk of the 3D work was done. Compositing was done with Fusion. "With so many elements shadowing, reflecting and overlapping each other, it made breaking the renders down into smaller chunks difficult," Crosby suggests. "Using conventional instanced geometry wasn't a good option because we wanted each boat and crew to look somewhat unique. Cebas Final Render allowed us to vary scale, speed, start frames, textures and other things on instanced geometry. Thinking Particles was used to semi-randomly pick from these instances to build each unique boat. This allowed us to render everything except the water in single passes without exceeding the RAM limit, which saved a tremendous amount of time and allowed us to assemble and update scenes very quickly."
Bill Desowitz is senior editor of AWN & VFXWorld.