Eric Post takes us on a ride through some of the summer tools that are setting new standards for vfx excellence.
Summer is here, and so are the vfx-intensive movies, which rely heavily on a diverse arsenal of tools to get the job done. There are some new developments, such as Luxology's modo now being used at Pixar, where it helped power WALL•E and the Axiom. Massive is becoming more and more ubiquitous, of course, as are Fusion and Nuke on the compositing side. But here is just a glimpse of how a few of these tools are being utilized in fresh and challenging ways.
Speed Racer and Global Illumination
Speed Racer may have gone down in flames at the box office, but it arguably remains the boldest experiment of the season. Haarm-Pieter (HP) Duiker, Digital Domain's CG supervisor on the project, talked about the software used to create the "hyper realism" effect for the cars and the racing environments. Digital Domain used about 50 lighters and about 50 compositors on Speed Racer. In all, about 200 people worked on the making of the next-gen feature by the Wachowskis. Maya and Houdini were the two main packages used. Compositing was done in Nuke (which DD developed and later turned over to The Foundry). But the star of the software lineup was, of course, mental ray from mental images (now owned by NVIDIA).
Data gathering for Speed Racer began with parking a Corvette in a parking lot and shooting the car and gray balls from multiple angles to thoroughly capture the lighting conditions. Back at the studio, Duiker's team drew on some of the lighting, rendering and global illumination techniques developed for The Matrix sequels. Using advanced photogrammetry and commercial tools, the captured data was fed into mental ray to begin lighting the car models.
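The gray balls mentioned above are a standard image-based lighting reference: because the ball's reflectance is known, pixel values read off it can be turned into an estimate of the light arriving on set. As a hedged illustration of that idea only (the function name, pixel values and workflow here are invented, not Digital Domain's pipeline), a minimal sketch might look like this:

```python
# Hypothetical sketch: estimating incident light from a gray-ball photo.
# All sample values are illustrative assumptions, not production data.

GRAY_BALL_REFLECTANCE = 0.18  # standard 18% gray reference

def incident_light(ball_pixels, reflectance=GRAY_BALL_REFLECTANCE):
    """Average the linear RGB samples read off the ball, then divide out
    its known reflectance to estimate the light arriving at that spot."""
    n = len(ball_pixels)
    avg = [sum(p[c] for p in ball_pixels) / n for c in range(3)]
    return [channel / reflectance for channel in avg]

# A few made-up linear RGB samples read off the ball in one view:
samples = [(0.09, 0.085, 0.08), (0.10, 0.09, 0.085), (0.095, 0.088, 0.082)]
print(incident_light(samples))
```

In practice a survey like the one described would gather many such references from multiple angles, but the core arithmetic of the gray-ball trick is just this division by known reflectance.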
A real challenge on Speed Racer was combining the reflective and diffuse components of the cars' surfaces in a believable way. Added to this was the challenge of staging races on football stadium-sized sets and integrating more than 100 cars.
Duiker said mental ray was the renderer of choice because of its very open API, which lets artists build their own libraries. "Digital Domain built an entire suite of shaders from the ground up into what we consider to be the most advanced global illumination tools today."
One goal for Speed Racer was to push the cars beyond realism into hyper-realism. Ray tracing was used for secondary illumination and diffuse bounce. That process, combined with pushing the colors past the naturalistic, is what made Speed Racer a truly unique work of art.
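Secondary illumination of the kind described means gathering light that arrives indirectly, after bouncing off nearby surfaces. A toy Monte Carlo sketch of the idea (the geometry, colors and 25% wall coverage are all invented for illustration, and this stands in for real ray/scene intersection):

```python
import random

# Toy Monte Carlo sketch of secondary (bounce) illumination: a shading
# point gathers indirect light by shooting random hemisphere rays; rays
# that "hit" a red wall pick up its bounced color, the rest return sky.
# Not mental ray or any production shader -- a hedged illustration only.

def sample_bounce(n_rays=10_000, seed=1):
    rng = random.Random(seed)
    wall_color = (0.8, 0.1, 0.1)   # light bounced off a red wall
    sky_color = (0.3, 0.4, 0.6)    # light arriving directly from the sky
    total = [0.0, 0.0, 0.0]
    for _ in range(n_rays):
        # crude stand-in for ray/scene intersection: assume ~25% of the
        # hemisphere is covered by the wall in this toy setup
        hit_wall = rng.random() < 0.25
        color = wall_color if hit_wall else sky_color
        for c in range(3):
            total[c] += color[c]
    return [t / n_rays for t in total]  # averaged indirect contribution

print(sample_bounce())
```

The averaged result converges toward a blend of the two sources weighted by how much of the hemisphere each occupies, which is exactly the color bleeding that makes ray-traced bounce light read as real.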
FrameCycler Goes on a Stereoscopic Journey
In 2003, Iridas was approached by several movie studios about a stereoscopic version of its renowned FrameCycler. In 2007, the stereo features became a standard part of the software, available via an unlock code for those who wanted to buy them. Without a stereo review tool, you would have to watch your sequence with the left eye, then the right eye, and try to match everything up. Is the parallax right for the object to sit in front of the bush, or does it look like the object is behind the bush? Stereoscopic FrameCycler identifies these issues right away.
Color grading is also important for near and distant objects. The Dolby projection system supports color grading, and FrameCycler works with Dolby. So those minute right eye/left eye differences that we take for granted in the real world can be handled easily in FrameCycler.
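The parallax question raised above comes down to the sign of the horizontal offset between the two eyes' views of the same feature. As a hedged sketch of that check (the function, coordinates and naming are illustrative, not FrameCycler internals):

```python
# Sketch of a stereo parallax check: for a feature matched in both eyes,
# the sign of the horizontal disparity tells you whether it reads behind
# the screen, in front of it, or at screen depth. Illustrative only.

def screen_depth(left_x, right_x):
    """Classify apparent depth from horizontal parallax. Positive parallax
    (the right-eye image shifted right of the left-eye image) reads behind
    the screen; negative reads in front; zero sits at screen depth."""
    parallax = right_x - left_x
    if parallax > 0:
        return "behind screen"
    if parallax < 0:
        return "in front of screen"
    return "at screen depth"

# The bush sits at screen depth; does the object read in front of it?
print(screen_depth(left_x=0.52, right_x=0.48))  # → in front of screen
```

A side-by-side stereo review tool makes exactly this kind of mismatch visible at a glance, instead of forcing the artist to eyeball each eye separately.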
Jump to Journey to the Center of the Earth 3D (opening July 11 from New Line). It is one of the summer movies that relied heavily on FrameCycler for shots mixing stereoscopic 3-D and composited imagery.
Vfx work for Journey to the Center of the Earth was done at Montreal facilities, all of which used FrameCycler with DualStream for artist review and digital dailies.
Meteor Studios, which closed down earlier this year, ran Iridas' review application on Linux machines. "All of our artists used FrameCycler Professional for stereo review," says Francis Provencher, who is now technology supervisor at a new studio, LumiereFX in Montreal. "Our compositors and trackers used it with stereo goggles. Trackers 'placed the cameras' and tracked camera movement so that the CG content was properly positioned in the shots."
"Doing CGI in stereo is easy, but when you add the plates, it gets tricky," added Marc Rousseau, vfx supervisor at Mokko Studio, which runs FrameCycler on Windows. "It is a complicated process making the plates match. Our compositors couldn't use the same 'cheats' that they have with single stream content. It's like they had to re-learn how to composite.
"Obviously, we couldn't have done this without DualStream. It works great: you load the right eye in FrameCycler and the left eye follows automatically. FrameCycler with DualStream is a powerful tool, and we found we could catch 90% of stereo issues right there on the small screen."
Rhythm & Hues and Sonic Cannons
Nathan Ortiz is one of the vfx supervisors at Rhythm & Hues who worked on Universal's The Incredible Hulk and The Mummy: Tomb of the Dragon Emperor (opening Aug. 1). Houdini, along with in-house plug-ins, is the big toolset for those movies. Ahab is a proprietary plug-in for fluid simulations, CloudTank is the proprietary volumetric renderer and Felt is a custom volumetric modeling tool.
With this arsenal, R&H went about doing what they like to do -- destruction. Scenes such as tearing the police car in half and using the halves for gloves or blowing up tanks are the kind of visual effects R&H excels at.
Ortiz and co. leaned on many aspects of Houdini, especially DOPs (Dynamic Operators, Houdini's rigid body dynamics context), to create the scene where the Hulk is pinned by the sonic cannon. The goal was to create the effect of an invisible force that could blow things up and also pin the Hulk to the ground, and to make the scene so realistic that you couldn't tell the difference between the real world and the CG world.
To make a scene look real, the team started with real-world physics. Calculations determined how much force it would take to move the nine-foot-tall Hulk and pin him. That same force turned out to be enough to tear clothing and even deform and tear skin. Marvel Comics, of course, wants the Hulk to be stronger than that, so adjustments were made so the Hulk could withstand that kind of real-world physics.
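The back-of-the-envelope flavor of that calculation is just Newton's second law. A hedged sketch, where every number is an assumption of ours (the article gives neither the Hulk's mass nor the cannon's output):

```python
# Illustrative real-world physics check, in the spirit described above.
# The mass and acceleration figures are invented assumptions, not
# Rhythm & Hues' actual numbers.

def force_to_accelerate(mass_kg, accel_ms2):
    """Newton's second law: F = m * a."""
    return mass_kg * accel_ms2

hulk_mass = 600.0          # assumed mass for a nine-foot figure, in kg
pin_accel = 2 * 9.81       # assumed: pinning him takes about 2 g of push
force = force_to_accelerate(hulk_mass, pin_accel)
print(f"{force:.0f} N")
```

Whatever the real figures were, the point the team made stands: a force big enough to shove a creature that size around is also big enough to shred cloth and deform flesh, which is why the character had to be tuned past real-world physics.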
Houdini did not need much tweaking: its particle system was sufficient to make the sonic force look like an invisible and dangerous weapon. A custom operator node in Houdini, built with VEX (Vector Expression) operators, timed the collisions so the sonic cannon was very accurate on impact. At the moment of impact, a number of events had to fire simultaneously: the clothing had to tear, dirt and dust blasted around, grass moved, paint flecks flew, and all the other detail that goes with an explosion/impact had to work at precisely the right time. One of the team members said the sonic cannon "looks like being hit with a thousand leaf blowers."
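The timing idea described, several effects registered against one impact moment so they all fire on the same tick, can be sketched in plain terms. This is a hypothetical Python illustration of the scheduling concept only, not Houdini's actual DOPs/VEX machinery, and the class and frame numbers are our invention:

```python
# Hypothetical sketch: register several effects (cloth tear, dust burst,
# grass motion, paint flecks) against a single impact frame so they all
# trigger together. Illustrates the timing concept, not Houdini itself.

class ImpactTrigger:
    def __init__(self, impact_frame):
        self.impact_frame = impact_frame
        self.events = []  # (name, callback) pairs

    def register(self, name, callback):
        self.events.append((name, callback))

    def step(self, frame):
        """Advance one frame; fire every registered event exactly on impact."""
        if frame != self.impact_frame:
            return []
        for _, callback in self.events:
            callback()
        return [name for name, _ in self.events]

trigger = ImpactTrigger(impact_frame=48)
for fx in ("cloth tear", "dust burst", "grass motion", "paint flecks"):
    trigger.register(fx, lambda: None)

for frame in (47, 48, 49):
    print(frame, trigger.step(frame))
```

Keying everything to one shared impact frame, rather than timing each effect independently, is what keeps a complex hit reading as a single physical event.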
Marvel was very conscious of how much of the movie played by real-world physics and how much by Hulk's world, where the laws of physics are different. That explains why the force of the sonic cannon did not vaporize his skin yet still gave the illusion of a massive impact.
The majority of Hulk was rendered in R&H's proprietary Wren renderer. The sonic cannon was a combination of Wren and CloudTank, with Houdini's Mantra renderer used for some elements as well. ZBrush and Mudbox also found their way into the movie.
Wren has proprietary subsurface skin textures and shaders, an area where R&H has done a lot of development.
Now that Ortiz and his team are confident that they can destroy vehicles realistically, the next horizon is to scale it up and keep pushing the amount of pyrotechnics and levels of destruction.
Summer Fun for Autodesk
Sylwan pointed out that Speed Racer has continued a stylistic trend in next-gen movies that we are likely to see more of in the future: a "hyper-reality" that pushes visual limits rather than big explosions and other CG mayhem.
Meanwhile, Star Wars: The Clone Wars is relying heavily on an Autodesk pipeline to blast off into fresh animated territory. The 3D-animated series from Lucasfilm Animation launches as a feature on Aug. 15 from Warner Bros. before debuting in the fall as a series on Cartoon Network.
Not surprisingly, Clone Wars is all Maya, with some other Autodesk tools utilized as part of the robust studio pipeline.
The tools of the 2008 summer season are advancing rapidly in their ability to create lifelike vfx, promising new realities and hyper-realities in the world of cinema.
Eric Post is an attorney, journalist, computer graphic artist, helicopter pilot/mechanic and a former pastor. Although he is a traditional artist, he enjoys modeling and landscape scenes in CG and uses various applications for medical illustrations at his office. From 2004-2006, Post was senior technical editor for the Renderosity magazine and e-zine. From 2006-2007, Post served as a staff writer for the Renderosity Front Page News, as well as having edited for various Renderosity publications.