
'Journey to the Center of the Earth': 3-D Coming at Ya!

J. Paul Peszko tells why it was an extremely challenging Journey to the Center of the Earth for the vfx studios.


Journey to the Center of the Earth offers a plethora of stereoscopic vfx, including virtual environments and fully CG characters. All images © MMVII New Line Productions, INC. and Walden Media, LLC.

Eric Brevig's remake of the Jules Verne classic, Journey to the Center of the Earth (opening today from New Line), is not only chock full of vfx but also represents the first big live-action test case for 3-D.

There are virtual environments, fully CG creatures, atmospheric effects, water simulations, set extensions, digital doubles and complex live-action integration. And for those fortunate enough to catch the film in digital 3-D, the stereoscopic impact is considerable -- though the vfx team found achieving it extremely challenging.

No fewer than five visual effects firms worked on the production. All five -- Meteor Studios (now defunct), Hybride (now sold to Ubisoft), Frantic Films, Mokko and Rodeo FX -- did CG environments, complex simulation work and stereo compositing. In addition, Meteor did CG creatures (the dinosaur). Hybride also did flowing water (river), lava and creatures (carnivorous plants, glowing bird and dandelions). Frantic Films did water (ocean) and creatures (flying fish, water dinosaur).

Coordinating their efforts and overseeing all the visual effects work on the film was a monumental task, one handled quite capably by overall visual effects supervisor Chris Townsend, who worked hand-in-hand with Brevig (the vfx supervisor turned director) to ensure that all the effects synced up in the stereo format.

"Everything had to work in stereo," Townsend stated. "All the tricks we are so used to using in the world of feature film visual effects in a mono world, had to be reconsidered."

That proved to be, as one might expect, the most challenging aspect of the production.

"In 2-D, if you need to show that an object is a hundred feet away, elevating the black levels, reducing its scale, maybe blurring a little to imply depth cueing, works. These are the tricks we know. Working in stereo -- all that goes out the window. In stereo the object literally has to be placed a hundred feet away, in the virtual world. That means you have to create a CG 3-D environment, which is created to scale -- for every shot. There is no cheating; the rulebook has changed. Those tricks, those sleights of hand, with which we are so familiar, they disappear. If it doesn't work in stereo, it doesn't work. That was the biggest challenge: learning that the way we work has changed."

Everything from composition of a shot within the third dimension to integrating live-action elements so that they seamlessly fit within a scene had to work not only from a 2-D aesthetic point of view but also from a stereo perspective.

"Often we worked with sub pixel accuracy to ensure that a splash happened upon the surface of an ocean rather than floating above, or a bluescreen foot walked upon a CG plane rather than under it. The level of detail required to ensure that the audience wouldn't think, 'Hey, there's something wrong about that image,' was intense. We look at the world in stereo, that's our life. Creating a film in stereo takes us one step closer to that reality that we know so well. You don't need to be a visual effects expert to tell if something is wrong in stereo. You just need to have stereo vision! However, understanding the problem and knowing how to solve it does require a different level of expertise."

So how did Townsend and the rest of the visual effects team solve the problem?

All of the vfx sequences were prevised by Persistence of Vision, including some in stereo.

"As we photographed the entire film in stereo, using specially designed rigs holding two cameras, ensuring that the two 'eyes' matched was another challenge. Lenses are physical things which, even though they are built to pretty tight tolerances, are never identical. The two images created can be off in scale, rotation, vertically misaligned, have a focus or depth of field mismatch, have a different exposure. All these things need to be adjusted so that the images are normalized with the only variation between the two being a horizontal offset, like your eyes. Otherwise, viewers will have a hard time resolving the two images. Even if they can, eyestrain and headaches will ensue. Not good for a full-length feature film!"

All of the vfx sequences were prevised by Persistence of Vision. "One sequence, the floating rocks, was also prevised in stereo, which allowed us to test out some of our thoughts on the interocular distances (how far apart the lenses should be) and convergence plains. We did an early test of an actress in a cave environment, shooting with the stereo cameras, to put everything through a dry run, to test out stereo pipelines in some of the facilities and to do some (image) development in stereo."

As for the software being used on the production, since there were multiple vfx firms involved, each used their own platforms.

"Each had to be enhanced to create compatibility with stereo. Some were tuned with proprietary GUIs to try to simplify working with two corresponding eyes. For others it was just a matter of designing a stereo workflow.

Frantic Films VFX's main contribution was the CG ocean for the raft storm sequence, which included rain, fully articulated creatures, water surface simulation, water spray dynamics and blowing mist. Courtesy of Frantic Films.

"On the set we used hollow cubes and grided markers to help the camera tracking in post, but rig and lens metadata was also encoded into the DPX frames. This information was then used by most vendors to further simplify the camera tracking process."

Townsend also credits FrameCycler as a viewing tool of major importance.

"Being able to view shots in stereo was key. Initially I reviewed work on a monitor, using Iridas' FrameCycler using active shutter glasses. The ability to analyze full resolution shots in realtime, in stereo, in my own time, without the constraints of having to book a theater, was incredibly important. Discerning what is right and wrong about a stereo image is complex. There are so many aspects, which can cause an image to be wrong. Is an element within the image flipped, misaligned, out of synch, in the wrong stereo space? Are the left and right eyes of different exposure? Are there photographic anomalies (lens flares, blur, dust, etc) that appear in one eye but not the other? All these things have to be studied in order to move the shot along, from a stereo point of view. And that doesn't even take into account the aesthetics of the shot itself; that was a whole other ball game."

Another viewing tool that Townsend used was QuVIS' Acuity software.

"Having viewed the work on a monitor, I then reviewed the work on a 23-foot screen, using dual projectors, passive circular polarized glasses and Acuity's QuVis software. This allowed me to look at the work from an audience's perspective and to examine the stereo space more accurately."

Townsend admits that working in stereo is working in a whole other dimension. "It relies on techniques that we, as an industry, are only just learning, but the future promises to be richer and far more immersive, from an audience's point of view."

Frantic Films VFX, a division of Prime Focus Group, served as a lead visual effects provider on the film. The filmmakers came to Frantic because of the studio's expertise in creating believable digital water effects using its proprietary fluid simulation suite, Flood.

Frantic also created three digital characters end-to-end -- the Razorfish (above), Plesiosaur and Trilobite -- using a flexible character pipeline using 3ds Max. Courtesy of Frantic Films.

Frantic also created three digital characters end-to-end -- the Razorfish, Plesiosaur and Trilobite -- developing a flexible character pipeline using Autodesk 3ds Max. Custom plug-ins were scripted to manage data interchange from rigging to animation, modeling and lighting. This non-linear approach provided a more practical workflow. In the event that changes were called for, the team didn't have to halt the entire production pipeline.

The Frantic team was led by Vancouver Visual Effects Supervisor Chris Harvey, Winnipeg Visual Effects Supervisor Mike Shand, Visual Effects Producer Randal Shore and Head of Software Mark Wiebe.

"Our main contribution was the CG ocean for the raft storm sequence, which included rain and fully articulated creatures," Shand said. "There were vast amounts of water surface simulation, water spray dynamics and blowing mist. We also did many set extensions for the various beach sequences. In addition, we created vfx for the opening nightmare sequence of the film, including the giant trilobite and collapsing cave environment."

Frantic completed five sequences for the film. Of these, the largest and most important was the 4-1/2-minute raft storm sequence, which comprised 123 shots in total. All shots featured a live-action raft and actors in a completely CG environment, including a 100% CG subterranean ocean.

In that sequence there were upward of 100 fish and up to seven plesiosaurs all jumping out of the water. The actors were shot against greenscreen on an articulated raft set piece, with the CG Razorfish, Plesiosaurs, water, storms, a sail and the bioluminescent glow the fish give off underwater all composited into the live-action plate. And while the scene was shot with on-set rain pouring down, Frantic rotoscoped out a significant portion of the rain and recreated it in CG to ensure the rain across the entire scene was seamlessly consistent.

With the Raft scene featuring an all-digital ocean, ensuring that the liquid simulations were realistic was key. To do this, Frantic Films used its in-house Flood toolset consisting of Flood: Surf, Flood: Spray and Flood: Core, all of which integrate seamlessly with each other and with 3ds Max.

Frantic decided to work in 3-D the whole way through.

Flood: Surf was used to create the overall ocean surface. While building it, a medium-resolution viewport display from Flood: Surf gave the artists an essential interactive view of the water. When rendered via the NVIDIA Gelato renderer, the surface then included all the finer subpixel displacement necessary to create a believable liquid surface.

When characters or objects would interact with the ocean surface, Flood: Core was used to provide the gross displacement of the water and was then combined with the ocean surface simulations. Then, for the significant interaction shots that resulted in particles and spray, Flood: Spray was built to simulate these key splashes and interactive events.

Prior to writing Flood: Spray, the studio evaluated several tools and, in the end, made the choice to write its own particle simulator from scratch. Wiebe designed a custom particle-based fluid solver as well as surface blending tools, allowing the Flood simulation to blend back to the ocean surface in a controllable manner.
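The "blend back to the ocean surface in a controllable manner" idea is easiest to picture with heightfields: mix a localized interaction displacement into the global ocean with a falloff weight so the splash dies out smoothly into the base swell. The toy numpy sketch below shows only that concept; Flood's actual solver and blending tools are proprietary and certainly far more involved.

```python
# Toy heightfield blend: a base ocean surface plus a localized interaction
# displacement, faded back to the ocean with a radial falloff. Conceptual
# only -- no relation to Flood's real implementation.
import numpy as np

def blended_surface(base_height, interaction_height, center, radius, falloff=2.0):
    """base_height, interaction_height: 2-D arrays of the same shape."""
    h, w = base_height.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - center[0], ys - center[1])
    # Weight is 1 at the splash centre and eases to 0 at `radius`.
    weight = np.clip(1.0 - dist / radius, 0.0, 1.0) ** falloff
    return base_height + weight * interaction_height

# Example: a calm swell with a rough splash column added near (120, 80).
ocean = np.sin(np.linspace(0, 8 * np.pi, 256))[None, :] * 0.2 * np.ones((256, 256))
splash = np.random.default_rng(0).normal(0.0, 0.5, (256, 256))
surface = blended_surface(ocean, splash, center=(120, 80), radius=40)
```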

"I think by far the most challenging aspect is the fact that most of the traditional 2-D compositing tricks don't work in stereo," Shand said. "There is usually a lot you can do in 2-D to finesse a shot, fix CG issues and whatever. However, in stereo it needs to be done in 3-D so that the element is grounded correctly in z-space and has depth within itself."

Some of the vfx facilities on Journey to the Center of the Earth opted for a 2-D workflow until converting to a 3-D workflow at the very end, but Frantic decided to work in 3-D the whole way through.

Frantic even built two screening rooms for the film in Vancouver and Winnipeg so it could play back everything in 3-D.

"We looked at the various ways we could composite the show," Shand explained. "We considered having the second eye auto generated through advanced scripting, which would require some additional artist tweaking in the end. However, very early on we received some test footage to play with. We found that our compositing package Eyeon Fusion could easily handle the memory requirements of compositing both eyes simultaneously.

"We also found that if we paired the eyes into a single image, that it made managing all the elements far simpler for the compositor. So, if you were to apply a color correct blur or whatever, it would be applied to each eye evenly, and at any time the shot could be rendered no matter what the state and screened in stereo. If a stereo problem was found, then it was very quick and simple for the artist to find the problem in the flow, since both eyes remained together the whole way through.

"This was very important because our actors were surrounded by CG rain and splashes, and so a lot of work went into seating everything into the right stereo depth. We created a variety of custom tools within Eyeon Fusion that gave us the ability to view the shots in anaglyph stereo at any stage in the composite. This gave the artists the ability to assess their shots in stereo at their workstations before we would screen them. In the end, we were very satisfied with this approach."

Mokko's Marc Rousseau observes that compositing in a non-stereo feature is all about cheating, while in the stereo world, the basics have to be re-mastered.

Frantic also did custom development to facilitate how the metadata translated to the actual camera rig to simplify final 3-D renders. The 3-D camera systems used to shoot Journey were equipped to change their interocular distance dynamically, making the process of tracking and then applying a basic offset to the second camera impossible. Fortunately, the cameras recorded additional metadata that captured all of the animated interocular movements. Frantic wrote tools to extract this data and used it to generate the second camera. The tools also allowed for some additional tweaking to correct imperfections in the information recovered from the footage.
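The reconstruction Frantic describes boils down to this: take the tracked left camera for each frame, read the recorded interocular for that frame, and offset along the camera's local x-axis, with room for a manual tweak where the metadata is imperfect. Here is a simplified sketch of that per-frame offset; the data layout and values are hypothetical, not Frantic's actual tools or formats.

```python
# Simplified picture of deriving the right-eye camera from the tracked left
# camera plus per-frame interocular metadata. Data layout is hypothetical.
import numpy as np

def right_camera(left_cam_to_world, interocular, tweak=0.0):
    """left_cam_to_world: 4x4 camera-to-world matrix for this frame.
    interocular: recorded lens separation for this frame (scene units).
    tweak: optional manual correction for imperfect metadata."""
    offset = np.eye(4)
    offset[0, 3] = interocular + tweak      # shift along the camera's local x
    # Local-space translation: apply the offset before the camera transform.
    return left_cam_to_world @ offset

# The interocular was animated on set, so a constant offset would be wrong --
# read the recorded value for every frame.
frames = {1001: 0.045, 1002: 0.047, 1003: 0.050}   # hypothetical metadata
left = np.eye(4)                                   # tracked left camera, frame 1001
right = right_camera(left, frames[1001])
```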

Marc Rousseau, the vfx supervisor from Mokko, discussed their compositing contributions: "Compositing in a non-stereo feature is all about cheating. Cheat perspective, cheat distances, cheat rotopaint, etc. In the stereo world, this is not possible at all. So everything we have been working really hard at mastering over the last 15 years is now obsolete. So, we then needed to go back to the basics of compositing and CG and forget all the cheats we have come to use."

Hybride created 234 stereoscopic vfx shots for the film using SOFTIMAGE|XSI. A total of 80 Hybride employees contributed to the vfx production for more than 15 months in order to create the various digital visual effects. Character and object animation included "glowbirds," skull and snapping plants; interior and exterior environments included diamond chamber, volcano, grotto, lagoon, mushroom grove and thermal river; while organic effects included embers, lava, smoke, water fumes and dandelions.

Pierre Raymond, Hybride's visual effects producer and supervisor, singled out interaction with a 3-D environment as the most problematic aspect.

"Our biggest challenge was to have live actors and various objects interact in 3-D stereoscopic. Not only was it imperative for the animation to look natural, realistic and fluid, but it also had to interact perfectly with the actors' actions in a three-dimensional environment. Another challenge was striking the right balance between stereoscopic images and visual comfort throughout all 234 shots. And so, it was with these parameters in mind that we approached this daring project, which allowed us to utilize all of our departments and know-how."

All the vfx principals who worked on Journey agreed having live actors and objects interact in 3-D stereoscopic was the biggest challenge.

Aaron Dem, the former vp of production for Meteor and now president of production for Lumiere VFX, said they worked on several sequences. "We did full CG virtual environments for all of these sequences. We did effects animations throughout these sequences. We did the dinosaur. For the Mine Ride, [there was] a two-minute CG sequence and full environment that we did. We also did the Mine Crash and the Mine Entrance. Those were full CG environments as well.

"Challenges working on a stereo 3-D film were multiple. One was basically manipulating the stereoscopic images when Eric wanted the photography repositioned. We had to redo a whole layout and then figure out all the camera information and the 3-D information and then re-track that, replace it in a full layout and then re-render all the elements. Obviously the tracking was difficult and the rotopainting was difficult. The tracking primarily was the most difficult part.

"We built a proprietary software to deal with the lens issues within Maya and also we received some other data from the camera so we used that as well to get us started on our pre-tracking. Then we used 3D Equalizer. They [3D Equalizer] worked hand-in-hand with Meteor to develop new tools as well to tackle the challenges that we had to deal with in stereoscopic.

"The project was challenging because it was the first live-action stereoscopic film produced with the new technologies. So, being on set for almost all the shoot, you got to see first hand how the camera technology worked and processed daily. We'd shoot something then go into a screening room to make sure it looked great.

"Just the process of re-lighting stereo shots for a film of this magnitude was definitely a challenge because Eric had a big toolbox to work with, and he definitely wanted to use every tool in his possession. So, it was a challenge to keep up with him, and it was a pleasure to work with him. I think at the end of the day, it'll prove the new technology is here to stay, and it's a great theatrical experience."

J. Paul Peszko is a freelance writer and screenwriter living in Los Angeles. He writes various features and reviews, as well as short fiction. He has a feature comedy in development and has just completed his second novel. When he isn't writing, he teaches communications courses.
