'Prometheus': Bringing 'Alien' into the 21st Century

Weta and Rising Sun Pictures help Ridley Scott on his sci-fi return.

Prometheus marks Ridley Scott's return to sci-fi after three decades and offers a whole new take on the Alien franchise. In fact, the quasi-Alien prequel represents the director's full embrace of CG and his first foray into 3-D. Richard Stammers of MPC served as production visual effects supervisor and was proud to help usher Scott into the 21st century of VFX and 3-D.

"Ridley wanted to shoot it like a 2-D film and still rely on all the regular depth cues that he's used to using over the years with lots of atmosphere and smoke and haze," Stammers explains. As for the approach to CG, Scott only relied on it when necessary and to enhance the facial performances of the various alien creatures.

Stammers divided the VFX among various studios. MPC, the lead company, created the planet environments, space shots and spaceships. I spoke with Weta Digital, which worked on the alien creatures, and Rising Sun Pictures, which primarily pitched in with comp work on a key sequence, the Storm Rescue.

Weta created four aliens: the Engineer, a tiny trilobite, a 14-foot squid-like creature, and the iconic alien from the earlier movies. The Engineer, however, was by far the most interesting and challenging. He's the catalyst that sets the story in motion, turning out to be the mysterious Space Jockey from Alien and the key to the whole evolutionary cycle that binds humans to the aliens.

"We ran some tests [on the Engineer] to basically convince Ridley that we could do better than prosthetics," suggests Martin Hill, Weta Digital's VFX supervisor. "He's like an Adonis, the perfect humanoid with white skin. They had a maquette built, which Ridley shot and lit. So we built the same bust from scratch and replicated the lighting and the skin quality and the translucency, but we made him move and made him articulate with blinks and expressions. On the basis of that Ridley decided to go digital.

"Ridley wanted to get as much in camera as possible, so it was very much the antithesis of a virtual studio in a way. On set, he had an actor completely made up with silicon over his whole body, which he shot for non-visual effects. And that presented a bit of a challenge for us because, if we want to make a visual creature, we add musculature and make it as physically correct as possible. But, of course, we have this slight dilemma here. We need to match the onset Engineer as well as other creatures later on and make something convincing and compelling and obviously very real. And so we built this digital Engineer and there are some interesting compromises. What we're actually representing is an actor and what we found straight away was that we can make a digital humanoid with pretty convincing skin. We've advanced the technique since Avatar for our subsurface algorithms. But trying to replicate the human with the extra silicon on it was a completely different situation."

The Engineer presented new challenges for Weta, requiring new subsurface algorithms to overcome a waxy look.

They actually carved vein patterns into slabs of silicone to get it right. And that presented a whole new series of challenges involving new subsurface algorithms. "To represent a very translucent piece of silicone, you want to increase the depth of all your subsurface," Hill continues. "And the problem is that you lose any sense of internal structure. The light bleeding through the Engineer was so deep that he started to look waxy, so we had to advance all our technology to be able to put inner structures within our subsurface. This way we got a sense of the bone or cartilage inside the nose and the bones in the fingers that would actually block internal light. We added extensions to the quantized-diffusion model for rendering translucent materials that was presented last year at SIGGRAPH by Eugene d'Eon and Geoffrey Irving."
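For readers who want to picture the technique, here is a minimal, generic sketch of the idea, not Weta's implementation: a diffusion-style subsurface profile is widened for a translucent silicone-like material, and its deep, wide scattering lobes are attenuated by a hypothetical internal occluder such as bone. The sum-of-Gaussians profile, weights and occlusion rule are all illustrative assumptions.

```python
import numpy as np

def gaussian(r, v):
    # Planar 2-D Gaussian of variance v, a standard building block for
    # approximating diffusion profiles as a sum of Gaussians
    return np.exp(-r**2 / (2.0 * v)) / (2.0 * np.pi * v)

def subsurface_response(r, depth_scale, bone_depth):
    """Scattered-light falloff at radius r from the illumination point.

    depth_scale widens the profile for a more translucent, silicone-like
    material; bone_depth is the depth of a hypothetical internal occluder.
    Without the occlusion term, large depth_scale values wash out all
    internal structure -- the 'waxy' look Hill describes."""
    weights = (0.55, 0.30, 0.15)                        # illustrative
    variances = [v * depth_scale for v in (0.02, 0.2, 1.5)]
    total = 0.0
    for w, v in zip(weights, variances):
        # The deep, wide lobes are the ones an internal occluder blocks
        reach = np.sqrt(v)
        occlusion = min(reach / bone_depth, 1.0)
        total += w * gaussian(r, v) * (1.0 - occlusion)
    return total
```

Over open flesh a large bone_depth leaves the profile untouched; over the nose or fingers a shallow one cuts the deepest lobes, restoring the sense of structure under the skin.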

Prometheus contains a very CSI-like opening in which we zoom into one of the veins in the Engineer's arm, which required Weta to get very close to the digital model. "So we needed to get our skin detail to [an appropriate] level. The veins pulsate beneath the skin, but it also offered a lot of scope for transitioning into different material qualities."

Indeed, there's a merging of alien and human DNA in a very organic sequence. After carving the pieces of silicone, Weta pumped oil, water, ink and all sorts of materials through the veins in the silicone structure, then backlit and filmed it. "This gave us a really good library of very natural motion for blood coursing through the veins," Hill adds. "Ridley very much wanted motion that was like a flock of starlings when the character disintegrates. We have particle effects coming off the Engineer as he's decaying."
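Starling-murmuration motion of the kind Hill describes is conventionally driven with boids-style flocking rules; the sketch below shows the classic recipe (alignment, cohesion, separation) in generic form, standing in for whatever proprietary system Weta actually used, which the article doesn't detail. All the weights are illustrative.

```python
import numpy as np

def flock_step(pos, vel, dt=0.04, neighbor_r=2.0,
               w_align=0.6, w_cohere=0.3, w_separate=1.2):
    """One step of classic boids flocking: each particle steers by its
    neighbors, producing the swirling, starling-like group motion."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        offsets = pos - pos[i]
        dist = np.linalg.norm(offsets, axis=1)
        near = (dist < neighbor_r) & (dist > 0.0)
        if not near.any():
            continue
        # Alignment: match the average heading of nearby particles
        acc[i] += w_align * (vel[near].mean(axis=0) - vel[i])
        # Cohesion: drift toward the local center of mass
        acc[i] += w_cohere * (pos[near].mean(axis=0) - pos[i])
        # Separation: push away from particles that crowd too close
        acc[i] -= w_separate * (offsets[near] / dist[near][:, None]**2).sum(axis=0)
    vel = vel + acc * dt
    return pos + vel * dt, vel

# Usage: a few hundred particles shed from the disintegrating figure
pos = np.random.randn(300, 3)
vel = np.random.randn(300, 3) * 0.1
for _ in range(100):
    pos, vel = flock_step(pos, vel)
```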

Weta used mostly Maya for deformation and sculpting and Mari for textures. Nuke processed most of the filmed elements, which were then transferred to animated maps for the textures, shaders and deformers. Weta also applied Tissue, its proprietary muscle system, along with subsurface, rendering and lighting tools that extend the spherical harmonics work developed for Avatar.
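As a rough illustration of that element-to-texture step, a bare-bones Nuke Python script for baking a filmed element down to an EXR texture sequence might look like the following; the file paths, frame range and grade value are hypothetical, and Weta's actual pipeline is certainly far more involved.

```python
import nuke

# Read a backlit vein element (hypothetical path and frame range)
read = nuke.nodes.Read(file="elements/veins_backlit.%04d.exr",
                       first=1, last=240)

# Balance the plate before baking it down as an animated texture
grade = nuke.nodes.Grade()
grade["white"].setValue(1.4)       # illustrative gain only
grade.setInput(0, read)

# Write the processed frames out as a texture sequence
write = nuke.nodes.Write(file="textures/veins_anim.%04d.exr")
write["file_type"].setValue("exr")
write.setInput(0, grade)

nuke.execute(write, 1, 240)        # render frames 1-240
```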

The medpod sequence with the trilobite was tricky as well, involving 2.5D techniques to enhance Noomi Rapace's belly motion, with bits of alien elbow poking at her from inside. "Then we had to digitally recreate all of the medpod tools, such as a laser that cuts across her stomach and spreaders that come in and open her up," Hill says. "Then we had to create an alien that matched an on-set alien, articulated within a placenta all covered in goo and blood, that gets pulled out of her belly. We had to matchmove everything, and stereo makes it hard to get something so organic accurate."
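The "2.5D" enhancement here generally means warping the live-action plate with an animated displacement rather than replacing it with full CG. Below is a minimal sketch of that idea; the bump map, strength value and NumPy/SciPy approach are illustrative assumptions, not the production setup.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_plate(plate, bump, strength=6.0):
    """2.5D-style plate warp: slide pixels along the gradient of an
    animated bump map, e.g. to exaggerate something pushing against
    skin from underneath. plate is HxWxC, bump is HxW."""
    gy, gx = np.gradient(bump.astype(np.float32))
    ys, xs = np.mgrid[0:plate.shape[0], 0:plate.shape[1]].astype(np.float32)
    # Sample the plate "uphill" of each pixel so bumps appear to bulge
    coords = [ys - strength * gy, xs - strength * gx]
    return np.dstack([map_coordinates(plate[..., c], coords, order=1)
                      for c in range(plate.shape[2])])
```

Animating the bump map frame by frame lets an otherwise static belly appear to ripple and distend without ever leaving the photographed plate.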

The ultra alien had to match a puppet as well. "We also had to add a lot of extra design features so we could articulate the mouth the way he wanted," Hill offers. "Ridley was referencing the way a goblin shark's inner mouth works, which is quite different from the original alien. It was fantastic to get in there with fresh reference and to do some extra design work."

Meanwhile, Rising Sun was called on for some extra comp work in the Storm Rescue sequence. The Australia-based company used Nuke, Ocula, 3DEqualizer and Final Cut Pro.

For the Storm Rescue, Rising Sun added stereo particles to a stereo plate that already contained particles provided by MPC.

"We were primarily adding stereo particles into a stereo plate with particles already in it with people hanging off wires," explains Rising Sun's visual effects supervisor, Tim Crosbie. "Having existing particles already in the plate was the right way to go. I think they were doubling or tripling the amount of particles in each shot. When we first saw the storm sequence, it was one of those mental challenges to figure out how best to tackle it. You have to obviously take those wires out without taking the particles out. The pipeline we had set up previously for stereo shows was one of those form follows function things. But stereo is now mature enough that there are no big surprises anymore."

--

Bill Desowitz is former senior editor of AWN and VFXWorld and currently Crafts Editor of IndieWire. He's the owner of the Immersed in Movies blog (www.billdesowitz.com), a regular contributor to Thompson on Hollywood at Indiewire and author of James Bond Unmasked (www.jamesbondunmasked.com), which chronicles the 50-year evolution of 007 on screen and features interviews with all six actors.
