Prometheus: Bringing Alien into the 21st Century
Prometheus marks Ridley Scott's return to sci-fi after three decades and offers a whole new take on the Alien franchise. In fact, the quasi Alien prequel represents the director's full embrace of CG and his first foray into 3-D. Richard Stammers of MPC served as production visual effects supervisor and was proud to help usher Scott into the 21st century of VFX and 3-D.
"Ridley wanted to shoot it like a 2-D film and still rely on all the regular depth cues that he's used over the years, with lots of atmosphere and smoke and haze," Stammers explains. As for CG, Scott relied on it only when necessary, and to enhance the facial performances of the various alien creatures.
Stammers divided the VFX among various studios. MPC, the lead company, created the planet environments, space shots and spaceships. I spoke with Weta Digital, which worked on the alien creatures, and with Rising Sun Pictures, which primarily pitched in with some comp work on a key sequence, the Storm Rescue.
Weta created four aliens: the Engineer, a tiny trilobite, a 14-foot squid-like creature, and the iconic alien from the earlier movies. The Engineer, however, was by far the most interesting and challenging. He is the catalyst that sets the story in motion, turning out to be the mysterious Space Jockey from Alien and the key to the whole evolutionary cycle that binds humans to the aliens.
"We ran some tests [on the Engineer] to basically convince Ridley that we could do better than prosthetics," suggests Martin Hill, Weta Digital's VFX supervisor. "He's like an Adonis, the perfect humanoid with white skin. They had a maquette built, which Ridley shot and lit. So we built the same bust from scratch and replicated the lighting and the skin quality and the translucency, but we made him move and made him articulate with blinks and expressions. On the basis of that Ridley decided to go digital.
"Ridley wanted to get as much in camera as possible, so it was very much the antithesis of a virtual studio in a way. On set, he had an actor completely made up with silicone over his whole body, which he shot for the non-visual-effects shots. And that presented a bit of a challenge for us because, if we want to make a visual effects creature, we add musculature and make it as physically correct as possible. But, of course, we have this slight dilemma here. We need to match the on-set Engineer as well as other creatures later on and make something convincing and compelling and obviously very real. And so we built this digital Engineer and there are some interesting compromises. What we're actually representing is an actor, and what we found straight away was that we can make a digital humanoid with pretty convincing skin. We've advanced the technique since Avatar for our subsurface algorithms. But trying to replicate the human with the extra silicone on it was a completely different situation."
They actually carved vein patterns into slabs of silicone to get it right. And that presented a whole new series of challenges involving new subsurface algorithms. "To represent a very translucent piece of silicone, you want to increase the depth of all your subsurface," Hill continues. "And the problem is that you lose any sense of internal structure. The light bleeding in the Engineer was so deep that he started to look waxy, so we had to advance all our technology to be able to put inner structures within our subsurface. This way we got a sense of the bone or cartilage inside the nose and the bones in the fingers that would actually block internal light. We added extensions to the quantized-diffusion model for rendering translucent materials that was presented last year at SIGGRAPH by Eugene d'Eon and Geoffrey Irving."
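The trade-off Hill describes can be sketched in one dimension: subsurface scattering blurs incoming light with a diffusion profile, and the deeper the scattering, the more that blur washes out internal detail, which is the "waxy" look. Placing an absorbing occluder (standing in for bone or cartilage) in the light path restores a sense of structure even with deep scattering. This is purely an illustrative sketch, not Weta's renderer: the full quantized-diffusion profile is replaced here with a simple normalized exponential falloff, and all names and parameters are invented for the demonstration.

```python
import numpy as np

def diffusion_profile(x, ld):
    """Toy 1-D scattering profile: exponential falloff with mean
    free path ld, standing in for a full quantized-diffusion model."""
    return np.exp(-np.abs(x) / ld) / (2.0 * ld)

def scatter(signal, ld, dx=0.1):
    """Blur incoming light with the diffusion profile. Larger ld
    means deeper scattering and a smoother, waxier result."""
    x = np.arange(-50, 51) * dx
    kernel = diffusion_profile(x, ld)
    kernel /= kernel.sum()                # normalize the discrete kernel
    return np.convolve(signal, kernel, mode="same")

# Incoming light over a strip of skin, with a bright band in the middle
light = np.zeros(200)
light[90:110] = 1.0

shallow = scatter(light, ld=0.2)   # shallow scattering: band stays sharp
deep = scatter(light, ld=5.0)      # deep scattering: band spreads, looks waxy

# An internal occluder (bone/cartilage) absorbs light before it diffuses,
# so structure survives even under deep scattering
occluder = np.ones(200)
occluder[95:105] = 0.1             # hypothetical bone absorbing most light
structured = scatter(light * occluder, ld=5.0)
```

Comparing `shallow`, `deep` and `structured` shows the effect Hill describes: the deep profile flattens the bright band toward a uniform glow, while the occluded version keeps a visible dark core where the "bone" sits.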