MPC Taps Maxon Cinema 4D for Skyfall-Inspired Spot for Sony
Parker started with a 3D scan of Craig’s head provided by MPC London, the studio producing visual effects for the new James Bond film, Skyfall. Using PolyFX, he could control each polygon of the model with standard MoGraph effectors. “I built a ton of animated texture maps in After Effects and used the shader effector to control the timing and direction of how the polygons transformed,” he explains.
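The shader-effector trick can be sketched in a few lines: the luminance of an animated map determines how far along each polygon is in its transformation, so a sweeping gradient makes the effect wash across the model over time. The ramp function and grid below are illustrative stand-ins, not MPC's actual maps or geometry.

```python
# Sketch of driving per-polygon animation with a map's luminance,
# the way a shader effector samples an animated texture.
# luminance_ramp is a hypothetical stand-in for an AE texture map.

def luminance_ramp(u, v, t):
    """A gradient sweeping across u over time t: 0 = untouched, 1 = done."""
    return max(0.0, min(1.0, t - u))

def polygon_offsets(grid_w, grid_h, t, max_offset=2.0):
    """Per-polygon displacement, scaled by the map's luminance at each cell."""
    offsets = []
    for j in range(grid_h):
        for i in range(grid_w):
            u = i / (grid_w - 1)
            v = j / (grid_h - 1)
            offsets.append(luminance_ramp(u, v, t) * max_offset)
    return offsets

print(polygon_offsets(3, 2, 0.0))  # nothing has transformed yet
print(polygon_offsets(3, 2, 2.0))  # every polygon fully displaced
```

Because the map is sampled per polygon, redrawing the gradient in After Effects redirects the whole transformation without touching the 3D scene.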
To create the sense of the data organically resolving itself into the head, Parker built a series of thin cubes in a cloner that was controlled by a plain effector with a linear falloff. “As the cubes approached the 3D head, they would grow to full size, and as they continued past the head, they would shrink to zero,” he explains.
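The plain-effector falloff Parker describes is essentially a triangle function of a cube's distance from the head: full scale at the head plane, tapering linearly to zero on either side. A minimal sketch, with illustrative values:

```python
# Sketch of a linear falloff on cube scale: a cube racing along one
# axis grows to full size as it nears the head plane (head_x) and
# shrinks back to zero once it has passed. Values are illustrative.

def cube_scale(x, head_x=0.0, falloff=5.0):
    """Scale in [0, 1]: 1.0 at the head plane, 0.0 at +/- falloff units."""
    d = abs(x - head_x)
    return max(0.0, 1.0 - d / falloff)

for x in (-10.0, -2.5, 0.0, 2.5, 10.0):
    print(x, cube_scale(x))
```

Animating `x` from one side of the head to the other reproduces the grow-then-shrink motion the cloner cubes perform in the spot.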
“Then, I put that cloner and the 3D head model into a boolean object, so when the cubes raced past the head, the boolean would form a negative space where the head would be.” He then exported the animation from C4D as an OBJ sequence and imported it into After Effects using the Plexus plug-in. Parker used Plexus to form the connected web of lines and nodes that animates across the head. All of these rendered layers were then given to Munkowitz, who crafted the dense, elaborate reconstruction imagery.
Early in the spot, the mystery woman uses her smartphone to pinpoint the location of the intruder. She watches the screen as a wide shot of the city quickly zooms in to a red dot right outside the door of her building. Munkowitz wanted a point-cloud representation of the city, but flying to London to take LIDAR scans of the Fishmonger’s Hall building featured in the ad wasn’t in the budget.
Instead, MPC’s 3D department modeled Fishmonger’s Hall and the surrounding London area, and helped track and model other elements in the scene. Parker used Thinking Particles’ PMatterWaves node to project particles onto the surface geometry from various light sources, letting him control where the stationary particles were sprayed across the city scene.
“I put spotlights aiming at the front, side, and top so I could get a volumetric representation of the city from each of the X, Y, and Z axes,” Parker explains. “Wherever the light didn’t shine, no particles would be created, and this replicated the LIDAR look perfectly.”
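The three-light trick amounts to a depth pass from each axis: a ray marches into the scene and deposits a particle on the first solid cell it hits, so occluded surfaces stay empty, just as a real LIDAR scan only captures what the scanner can see. A minimal sketch on a toy occupancy grid standing in for the city geometry:

```python
# Sketch of axis-aligned "light" projection onto an occupancy grid.
# Each of three lights (front, side, top) fires one ray per cell of
# its facing plane; the first solid cell hit receives a particle and
# shadows everything behind it, mimicking the LIDAR look.

def scatter_particles(solid, n):
    """solid: set of (x, y, z) occupied cells; n: grid size per axis."""
    particles = set()
    axes = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]  # front, side, top lights
    for dx, dy, dz in axes:
        for a in range(n):
            for b in range(n):
                for t in range(n):  # march the ray into the scene
                    cell = (t if dx else a,
                            t if dy else (a if dx else b),
                            t if dz else b)
                    if cell in solid:
                        particles.add(cell)  # lit surface gets a particle
                        break                # cells behind it are shadowed
    return particles

# A 2x2x2 block inside a 4-cell grid: only its three lit faces
# receive particles; the far corner cell stays dark.
block = {(x, y, z) for x in (1, 2) for y in (1, 2) for z in (1, 2)}
print(len(scatter_particles(block, 4)))
```

Only the faces turned toward a light are populated, which is exactly the partial, scanned-from-a-viewpoint quality that sells the effect as LIDAR rather than as complete geometry.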
The goal was to make the phone appear to be generating data from the geometry in real time, says Munkowitz. “Nav rendered out a bunch of assets for us to work with, and I, again, took it into After Effects and put it together,” Munkowitz explains. “That’s the nice thing about how we work together; he gives me all the raw assets I need to construct a compelling final image.”
These days, the duo is working together on the sci-fi thriller Oblivion starring Tom Cruise. “GMUNK and I have got a great rhythm going here,” Parker says. “It’s a great mix of film editors, composers, post-viz artists and sound effects mixers all in the same space. If we want to show the director something, we just say, ‘Hey, Joe [Kosinski], look at this.’”