Janet Hetherington chats with top vfx experts and creative professionals about how new technology is making big ideas possible and affordable for commercials, music videos and the Web.
In the beginning was the idea.
In the case of Motorola's commercial spot to promote its PEBL phone, the idea was nothing less than a realistic and fantastic depiction of the dawn of time. While time travel was out of the question, other technology offered solutions that could make the commercial come to life.
"Technology was an important part of this commercial right from the start," says Ed Ulbrich, svp of production for Digital Domain, which worked with visionary director David Fincher on the PEBL spot. "The concept of a million years of evolution was impossible to shoot, so there was no other way to go. Using CG, we could get the shots we wanted; the spot is almost entirely CG, except for the phone and the person at the end. We are able to create highly sophisticated images, and these images cut through the clutter of TV. The technology is liberating. The bar is set very high right now."
Ulbrich credits the success of the spot to a 14-year working relationship with Fincher, who is constantly pushing the virtual envelope. "David Fincher's attitude is, if you don't have to physically build it or craft it, then why do it? He was doing previs before it was cool to do previs."
Digital Domain's war chest of proprietary tools from The Day After Tomorrow allowed the firm to complete the PEBL project in some eight weeks with 24 people. Ulbrich also notes that using vfx technology provides a way to "tell big stories with way less money."
A Virtual Cast of Thousands
Indeed, technology offers solutions to achieving big effects with a smaller budget, and that includes the cost of performers.
Huge crowd scenes were needed for Carlton Draught's high-concept, award-winning Big Ad (for Foster's Australia). Animal Logic, Australia, created 3D computer-generated human extras for all the large aerial crowd shots, clothed them in flowing robes and then brought them into Massive's award-winning specialized crowd simulation software. They were then replicated thousands of times over, with each character assigned its own random movement and direction.
Animal Logic technical directors then added behavioral controls and performance parameters that allowed the digital humans to respond to their environment and to the actions of the people surrounding them. This process created the realistic simulation of thousands of human extras choreographed into the mountain valley in which the ad was set.
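Massive's actual agent brains are far more sophisticated, but the core idea described above (one character template replicated thousands of times, each copy with its own random heading, plus simple behavioral rules that react to nearby agents) can be sketched in a few lines. All names here are illustrative and assume nothing about Massive's real API:

```python
import math
import random

class Agent:
    """A toy autonomous crowd agent: it wanders with its own random
    heading, but steers away from neighbors that crowd its personal space."""

    def __init__(self, x, y, rng):
        self.x, self.y = x, y
        # Each replicated character gets its own random movement and direction.
        self.heading = rng.uniform(0.0, 2.0 * math.pi)
        self.rng = rng

    def step(self, neighbors, speed=1.0, personal_space=2.0):
        # Behavioral control: sum up repulsion from agents that are too close.
        ax = ay = 0.0
        for other in neighbors:
            dx, dy = self.x - other.x, self.y - other.y
            dist = math.hypot(dx, dy)
            if 0.0 < dist < personal_space:
                ax += dx / dist
                ay += dy / dist
        if ax or ay:
            self.heading = math.atan2(ay, ax)            # respond to the crowd
        else:
            self.heading += self.rng.uniform(-0.3, 0.3)  # random wander
        self.x += speed * math.cos(self.heading)
        self.y += speed * math.sin(self.heading)

def simulate(n=50, steps=100, seed=1):
    """Replicate the agent template n times and run the simulation."""
    rng = random.Random(seed)
    agents = [Agent(rng.uniform(0, 20), rng.uniform(0, 20), rng) for _ in range(n)]
    for _ in range(steps):
        for agent in agents:
            agent.step([b for b in agents if b is not agent])
    return agents
```

Even this crude sketch shows why the technique scales: adding extras is a matter of raising `n`, not hiring more performers.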
"Some jobs are conceived with a particular technique in mind, whereas other spots are written with no idea how they will be achieved," comments Andrew Jackson, an Animal Logic vfx supervisor. "I think the writers of the Big Ad did have the Massive software in mind when they came up with the idea."
About a month prior to the shoot, 3D previs began to determine the camera angles of the crowd shots and the number of people that would appear in each shot making up the various shapes. The environments for the ad were shot on location in New Zealand. As well as integrating the 3D crowds into these environments, Animal Logic was responsible for creating matte paintings to enhance the backgrounds, giving them a more epic, mountainous feel.
"Today, commercial concepts are much more ambitious because of the freedom technology offers," notes Bill Watral, lead effects td, Charlex. "In the '90s, a large crowd shot would have been prohibitively expensive. Today, simulated crowds offer a cost-effective alternative."
Watral says that Charlex was able to supply a spot for a Verizon Wireless campaign that involved crowds of people and vehicles. "A live-action shoot of 10,000 people turning and walking down a city street, mixed in with trucks and other vehicles, would be an enormous undertaking and be extremely expensive," Watral explains. "At Charlex, we recently supplied a shot just like this in a little over a week using Massive and proprietary software. We were able to get the shot previsualized, animated, textured, rendered and composited with about 15 artists. Most of them only worked on it briefly."
"We were able to turn the spot around very quickly and for a much lower cost than a live-action shoot," Watral insists. "We used a combination of proprietary software and Massive's autonomous agent 3D animation software to achieve a quality product in record time."
"The technology is there, and more and more studios are jumping on board and using crowds done in Massive to populate scenes," agrees Jordi Bares of The Mill, London. "It is becoming a vital tool."
When asked how cost-effective that tool has become, Bares replies, "Well, if you consider hiring 500 extras for 10 days, renting a big stadium for 10 days and having the whole crew for 10 days, the money rapidly adds up to crazy numbers. We can do the same job, while also freeing up the camera to be placed anywhere in the scene, for a fraction of the cost. But I insist the real benefit with something like a scene done in Massive is that suddenly the director does not have any constraints when moving the camera. There's no locked camera, no zoom limitations, no multiple passes."
"The PlayStation 2 Mountain commercial was the first spot that had us producing virtual crowds in Massive," Bares says. "The concept called for thousands of characters climbing, running and piling on top of each other in Brazil, and our approach was to use Massive to create and render 148,000 people and beef up the group of 500 actors filmed on set." Some other interesting past ads that feature Massive crowds in stadiums, going through city streets, etc. include The Other Game for Nike, Chess for Arca Ex and a campaign for KNOW HIV/AIDS.
Bares says that The Mill uses autonomous agent crowd software from Massive not necessarily because it helps fill up a stadium, but because it gives the director freedom to move the camera. "Production people love this approach because it is not only cheaper to do huge crowds virtually in post, but it also makes the director happy at the same time," adds Bares.
However, technology offers more than just crowd control. "We recently worked on a commercial for Thorntons called Save My Bacon," Bares says. "It is fully CG and directed by Ruairi Robinson. The spot features a turkey, a farmer, his wife and the fatal destiny the turkey will face on Christmas. The challenges were many, mainly the size of the project and the level of complexity of the work, from feathers to cloth, to hair, skin, lighting and delicate simulation of lens effects. The reward was establishing a bond and level of trust with the director that can hopefully be used on future collaborations."
Reinventing the Picture
Sylvain Taillon, managing partner/exec producer, Topix, Toronto, makes the observation that technology is both extremely important and somewhat irrelevant when it comes to addressing a specific project.
"Technology is really just a set of tools that we draw from, but a specific technique should never drive the creative vision in and of itself," Taillon says. "When a vision, a challenge or a problem presents itself, our approach is always to first envision what would be the best imagery to serve the communication objectives. We try to think first in creative terms rather than technical terms. And then we look into what the best methods, techniques and tools would be to achieve our vision. So, yes, technology is extremely important in that it helps realize the vision. The goal should never be shaped by technology first."
Still, technology comes in handy. Taillon recalls a commercial that the company completed last year for the Canadian Tourism Commission. "It's a good representation of how CG, matte painting and compositing can come together to serve a vision," he says.
Taillon explains that the commercial features a transition between the stress of city life and the peacefulness of a Muskoka setting. The spot starts with various shots of people in offices, waiting rooms and an apartment, all obviously affected by the stuffiness and greyness of their surroundings. Finally, the camera, set outside the windows of an apartment, follows one woman as she walks through her home and finds herself, now wearing a different outfit, walking past the brick wall of the building straight onto a dock over a calm lake to finally sit on her Muskoka chair to take in the scenery.
"We were originally asked to create the transition, a simple articulated split screen, really, and add some beautiful cottages across the lake," Taillon says. "But on the shoot day, nature wasn't so co-operative: the lake was very wavy and the sky was a menacing gray. So we ended up close-cutting (rotoscoping) our lady and the dock and essentially replacing the rest of the environment with a computer-generated lake and matte painting elements for the other side of the lake, and we even placed a canoe gently moving across the lake's surface."
"This kind of undertaking, without the latest tools (we wrote our own water shaders), would've been practically impossible. In order not to get bogged down by the physical reality of a shot, we try to think of it all as pixels on a screen first, then break down the elements needed to recreate the desired image, and we rely on various tools to create the components of that reality. CG water, painted horizon lines, careful compositing, color correction and a lot of patience and attention to detail were all we needed to turn a stormy day into a picture-postcard finish to a hectic day."
BLIND has used technology to the fullest when working on spots such as the 2005 MLB All Star Promo. "The challenge was to produce a photoreal pinball machine that not only looked cool to the average baseball fan but also noticeably functioned properly to the pinball aficionado," says Santino Sladavic, exec producer, BLIND. "The reward was working with an actual pinball table designer, the finished product and the delight of our client."
"The technology allows us a tremendous amount of creative freedom in exploring new ideas and approaches that would have been cost prohibitive a few years ago," adds BLIND creative director Chris Do. "In conceiving the idea for Fox MLB, we were able to pre-visualize a lot of our camera moves and angles with a 3D camera. By the time we were ready to shoot, we had everything figured out."
"It also allows us to create tighter and more visually specific animatic treatments," adds creative director Tom Koh. "We can seamlessly move from storyboard development into a refined animatic to help us better illustrate our ideas with clients."
In another spot called Callaway, BLIND's challenge was to create a dynamic commercial illustrating the design of Callaway's new driver and its internal components. "We were able to use the models, direct from Callaway's engineering team, to create an animation with the actual product. In addition, we had the flexibility to create a photorealistic animation of the club, which we could change on the fly at any point in our production schedule," says Koh.
For NFC Playoffs on Fox, BLIND was given the task of creating a realistic environment depicting an epic football battle scene. "Since the entire environment was created in CG, we had the ability to composite the athletes in a scene where we could control all variables, from the height of the grass on the field to the amount of turbulence in the winds," Koh explains.
The Medium is the Message
Jay Lichtman, vfx producer for Brickyard VFX, says, "All the technology is there to push ideas. Many people want to use technology but don't know how to do it or what it entails. This has given us, as a vfx studio, the opportunity to become more involved than ever."
Lichtman adds that CG is factoring into more and more projects at Brickyard, so much so that the company just added a dedicated CG department. He says, "The difference is that whereas 15, or even five, years ago, you were doing digital effects because you could do it, now you are doing it because the project says you should do it. CG is now used in much the same way a camera or actor is used; it being the only logical way to create a desired effect."
Brickyard's full-time CG department can be scaled up depending on the job. "A couple of recent CG-intensive jobs saw us expanding to nine or 10 people," Lichtman says. "Our CG artists are extremely well versed in scripting and software, so they can be productive even with a smaller number of artists and still compete."
Not surprisingly, high-def (HD) is very much in demand. "We have done four jobs in HD just in the last month," Lichtman continues. "Working in HD is a challenge due to file size, resolution and the way the work is split among the team. We finished a Super Bowl XL ad called Sports Heaven for Mobile ESPN that was a major project. This spot has sports icons (athletes, racecars, football teams, marching bands) appearing all over a city. There were tons of CG elements and composites, everything from jets to buildings to ski lifts to a city-park-sized Heisman trophy."
"At the same time we were also working on a Winter Olympics campaign, doing things like CG water, mountains and snow in one spot and virtually creating a CG world in another. The sheer volume and diversity of the CG was an interesting challenge for sure."
Brickyard uses HD-capable Discreet Flame and Smoke systems for compositing, color correction and image treatment, boujou by 2d3 for 3D tracking, Maya for 3D, and RenderMan and mental ray for rendering. "Every new version of software, every faster hardware processor makes things easier," Lichtman says. "Also, it's great that the platforms are opening up to be more customizable." Lichtman notes that it's getting tougher to wow the target markets with computer visual effects. "The audience doesn't want to see CG just for the sake of it," he says. "We need more than that. Everyone's gotten more savvy; people are used to looking at CG on computers, televisions and movie screens. Everyone's looking for perfection, and perfection doesn't necessarily mean any old effects; rather, smart effects."
In fact, today's smart effects technology permits the impossible to become possible. When Bell Canada wanted talking beavers for a long-term product campaign, it turned to CG. "After all, we couldn't train real beavers," jokes Benoit Bessette, vp, national client leader (Cossette), who handles the Bell account.
Bessette says that Frank and Gordon, voiced by actors Norm MacDonald (Frank) and Kenny Campbell (Gordon), are really two pals who just happen to be beavers. "We wanted the spots to be humorous," Bessette says. "It was important for Bell to be perceived as less corporate and more approachable."
To create the CG beavers, Bell and Cossette turned to Buzz Image Inc. "All the Bell Beavers spots were done on a pretty tight schedule," advises Alexandre Lafortune, senior CGI artist. For its flexibility in post-production, SOFTIMAGE|XSI was used as the main 3D software for modeling, hair shading, rigging, animation and rendering.
Throughout the project, all performances and lip-synching were hand-animated by a team of five to 10 animators. "No motion capture was involved," Lafortune says. "A small team of three to four persons handled the lighting (XSI on PC hardware) and pre-compositing (Apple Shake on PC). Vfx supervisor Benoît Brière brought back references from the set (pictures, mirror-ball environment, etc.), which helped the lighting and compositing teams achieve the final integration. Some of the live-action plates have a moving camera. For those shots, we used 3D tracking software (2d3 boujou) to recreate the camera move inside XSI. Final compositing was done on a Discreet platform."
The Bell Beavers TV commercials first launched in Quebec, and having animated spokes-critters proved to be a boon. The dialogue for the Quebec ads could be targeted to the French-speaking market, and the beavers themselves have different names (Jules and Bertrand), voiced by different actors (Patrick Robitaille and Laurent Paquin, respectively). The spots have since launched throughout Canada in English with Frank and Gordon, who even have their own website, with three successful launch spots and five product spots.
The HP Flea Circus spot for the web also features technology-enhanced creatures. Bent Image Lab created CG fleas using an unusual method called stereolithography that merges the real and unreal, combining stop motion and 3D. The company's STL process allows artists to create something in the computer, print it out, photocopy a series of sheets, use them as live props, and then reintegrate them into the computer.
"We built a CG flea, built a 3D prop in CG, printed out the prop, then matched it back in," says Bent Image director/partner David Daniels. "The props became objects, and with CG, we were able to match shadows. The process is virtual, then real, then virtual."
The spot features three fleas (a leader, a tall one and a fat one), and the original story had them jumping off a cliff. Conceived as an edgy web spot, the original premise was intended to be more violent and sexually oriented. "Whoever was in charge of advertising at Hewlett-Packard changed, and suddenly the spot had to go on the corporate web page," Daniels says. "It turned into yet another commercial. It had to be softened up." The spot took about 10 weeks to complete, with about 20 people working on the project.
Bent Image Lab utilizes a wide variety of animation techniques, many of them mixed media, including 2D, 3D, clay, foam, stylized live action, and Bent's own Strata-Cut and STL processes. While there is a lot of experimentation going on at the Lab, Daniels feels that computers and CG are opening up a new future for the more traditional stop-motion animation. "CG is a great chameleon," Daniels adds. "It's a magic box that becomes anything you want."
In the Pink
The great chameleon has become the darling of music video makers as well, and stars want the cutting-edge technology. "Five years ago, computers were about one-tenth the speed that they are today, and things took 10 times as long to do," says Bert Yukich, visual effects supervisor, KromA. "As a result, effects that were inconceivable five years ago, because they would have taken too long, are now very real possibilities."
"We are currently working on a Pink video that involves CG matte paintings and CG page turns," Yukich says. "In the past, we might have had to do the matte paintings as simple 2D effects, but now, with today's technology, we are able to create a full 3D environment in about the same time."
"Only the biggest artists can afford five minutes of high-level CG," Yukich adds. "Although the cost of doing effects like that is coming down, music video budgets are generally shrinking. To some extent they are meeting in the middle, and you are seeing a somewhat greater use of CG."
KromA has created full CG environments for a number of videos. A recent example is the Rob Thomas video, Lonely No More. In that instance, it involved a relatively large budget and a huge artist who wanted to do something that hadn't been done before.
Yukich says that Lonely No More was shot on a greenscreen stage. "All of the walls, ceilings, floors and most of the furniture were CG," he says. "The biggest challenge was creating a look. I had to play the role of production designer and create and dress the CG sets. The sets animate throughout the video: the walls flip and the furniture rotates in 3D space. It was more efficient to create it all from scratch."
The hardware used by KromA is mostly Windows-based PCs. "We have about 100 or so of them, ranging from single-processor 3GHz P4s for our render farms to dual-, quad- and even 16-way machines for our workstations," Yukich says.
The software used for compositing is a hardware/software system called Avid DS Nitris. "We chose it over Discreet and Quantel platforms because of its paint tools and its cache feature," advises Yukich. "We use SOFTIMAGE|XSI for 3D. It interfaces well with the DS because it is made by the same manufacturer, making it easier for our artists to use both programs. It is the only combination of compositing and 3D tools with that advantage. XSI is one of the top 3D programs available. It is more expensive than some other programs, but it has certain speed advantages that make it worthwhile."
As for matching sound to picture, Yukich notes, "It is very easy to sync CG with music. We have some filters that will automatically sync animation to a musical rhythm."
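Yukich doesn't describe how those filters work, but the arithmetic behind beat-syncing is straightforward: tempo and frame rate together determine which frames the beats land on, and those frames become keyframe candidates. A minimal sketch (the function name and defaults are illustrative, not KromA's actual tool):

```python
def beat_frames(bpm, fps=24, duration_s=10.0):
    """Return the frame numbers on which musical beats land, for keying
    animation events to a track's rhythm."""
    frames_per_beat = fps * 60.0 / bpm   # e.g. 120 bpm at 24 fps -> 12 frames
    total_frames = duration_s * fps
    frames, t = [], 0.0
    while t < total_frames:
        frames.append(round(t))          # round, since beats rarely fall on whole frames
        t += frames_per_beat
    return frames
```

A page turn or wall flip keyed at each of these frames will land on the beat; accumulating a float and rounding per beat avoids the drift you would get from rounding the beat interval once up front.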
Yukich says that a less obvious result of advances in technology is that they have made beauty work a standard production tool. "Again, at an earlier time, extensive beauty work wasn't practical for videos because that part of the process could take a week alone," Yukich says. "Today, image enhancements can be done quickly and with a lot of subtlety and sophistication; as a result, it is something we now do all the time."
"Everyone wants their effects to look photoreal," Yukich comments. "Our goal is to make the effects look as good as photographic elements. The only time we vary from that is when the director wants a certain look or feel."
Photorealism was never the goal of the U2 music video for the song Original of the Species, with effects by Spontaneous, New York. "Original of the Species was conceived around the idea of having a 3D head as the centerpiece of the video," says Lawrence Nimrichter, director of animation/associate creative director, Spontaneous. "While the design and style were never meant to be photoreal, we knew that we wanted to animate a CG Bono singing."
"When the director came to us, we looked at the advancements that Softimage was making with Face Robot when it came to facial animation, and specifically its ability to work with motion capture (MoCap), and we knew this was a great opportunity for our creative concept and this advancement in technology to walk hand in hand," Nimrichter says. "Otherwise, it would have been a painstaking process to animate Bono singing with realistic motion, and the schedule just did not allow it."
Nimrichter recalls that the concept for the video came to the director, Catherine Owens, while working on in-concert animations for the song. The band saw some early animation tests for the showpiece and liked the concept of the full-screen generated faces so much that they wanted to use it for the music video of the song. This established a germ of an idea that evolved over the following weeks.
"Before the job was given the full greenlight," Nimrichter says, "we started looking into how to pull off the job. The schedule was very tight and there would be a lot of 3D faces to create. Initially, there was supposed to be no animation on most of the faces; they were supposed to be more death-mask-like. But we knew we had to animate Bono. We had seen the Face Robot samples from Softimage and contacted them to see if that would be a viable way of pulling off Bono. They thought it would be possible, so we presented Catherine the idea of doing a 3D laser scan of the band members and mocapping Bono's performance, while using Face Robot to give the MoCap a very natural look. She liked the idea and got the band into it as well."
The catch was the band's crazy schedule: in the middle of their Vertigo tour, they left Spontaneous only four days to logistically gather all the necessary elements. "We needed to do the MoCap and laser scan in Chicago to coincide with the band's schedule, and this included getting the guys from Softimage there to sign off on the MoCap data for Face Robot. We had absolutely no time to do any tests or R&D. We got the greenlight and had to dive in."
"To create the heads, we had the guys at XYZRGB come to Chicago with a laser scanning system. They scanned each of the band members and gave us the scanned meshes, along with RGB and normal maps. Based on the supplied meshes from XYZRGB, we quickly modeled the characters to be animated with Face Robot, which now included a female face for some of the shots. For the more artistic sections of the video, the raw scan data of the band members were used."
Nimrichter says that the most challenging part of this video was getting the Bono sequences approved. "Since he is a rock star, every camera angle, the model and all of the animation were scrutinized over and over again. Our Bono was stylized in wireframe, which helped, but we had to go through many versions before getting him approved. If they weren't totally happy with the wireframe animated Bono, they would not have released the piece at all."
"We did have the chance to work with the band, and it was a great experience. They wanted to make a video that was not so typically commercialized. The band wanted it to be more of an art piece, and this whole project, right from the start, was a very collaborative creative process."
"When our creative team at Spontaneous shipped the final pieces, we did get a really nice bit of feedback from U2. We went to see them play in person when their Vertigo tour was in New York, and before they played Original of the Species, Bono thanked us on stage. It felt very good to know they appreciated the crazy hours of work that went into creating this video."
Motion capture has found a place in music videos, but it's also cropping up in other productions. "Motion capture has become very dependable and is being used on all ends of production, from commercials to some of the biggest feature film projects out today," suggests Scott Gagain, exec producer, House of Moves. "We've been capturing body movements for years now, so we're moving into things like full performance capture to record the actor's entire performance: hands, face and body, all at once."
"I personally think that the commercial arena is the perfect place to test out new technologies. Our clients can push the envelope and capture extreme details. They can build concepts around the power of having an actor's total performance captured digitally, available in 360 degrees."
"We are working on a number of commercials where the companies are using motion capture to lend realism to CG characters in ads. For PSYOP and McDonald's, motion capture done last year at House of Moves allowed the client to bring the full-body performances and facial expressions of a cast of actors to line-drawn 2D characters. We have captured dancers, wrestlers, martial artists, musicians, animals and all manner of props. Even on the most complex shoot involving multiple performers, we are now able to deliver data that accurately reflects reality and ultimately allows clients to bring realism to their CG work."
"Recently we worked on a commercial that utilized our full performance capture methodology. The client was able to bring a creature to life using the motions of the actor hired to play the role. These motions were then retargeted to the character's geometry so that the captured motions would fit and drive a digital character of much different proportion and size. Years ago, the idea of capturing the face, body and hands simultaneously was unheard of. Now we do it with ease, with multiple subjects, all while capturing in realtime."
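The retargeting step Gagain describes, fitting captured motion onto a character of different proportions, can be illustrated in its simplest form: joint rotations are copied across unchanged, while translations are scaled by the ratio of the characters' bone lengths. This is a hypothetical sketch of the idea, not House of Moves' actual pipeline:

```python
def retarget(source_frames, source_bone_len, target_bone_len):
    """Scale a captured translation channel (one (x, y, z) tuple per frame)
    by the ratio of bone lengths, so the motion fits a character of
    different proportions. Rotation channels would be copied unchanged."""
    scale = target_bone_len / source_bone_len
    return [tuple(c * scale for c in frame) for frame in source_frames]
```

For example, a hip translation captured on a 1 m-legged actor, retargeted to a 2 m-legged creature, doubles every step length and hip height so the creature's feet still plant on the ground instead of sliding. Real retargeting solvers add per-limb scaling and foot-contact constraints on top of this.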
Gagain adds that House of Moves is outfitted with 100 Vicon MX 40 motion capture cameras, which he cites as the industry's new standard for capturing detail. "We use mostly proprietary software (Diva and Vicon IQ) to process the data, but our software has plug-ins to every 3D package available," Gagain says.
Gagain notes that technology is making production easier. "Being able to motion capture and view the data, or create hand-held or stationary camera views with virtual cameras in realtime, is a huge advantage for directors," he says.
Today's technology provides artists with exceptional tools, but the idea remains first and foremost on the agenda.
"The great thing about how animation/visual effects technology has progressed is that it now allows creatives to concept a commercial or music video without limitations," comments Nimrichter of Spontaneous. "The animation and effects artists working on the projects are well aware of the current gray zones in animation where things get tricky to pull off. Because the industry as a whole has stepped up and delivered such great mind-blowing work, it has created a comfort zone with non-technical creatives who are conceptualizing music videos, commercials and film, which in turn feeds the industry with more work."
Janet Hetherington is a freelance writer and cartoonist living in Ottawa, Canada. She shares a studio with artist Ronn Sutton and a ginger cat, Heidi.