Bill Desowitz offers some highlights from last week's SIGGRAPH 2007 in San Diego, and chats with three leading figures: Doug Chiang, Joe Letteri and Kim Libreri.
Watch Animation World Network and fxguide's eight netcasts from the show in AWN's Media Center.
The good news is that the recently concluded SIGGRAPH 2007 in San Diego saw a rise in attendance over last year's conference in Boston -- roughly 24,000 vs. nearly 20,000 -- along with a slight rise in exhibitors. Though still not nearly the draw SIGGRAPH gets in L.A., where it returns next year, the vibe was intimate enough to remind people of the good old days. One exhibitor even remarked that there were enough artists among the attendees to make the show worthwhile for business.
Still, there were some grumblings about SIGGRAPH losing the power of its brand amid competition from NAB, GDC and other conferences, along with a shortage of big product launches -- besides Autodesk making its usual splash by introducing new versions of Maya, Max and MotionBuilder, and Massive touting "Massive for the Masses" with version 3.0 for Windows, which adds dynamic hair and fur as well as lanes. There was also a conspicuous lack of attendance from Europe and the Pacific Rim. Well, rest assured that the powers that be are already doing what they can to improve SIGGRAPH. In fact, they've just announced a host of upgrades and improvements that will impact submission deadlines and enhance overall attendee satisfaction, including "a content selection process that affords focus on timely industry themes rather than strictly on presentation type."
So what was new at SIGGRAPH 2007? Well, the biggest news, of course, was Autodesk's acquisition of Skymatter, maker of Mudbox 3D, filling a void with a sculpting tool that will fit right in with Maya and Max. Mudbox will offer more integrated workflows not only in entertainment but in the automotive and industrial design sectors as well. According to Skymatter's Tibor Madjar, who now joins the Autodesk Media & Ent. division along with co-founders Andrew Camenisch and Dave Cardwell, they can now "concentrate more on developing the tools instead of running the company." They will also be able to take advantage of Autodesk's IP and programmers. As for XSI users, who were concerned about the acquisition, Madjar said he assured them that they would not be abandoned.
Speaking of Softimage, the company unveiled XSI 6.5, with new production efficiency and workflow enhancements, and Face Robot 1.8, which has been enhanced for game development pipelines, including a new shape rig export system, among other things.
Not surprisingly, performance capture arguably had the largest presence at SIGGRAPH 2007. There were a host of new players, including Image Metrics, which prides itself on markerless, makeupless technology capturing "the minute details in each pixel of a particular individual's performance, allowing the actor's personality to shine through the digital character." There was also Xsens Technologies' "Moven," which captures "human motion in realtime with unprecedented life-like fluidity and ease-of-use." And Organic Motion launched "the first turnkey markerless motion capture system."
Of course, Vicon was back in full force with Blade, which "provides a single unified toolset that supports the growing demands of full performance capture, on-set visualization and makes capturing and processing motion capture data for the 3D animation pipeline simpler and more direct."
Mova was also back with its Contour Reality Capture System along with Gentle Giant Studios, unveiling a new 3D Zoetrope that uses persistence of vision to bring to life a series of 3D models of an actor's face captured live.
After the success of ILM's Imocap system on the Pirates of the Caribbean franchise and its ongoing development of a realtime facial capture system, several other prominent studios are developing their own proprietary systems for CG humans as well. However, there is no one-size-fits-all solution, so each system offers its own way of solving particular challenges.
Meanwhile, "new ideas, new solutions" was the rallying cry of Side Effects' president/founder Kim Davidson, who touted the recent launch of Houdini 9, with its redesigned UI and lower price point: "pipeline in a box." Davidson said they are now targeting boutiques without programming teams and that "users don't perceive pain as much anymore."
Luxology's upcoming modo 301 later this summer includes "comprehensive improvements to modeling, painting and rendering and the addition of two new core capabilities: sculpting and item animation." Coupled with the launch of Luxology TV, a new online hub that allows the 3D community to exchange and view high-resolution video clips on Luxology's website, the company is certainly trying to expand its reach.
On the hardware side, AMD offered five new ATI FireGL workstation graphics accelerators (FireGL V8650, FireGL V8600, FireGL V7600, FireGL V5600 and FireGL V3600) with more next-gen GPU product to come from AMD. NVIDIA introduced improved Gelato, "capable of re-lighting 60 frames of a complex scene in 60 seconds," as well as the Quadro Plex Visual Computing System (VCS) Model S4, a graphics server featuring a record number of GPUs in a standard 1U server form factor. Mental images showed off the mental mill Artist Edition breakthrough for hardware shaders, enabling artists "to develop, test and maintain shaders through an intuitive graphical user interface with realtime visual feedback -- without the need for programming skills."
Three Views: Doug Chiang, Joe Letteri and Kim Libreri
I also took the opportunity to catch up with three leading figures: Doug Chiang, production designer on the upcoming Beowulf and now EVP of Robert Zemeckis' ImageMovers Digital, a brand new studio located in San Rafael; vfx supervisor Joe Letteri, who is working on Jim Cameron's Avatar at Weta Digital; and Kim Libreri, Digital Domain's vp of advanced strategy, formerly with ILM, an architect of ESC Ent. and a key force behind all three Matrix movies.
"Part of the reason Bob Zemeckis wants to do this is to create a studio and restructure it differently from any post visual effects houses that he's ever worked with," Chiang explained. "He loves the idea of how a live-action film team works collaboratively. There are no barriers and he likes to talk to all the department heads and knows what's going on and can talk to any member of his crew. And so that is the model he is trying to go for. And in many ways we were doing that on a small scale in the art department on Beowulf."
Chiang said that ImageMovers now has a staff of more than 100, including vfx supervisor George Murphy, animation supervisor Jenn Emberly, dfx supervisor Kevin Baillie and rigging supervisor Tim Coleman. Their initial performance capture feature as part of the new distribution deal with Walt Disney Studios is A Christmas Carol: a tour de force for Jim Carrey, who performs not only Scrooge but also the three Christmas Ghosts. "It will be a brand new look for Dickens that we've never seen before," Chiang added.
Meanwhile, Paramount's Beowulf, with animation and vfx by Sony Pictures Imageworks, pushes the CG boundaries of performance capture even further than Polar Express and Monster House. The volume of data is certainly much greater for this graphic novel look, as Zemeckis attempts to cross the uncanny valley with more believable CG humans. "The process is very liberating for Bob," Chiang continued. And having characters resemble the actors performing them is not a stunt: why not utilize the physical abilities of Anthony Hopkins or Angelina Jolie? Yet Ray Winstone looks nothing like Beowulf, and performance capture helped solve the problem of aging the character as well as better handling the climactic dragon fight.
As for Avatar, they have all kinds of CG characters, including humans. "Digital doubles are a fact of life," Letteri suggested. "There is no way around that anymore. In some ways, you have a little bit more leeway with digital doubles because you tend to use them sparsely, so if you really need to put the effort into getting one shot to look absolutely perfect, you can. We're in the virtual production part of Avatar. That's going on for about two more weeks and then we're going to New Zealand for the live-action part of the shoot. We're seeing sequences as they come off the stage and it's a very interactive process, which is good because we're able to get cuts and have an idea of what decisions have to be made without actually having to guess too much."
As for performance capture, it is merely one piece of the whole CG, stereoscopic movie, according to Letteri. "The idea is to use that performance in the same way that you use a performance in a live-action film. We want it to be as much as if you're seeing it for real -- or not. That way when you're actually composing shots and you're following the actor around with a camera move, all those sorts of things you're supposed to get with the experience of shooting live action are there."
In terms of CG environments, Letteri said it's an extension of the work they did on The Lord of the Rings and King Kong, which utilized the proprietary, Maya-based CityBot software. "We had a lot of discussions right at the end of Kong with Jim Cameron and [producer] Jon Landau about what we would need to do to upgrade it. And a lot of that Kong experience is a good foundation. There's so much more of it, and we're coming up with ways to integrate everything a little bit more organically, so the process is always evolving."
What's new about the workflow on Avatar? "So much is actually being done on a stage that would normally be done by a visual effects facility, and that's because of the idea of the stage being treated more like a live-action stage. The data that's input from the camera goes directly into the production, blurring the line between production and vfx. It's much more integrated now and there's an infrastructure that has to be built up on the production side that normally you don't think of. For example, Rob Legato is doing a lot of work putting that data together during the virtual production phase. A lot of that shift is taking place where the problems that used to be specific just to a visual effects facility are now being taken on by production and vice versa because of the way the two are integrated."
As for working in stereo, Letteri said it was daunting at first until he started actually getting into it. "Now that you have stereo you have to be very true to things like the distance between your eyeballs. What seems most real to me is keeping that same distance in terms of the scale of the shot. Those sorts of things come in handy. And just editorially, what we're learning in watching the way Jim cuts sequences together is you have to really pay attention to stereo as you move from one shot to another. You have to be cognizant that what draws the attention of your eye in the frame is different from two dimensions, so that when you cut from a closeup to a wide shot, it won't be [jarring] because there's so much to look at."
And what did Libreri find cool at SIGGRAPH 2007? "High quality GPU rendering is here. It's gotten to the point that a high-end NVIDIA or ATI GPU can do pretty much everything you can do with classic RenderMan rasterizing. Fantastic! And it's thousands of times faster than traditional software-based mechanisms. Last year at DD we used Epic's Unreal engine to do a couple of commercial spots for a videogame called Gears of War -- the results were amazing. The VFX industry should be using more of this tech! And you can see all these solutions beginning to crop up. There was a whole day course on realtime rendering. George Borshukov at EA had a wonderful presentation. He was presenting a GPU version of the UCAP technique that pretty much matched the quality of what we had on The Matrix sequels, all running in realtime. He could move lights around, he could load different HDR lighting environments and it all looks really good.
"The other thing that looks cool that I've seen is a lot of great uses of measured lights and materials. Thanks to the groundwork of Paul Debevec with Image Based Lighting and the continued quest for measured BRDFs, it is possible to light and shade a CG object without having to resort to getting a shader writer to painstakingly code up material properties by hand. This year at SIGGRAPH we see the first commercial uses of this technology; we're going to see an explosion of this in the next couple of years. You'll soon have a device that you point and click to measure a sample's BRDF. It'll take at most minutes, maybe even seconds, to get a digital representation of how the sample reacts to light. And you can see the buzz of everyone going wild with the possibilities. With measured materials, we don't have to deploy an artist to write shaders -- we can concentrate on creating really cool-looking images as opposed to killing ourselves to make a photoreal CG object. Photorealism should be easy. Beautiful images are what we should be employed to do. And I think that we will see that once people start to get a handle on physically-based stuff and using hardcore ray-tracing, you're going to see a leap in both realism and creativity."
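For readers unfamiliar with the idea Libreri describes, a measured BRDF essentially turns shading into a table lookup over captured reflectance data rather than hand-written shader code. The sketch below is a hypothetical, much-simplified illustration only: a toy two-angle table of Lambertian values stands in for real captured data, and none of the function names refer to any actual product or system mentioned above.

```python
import math

# Toy sketch of a "measured material" workflow: shading driven by a
# captured table of reflectance samples instead of a hand-coded shader.
# (All names here are hypothetical; real measured BRDFs use denser,
# higher-dimensional tables.)

def make_lambertian_table(albedo=0.8, steps=16):
    """Build a toy 'measured' table; a real one would come from a capture device.
    A Lambertian BRDF is constant (albedo / pi), independent of angles."""
    return [[albedo / math.pi for _ in range(steps)] for _ in range(steps)]

def lookup_brdf(table, theta_in, theta_out):
    """Nearest-neighbor lookup into the table (angles in radians, 0..pi/2)."""
    steps = len(table)
    i = min(int(theta_in / (math.pi / 2) * steps), steps - 1)
    j = min(int(theta_out / (math.pi / 2) * steps), steps - 1)
    return table[i][j]

def shade(table, theta_in, light_intensity):
    """Reflected radiance = BRDF value * incoming radiance * cos(theta_in)."""
    return lookup_brdf(table, theta_in, theta_in) * light_intensity * math.cos(theta_in)

table = make_lambertian_table()
print(round(shade(table, 0.0, 1.0), 4))  # head-on light: 0.8/pi, prints 0.2546
```

The point of the approach is visible even in this toy: once the table is captured, an artist adjusts lights and environments, never shader code.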
And what does Libreri think of the state of the industry? "VFX is a tougher business than ever. Post-production schedules are getting shorter than they have ever been and the global competition is amazing. Thanks to technology, the real crunch of a vfx movie can happen in the last six months, and it's hard and challenging. I'm not going to complain -- that's the nature of the modern film industry. We all have to re-think our workflow all the time. Budgets are tight and we're still asked to create incredible stuff. It's as much an art as it is a business; we're all trying to think about how to balance all these modelers, painters, compositors, lighters and animators and yet come out on schedule and budget. And what happens if an edit doesn't come in on time? It's not the filmmakers' fault. It's just that schedules get compressed and everyone is really strained all through the creative process. You go into any studio -- ILM or Sony or Rhythm & Hues or us in the last few months of a production -- and it's like a war room. Nobody gets hurt, obviously, except for some egos. It's quite an incredible art form to be participating in these days."
Bill Desowitz is editor of VFXWorld.