Barbara Robertson discovers how Worlds Away uses 3ds Max to animate the new IFC series, Hopeless Pictures.
Kim Lee is the creative director and digital FX supervisor at Worlds Away in Manhattan, where a 22-person team creates IFC's animated series, Hopeless Pictures. Ten animators working in Autodesk 3ds Max crank out one 15-minute episode each week. The pilot episode debuted Aug. 19; the first of eight episodes created at Worlds Away debuted Aug. 26. Renowned character actor Bob Balaban directs the series.
Barbara Robertson: How did you become involved in the Hopeless Pictures project?
Kim Lee: The producers, Trigger Street Independent, who have an office across the hall from us, were in the process of putting a pilot together in Flash. When they asked if we'd become a liaison between the director and the Flash animation studio, we suggested putting a team together to create storyboards and animatics. While we were putting bids together for that, they asked us to bid everything. We said that if we did it, we wouldn't want to work in Flash; that we could get the same look using the original artwork, but we wanted the flexibility of having the 3D camera. They went for it. The director was very interested in having the flexibility to do slight moves, to have a rack focus. And the producers liked that we wouldn't have to redraw every pose.
We aren't a Flash shop. We knew Max would give us so much more flexibility. If we were creating one-off characters, 3D wouldn't make sense because building them takes too much time, but for series work it becomes cost effective.
BR: Did that mean the look of the series changed?
KL: No. In this production, there are no 3D models like you traditionally think of 3D models. We're making flat, two-dimensional puppets that live in 3D: a bunch of flat planes, like pieces of cardboard, that are all linked together and texture mapped.
The original character was designed by Brian Smith. We were given all his artwork and the artwork generated in Flash for the pilot, and we were able to repurpose it.
We wanted to maintain the style of the original artwork. That meant we couldn't do flat-shaded characters like South Park. And we couldn't see it being feasible to write a shader that reproduces paint strokes. So, we're just making the original artwork move. Now, our artists know [Smith's] style so well they can create characters.
BR: Was it difficult for animators to make the transition from 3D to this style of animating?
KL: If you can animate, we can teach you how to do this in a day.
BR: OK, so how does it work?
KL: If you were to look at the characters, you'd see flat, two-dimensional puppets that live in 3D. We have eight versions of each main character: front, three-quarters front right, right side and so forth, all the way around; eight separate models from different perspectives on flat planes. The planes are transparent where you don't have artwork.
We don't have a turntable. All the views except one are off camera. When an animator activates a slider, the models swap out seamlessly, the texture maps change, and that gives the illusion that the character is turning. All the characters are the same scale, so they can be merged into any scene. Every character works the same way.
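The swap Lee describes amounts to a slider notch selecting one of eight "card" models to keep on camera. A minimal Python sketch of that logic, with view names and function names invented for illustration (the actual rig lives in 3ds Max, not in code like this):

```python
# Illustrative sketch, not the studio's actual rig: a slider notch picks
# which of the eight flat view models is on camera; the other seven stay
# off camera, giving the illusion that the character turns.

VIEWS = [
    "front", "three_quarter_front_right", "right", "three_quarter_back_right",
    "back", "three_quarter_back_left", "left", "three_quarter_front_left",
]

def visible_view(notch: int) -> str:
    """Map a slider notch (0-7) to the single view model shown on camera."""
    if not 0 <= notch < len(VIEWS):
        raise ValueError(f"notch must be between 0 and {len(VIEWS) - 1}")
    return VIEWS[notch]

def swap_views(notch: int) -> dict:
    """Return a visibility flag for every view model; exactly one is True."""
    return {view: (i == notch) for i, view in enumerate(VIEWS)}
```

Because every character uses the same eight notches at the same scale, any character can be dropped into any scene and animated the same way.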
BR: How do they animate body parts like arms?
KL: For the arms, we have long, thin planes with arm artwork on them. One end of the plane is attached with a pivot point to the shoulder. So to have a character wave up and down, you could grab the pivot point and rotate the plane. We also have a bend modifier in the middle of the plane so the elbow can bend. And you can move parts forward or backward from camera view to make it look like the characters' body parts are interacting.
When an arm is animated in one view, the animation gets instanced across all the other versions of the character automatically.
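One way to picture that instancing, sketched in Python under the assumption that every view's arm plane shares a single animation track (class and attribute names here are hypothetical, not Max's API):

```python
# Hypothetical sketch of instanced animation: all view models reference
# one shared rotation track, so keying the arm in any one view
# automatically keys it in every other view.

class RotationTrack:
    """A shared keyframe track: frame number -> arm rotation in degrees."""
    def __init__(self):
        self.keys = {}

    def set_key(self, frame, degrees):
        self.keys[frame] = degrees

class ArmPlane:
    """A flat arm 'card' pivoting at the shoulder; its track is instanced."""
    def __init__(self, track):
        self.track = track  # the same object across every view model

shared = RotationTrack()
views = {name: ArmPlane(shared) for name in ("front", "right", "back", "left")}

# Key a wave in the front view...
views["front"].track.set_key(frame=10, degrees=45)
# ...and the back view sees the same key, because the track is shared.
```

The design choice is the point: because the track is one object rather than four copies, nothing has to synchronize the views after the fact.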
BR: What about facial animation?
KL: We have one piece of geometry and a slider that changes which picture is shown. To make the mouth move, let's say you want eight mouth shapes. We have one plane with a mouth on it, the neutral mouth. In Max, we create a multi/sub-object material. It's basically a material made of multiple materials. So what we do is hook the multi/sub-object materials to a slider. When the animator picks the notch for material one or two, all the way to eight, it swaps pictures in and out. We use the same mouth shapes for all the characters. Notch number three is always the O shape, no matter which character you're animating.
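Conceptually, the multi/sub-object material behaves like a lookup table indexed by the slider notch. A sketch of that idea in Python; only notch three ("O") comes from the interview, and the other shape names are placeholders:

```python
# Sketch of the mouth swap: eight sub-materials on one plane, with a
# slider notch picking which picture is shown. Notch 3 is the "O" shape
# per the interview; the remaining labels are invented for illustration.

MOUTH_SHAPES = {
    1: "neutral", 2: "closed",  3: "O",    4: "wide",
    5: "smile",   6: "F_V",     7: "L",    8: "W_Q",
}

def mouth_texture(notch: int) -> str:
    """Return the sub-material shown for a slider notch (1-8).

    The same table serves every character, so a given notch always
    means the same mouth shape no matter who is being animated."""
    return MOUTH_SHAPES[notch]
```

Keeping one shared table for all characters is what lets an animator move between characters without relearning the lip-sync controls.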
BR: And the backgrounds?
KL: The backgrounds are done using a traditional multi-plane thought process. The art is mostly created by Max Ehrlich and Perry Gargano, with everything broken out in Photoshop layers, including the props like desks and things like that. Max does the backgrounds and Perry does everything else. Sometimes the camera travels with a car and we put mountains in layers in the background. In other shots, we use the artwork for the road in a more traditional 3D sense by texture mapping it onto a plane.
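The multi-plane effect reduces to layers at different depths shifting at different rates as the camera travels. A toy sketch of that parallax relationship, with depth values made up purely for illustration:

```python
# Toy parallax sketch of a multi-plane background: layers farther from
# camera shift less per unit of camera travel, so distant mountains
# drift slowly while the near road whips past. Depths are arbitrary.

def layer_shift(camera_x: float, depth: float) -> float:
    """Screen-space shift of a layer at a given depth (depth >= 1)."""
    return camera_x / depth

camera_x = 100.0
road_shift = layer_shift(camera_x, depth=1.0)        # moves with the camera
mountain_shift = layer_shift(camera_x, depth=10.0)   # drifts slowly behind
```

This is the same depth-versus-speed trick the classic multi-plane camera stands exploited, here done for free by placing the layers at different distances from the 3D camera.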
BR: Do you render the scenes in Max?
KL: Yes, we use Max's default scanline renderer. We do what most people call self-illumination renders: There are no lights. We render almost as fast as we can get shots into the queue. We just had to re-render a scene with a monkey and a couple having sex, and it took five minutes. We don't do any atmospheric effects, even though we could. If there's a season two, we might try to mix in some of that stuff.
BR: So, could you quickly run through the whole process?
KL: For characters, first the artwork is generated. Then the different views for the characters are generated. They go to the rigging department, which maps the artwork onto planes for the characters. Then the characters are saved out in files.
For episodes, we come up with the boards and once they're approved, we bring them into an Avid Xpress Pro at full uncompressed NTSC and our editor times the whole thing out. Once we have an animatic approved by the director, it becomes the blueprint for all the departments.
Then, layout creates a 3ds Max file for the shot with the exact duration, and brings in the audio, the backgrounds, the props and the correct versions of the characters. We don't want our animators to have to think about anything except animation. They open a file and the sound is set up already. They don't have to find anything on the server; it's all right there.
Once the characters are animated, we render out screen-capture previews and sit down and do a daily kind of thing. When the shot is approved, it goes to rendering. When it's rendered, it's brought back into edit, where it replaces the storyboard. That's sent to audio for a final mix and sound effects.
BR: Which tools are you using?
KL: For animatics, we use Avid. For artwork, Photoshop. The storyboards are drawn by hand and scanned in. The rigging is in Max and the layout department works in Max. Pretty much everything from layout on happens in Max. For a very small percentage of shots, we use Combustion or Smoke; out of the 200 or 300 shots in a given episode, maybe 20 need compositing. Almost everything happens in camera.
BR: Would you want to do another season of episodes?
KL: We're hoping, hoping we can. It's really funny stuff. Between the writing and the voice talent, it's like a Hollywood cast [with the likes of Balaban, Jennifer Coolidge, Paul Dooley, Nora Ephron, Lisa Kudrow, Michael McKean, Martin Mull, Rob Reiner and Paul Reubens].
Barbara Robertson is an award-winning journalist who has covered visual effects and computer animation for 15 years. She also co-founded the dog photography website www.dogpixandflix.com. Her most recent travel essay appears in the new Travelers' Tales anthology The Thong Also Rises.