Patrick Murphy Talks Annoying Orange
For Dane Boedigheimer, we have recorded every expression known to mankind, with tracks stabilized, rotoscoped out and color corrected, ready to drop on top of our character assets. That makes it really nice for editors, for example, to insert a cutaway shot where all the characters are reacting to something Nerville says.
The benefit is that once the editor makes those selects, those shots are complete. There is nothing else for us to do. They are part of a group of library assets that are already pre-positioned and essentially pre-composited. An editor can go ahead and select a background of all the different cart options, take the templates for the characters, drop them on top of the cart and then make selections for the expressions he wants them to have. That represents as much as 10 to 15% of a show for us on the animation side. It’s considered done.
Next we have all the dialogue, which changes from episode to episode. The nice thing about the Adobe tools is that since the show is shot with RED cameras, there is no transcoding required. Prior to using RED, we’d done tests and the transcoding was prohibitive, primarily because the directors like to use long takes. Some takes are 20 minutes long. There is a lot of ad-libbing. With Toby Turner, who plays Nerville, we’ve been doing 12-minute takes on set where he just rambles on, and on and on. Trying to transcode all that material would never happen. So the ability of the Adobe tools to use RED camera footage natively has been a huge plus for us.
Recently, I’ve been looking very seriously at using Prelude, because when you have a 12-minute take, how do you identify which part the director likes? He may say he likes something somewhere around minute seven. But by the time you get to post, if you have editors who now have to go through and watch all of that material, it becomes extremely time-consuming. Once again, if you can tag things with metadata, there is a huge benefit. It allows the directors to freeform and freestyle, whether in a video session or on set, while still providing the structure that post needs to run efficiently.
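The payoff of that kind of on-set tagging can be sketched in a few lines. This is a hypothetical example, not the show's actual tooling: it assumes marker data has been exported from Prelude as a simple CSV of take name, start time, and comment, and filters out just the takes the director flagged, so editors don't have to scrub a 12-minute ramble to find minute seven.

```python
import csv
import io

# Hypothetical marker export. Prelude's real export formats vary, so this
# sketch assumes a plain CSV of (take, start_seconds, comment) for clarity.
MARKER_CSV = """take,start_seconds,comment
A014_C002,412,"director: love this ramble, around minute seven"
A014_C002,655,ad-lib about the cart
A014_C003,90,flubbed line
"""

def director_selects(csv_text, keyword="director"):
    """Return only the markers whose comment was flagged by the director."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [(r["take"], int(r["start_seconds"]), r["comment"])
            for r in rows if keyword in r["comment"].lower()]

for take, secs, note in director_selects(MARKER_CSV):
    # Print a human-readable cue sheet entry for the editors.
    print(f"{take} @ {secs // 60}m{secs % 60:02d}s: {note}")
```

The same filtering idea extends to any field the loggers annotate on set: scene, character, usable/unusable, and so on.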
We use After Effects for the animation and compositing. Premiere is our offline and online editing tool. We use dynamic linking extensively and have a whole suite of scripts, written both internally for After Effects and externally for Premiere, that manage all of our shot creation as well as getting shots and dropping them back into the edit. Those are all automated and require no human interaction whatsoever. Once my artists are done with a shot, it gets dropped into the edit automatically. They can walk over to Premiere, hit Play and watch their shot in the context of the edit. It doesn’t involve anyone: I don’t have to get an assistant to copy shots over, and I don’t have to get an assistant to drop them into the edit. It’s all transparent.
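The hands-off handoff described above can be illustrated with a minimal sketch. This is an assumption about how such a conform script might work, not the production's actual code: finished renders land in a drop folder, and the script copies each one into the edit's shot directory, replacing any older version of the same shot so Premiere's linked media always shows the latest take. The folder layout and `EP12_SH040_v003.mov`-style naming are hypothetical.

```python
import shutil
from pathlib import Path

def conform_finished_shots(drop, shots):
    """Copy newly rendered shots from a drop folder into the edit's shot folder.

    A file like EP12_SH040_v003.mov replaces any older EP12_SH040_v*.mov,
    so the edit always relinks to the newest version of each shot.
    """
    shots.mkdir(parents=True, exist_ok=True)
    conformed = []
    for clip in sorted(Path(drop).glob("*.mov")):
        shot_id = "_".join(clip.stem.split("_")[:2])  # e.g. "EP12_SH040"
        # Remove older versions of this shot before copying the new one in.
        for old in Path(shots).glob(f"{shot_id}_v*.mov"):
            old.unlink()
        shutil.copy2(clip, Path(shots) / clip.name)
        conformed.append(clip.name)
    return conformed
```

In practice a script like this would run on a timer or a filesystem watcher, which is what makes the handoff invisible to the artists.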
DS: As you describe it, you’ve built an extremely integrated pipeline handling many varied facets of the show’s production.
PM: It’s all interlinked. The big thing on the show is that although I’m the VFX Supervisor and Animation Director, I am sitting in on meetings with the writers. The nice thing about the EPs on the show is that they want to know if something’s feasible. Is this episode producible? And we have the ability to influence those decisions. My goal is not to change the story, but to facilitate it by saying, “Well, does the voice of the story stay intact if we do it this way instead?”
DS: Your pipeline carries the director’s and other key people’s inputs all along the way. Editorial decisions at that level impact every other piece of the production from that point forward.
PM: Exactly. Every decision made before you has an almost exponential impact on the people downstream of you.
DS: Right. Back to the tools you use…
PM: Sure. I would say that LightWave and Cinema 4D were not part of Season 1, but they will be part of Season 2. As far as specific tools, yeah, we use Story extensively. I'm a huge fan of Adobe Story. I think it’s the only product out there right now that fully integrates the scripting and screenwriting side with the scheduling side. I have EP Movie Magic, EP Budgeting and EP Scheduling, but those are all discrete products that require exporting and importing for integration, and quite honestly I don’t have time for that.
PM: So starting with Story is really important. All key personnel from every department are accessing the Story project. We’re now using Prelude onset for our logging and to annotate certain takes from our directors. Then we have that main media delivered to Kappa, where our editors and assistants put together rough assemblies in Premiere.
We also have all this voiceover footage coming in, these eyes and mouths. All that material needs to be prepped for the animators. By prepped, I mean stabilized and rotoscoped out. We use mocha through After Effects. The After Effects Warp Stabilizer does a phenomenal job of keeping people’s heads still. Many of the actors and comedians gesticulate a lot in front of the camera. We want them to have that freedom because it affects their performance, but in the end their heads need to be locked down. So the Warp Stabilizer has been phenomenal. Then we use the MochaImport script in After Effects to get all our masks. We do that for every single character. For every show there are roughly 40,000 frames that have to be prepped. We have two people taking care of all that. Though they scream through it, this year it’s one area I feel needs to be automated. It’s an arduous, tedious task that is begging for computational analysis and execution.
DS: It’s begging for change.
PM: Begging for change indeed. Currently we have to wait until there’s a storyboard cut before we tackle it. Otherwise they’d be doing 97,000 frames, and that’s prohibitive. Right now there’s a short window between the edit and delivery to the animators. These guys have to crank out roto and stabilization on 47,000 frames for every single episode. So I would like to alleviate that stress, take those people, move them elsewhere within the production process and be able to do more. So why don’t we write something that handles all of the raw footage before we even have to edit and wait?
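One way to sketch the "prep everything before the edit" idea is to queue an unattended After Effects render for every raw clip using `aerender`, the command-line renderer that ships with After Effects. Everything else here is hypothetical: the template project name, the prep comp name, and the folder layout are assumptions for illustration, and in a real pipeline an ExtendScript step (elided here) would swap each clip into the template before rendering.

```python
from pathlib import Path

# Hypothetical template project containing a "stabilize + roto" prep comp.
TEMPLATE_PROJECT = "prep_template.aep"
PREP_COMP = "stabilize_roto"

def build_prep_jobs(raw_clips, out_dir="prepped"):
    """Return one aerender command (as an argument list) per raw clip.

    The commands are only built here, not executed; a render farm or a
    simple subprocess loop would run them overnight, before editorial
    even touches the footage.
    """
    jobs = []
    for clip in raw_clips:
        out = str(Path(out_dir) / (Path(clip).stem + "_prepped.mov"))
        jobs.append([
            "aerender",
            "-project", TEMPLATE_PROJECT,
            "-comp", PREP_COMP,
            "-output", out,
        ])
    return jobs

jobs = build_prep_jobs(["raw/A014_C002.r3d", "raw/A014_C003.r3d"])
```

The trade-off is render time versus artist time: prepping all 97,000 raw frames is machine work done in parallel overnight, instead of two people racing through the 40,000-odd selected frames in the gap between edit lock and animation handoff.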
I mentioned cost earlier, and the idea of being able to quantify all the resources required for a show. Artists want to create content that they’re proud of, but not all artists understand that ultimately there’s a dollar sign behind how much time they spend on a particular shot. For an artist, I think that needs to be transparent. They need to feel as though they have the opportunity to do the best work possible. So by eliminating all the redundancies and tedious little tasks, by sharing information in the most efficient way, you free up a lot of time and give artists more breathing room. To me that’s particularly important.