
Patrick Murphy Talks 'Annoying Orange'

The seasoned VFX vet shares how an integrated pipeline built upon Adobe software brings greater creative freedom and efficiency to the production.

At work on Cartoon Network's Annoying Orange. All images courtesy of Kappa Studios.

From web short sensation to hit Cartoon Network TV show, Dane Boedigheimer’s Annoying Orange may be blazing a trail for the future of topical, social-network-driven television production. Like South Park, Annoying Orange has always been driven by the immediacy of fan reaction to characters and stories, as well as by reactions to real-time events in contemporary culture. Unlike South Park, Annoying Orange blends live action with animation, which requires an even more sophisticated integration of production disciplines to achieve fast, cost-effective, high-quality episode turnaround. And while complete shows are not yet turned around in a single week, that production goal is not outside the realm of possibility given the direction the production is currently headed.

One of the key artists helping design, build and refine the show’s production pipeline is VFX Supervisor and Animation Director Patrick Murphy, who, along with other talented personnel at Burbank post-production house Kappa Studios, is currently at work producing 40 episodes for the show’s second season. By building an integrated in-house production pipeline within Kappa, based on key Adobe technology, Patrick has enabled the show’s writers and producers to exercise tremendous creative freedom, free from many of the common, costly technological constraints that make this type of production so difficult to sustain. Patrick’s background as a senior visual effects artist on films like Avatar, Clash of the Titans and Prometheus has taught him many important lessons about reining in production costs and implementing proven tech tools that let artists be creative rather than waste valuable time grappling with inefficient pipeline tools and procedures. I recently had a chance to talk to Patrick about his work on the show, how he has integrated Adobe’s Creative Suite 6 and other tools into his pipeline, and how his experience has helped producers make a less expensive and better-looking show.

Patrick Murphy.

Dan Sarto: Can you tell me a bit about the show’s production? How are you and Kappa Studios involved?

Patrick Murphy: Annoying Orange originated on the Internet as a YouTube sensation. And because of that, one of the things Dane Boedigheimer, the show’s creator, did regularly was create and craft the content of the show in response to what viewers were saying about previous episodes. He set up this social media interaction dynamic that Cartoon Network wanted to continue for the television series.

What that also meant was that a lot of the kinks in the pipeline for creating the web series had already been worked out. The goal here wasn’t so much to reinvent the wheel, but to enhance it and create a larger world for Annoying Orange and all the other characters. So from the artistic side, those were two of the parameters we had to meet to be successful. Of course, we also had to look at production schedules. For TV, they are of course longer [than for a webisode]. But that also means the producers want greater production value. So the biggest thing, when you look at all the show’s different requirements, is whether or not we have the infrastructure required to pull it off.

One thing that’s particularly interesting about Orange, and probably the most obvious, is that we have live-action mouths composited on top of art assets or photography. This is very similar to a show from the ’60s called Clutch Cargo, which used a technique called Syncro-Vox. Syncro-Vox was really just an optical printing process designed by a guy named Edwin Gillette where, I believe, they took actors’ mouths and put them on top of traditional cel animation. Whether on purpose or not, Dane was essentially mimicking a process born out of the need to produce a show on a very small budget in a short amount of time.

So using the parameters the executive producers gave us, looking at the entire show, I asked the question, “What’s the best way to pull this off?” And the first answer was that you want to have flexibility. But flexibility in this particular instance was not just about the animation. It was about an entire set of production resources, from scripting to pre-production, the on-set shoot, the initial post-production, animation, visual effects, editorial, color correction, sound, the whole gamut of services required. The reason we needed such flexibility was that at its core, the show is supposed to grow and change in response to the opinions of the viewers.

Live action mouths are composited on top of art assets or photographs.

So, Kappa Studios in particular is poised to offer all of those services in one turnkey solution. I know that sounds like some kind of cheesy marketing piece, but it’s true. We have camera rentals here, we have offline editorial, online services, color correction, duplication, even third-party QC in the same building, which is really, really helpful when it comes down to the wire on delivering a show.

Basically everything is in one house.  So for the producers as well as myself, it allows for a truly organic type of workflow.  When I say “organic,” I mean organic in the sense that animators are getting up, walking away from their desks, going into an edit, talking with an editor, possibly changing the edit, getting it approved by the producers who are across the hall and suddenly, we are off running in a new direction.

I don’t think you can get that organic responsiveness anywhere out there.  Everything requires phone calls, emails, change orders, all that other stuff.  We need to do away with all of that.  Throw all of that away and really have what I call an organic and flexible type of workflow that can respond to the needs of the client, the demands of the network or the demands of S&P, whatever the issue may be.  So that’s what we’ve been building with this pipeline.  We wanted to make sure that we had access to every single aspect of production throughout the production schedule.  That’s one thing that Kappa has really done phenomenally well and that’s ultimately why we’re back here again for Season 2.

DS: How many episodes are you doing for the new season? How big is your crew?

PM:  For Season 1, we did a total of 30 episodes.  On Season 2, we are doing roughly 40 episodes.  On those 30 episodes, every single one was animated in six days.  Animated and composited.  Of course the overall production schedule for a given episode was longer.  There is the live action shoot and the editorial, but the animation and visual effects department would have each episode for a total of about six days.  When I first got on board, it was very evident that the EPs ultimately wanted to have shows that were always topical and dealt with things that were trending in pop culture.  The last episode we did for Season 1 was called The Generic Holiday Special.  They had Weird Al Yankovic, Bret Michaels, Alice Cooper and Maria Menounos.  All these guest stars came in and we never had a script for that show.  I’m a big fan and a big proponent of having scripts, doing breakdowns, analyzing where we can institute efficiencies.  But that was the episode where they broke the mold.  I saw that was probably going to be a trend because the goal there was to see how organic we really could be and how quickly we could respond to whatever was happening on social media.  That’s potentially becoming the template for Season 2.

Igor Ridanovic, Kappa’s lead colorist, working on the show.

So in response to that you might think we’d want to hire a bunch of people, bring on as many resources as possible on a job with so many production variables. But my experience has shown that’s actually counterproductive. I’ve got five animators, one visual effects editor, two traditional editors, one editing assistant, one color correction person, one sound guy. That’s pretty much the post side of Annoying Orange. The five animators are people whose skill sets cut across many different disciplines. Not only are they capable of compositing and animating, but a few of them also have abilities with 3D packages like Cinema 4D and LightWave. On any given day they could be compositing, animating or creating 3D models, whatever the case might be. I truly have a set of artists who are jacks of all trades. Big facilities tend to have people who specialize. There’s someone who puts effects on a shot, or who will light a shot. There’s another guy who is in charge of rendering it. I really like people who come from smaller shops, who have always done their own rotoing and tracking and compositing and color correction, because those are the skills, for example on this show, that we value highly. So everyone here can do multiple tasks. They do them well and they do them extremely fast.

DS: That’s an unbelievably small crew, I mean for…

PM:  Yes, it is.

DS: …that large of a production.

PM: Yeah. When I’m on set, there are probably 30 people there.

DS: Tell me about your production process, the tools you use, this integrated pipeline you’ve built that can animate an episode in roughly a week.

PM:  What I have found to be extremely valuable is that whenever possible I start at the script phase.  We use the Adobe product line throughout the entire process.  Part of that is because in order to facilitate the needs of the show and the constantly changing nature of any particular episode, sharing information is paramount.  Also, creating automation wherever automation can be created is extremely important because it frees up your artists and creative people to do what it is they were hired to do…be creative.

Traditionally, creating shots and building projects for the compositors and animators has been a task you assign to an assistant or a junior person. It’s arduous and takes forever. Things like that, without a doubt, should be automated. In the past, I have always hired programmers to do that. The problem is that a fair amount of investment has to be made before you see any return.

Igor's work is the last stop of the pipeline before digital delivery.

A lot of companies don’t initially see that being a benefit or aiding their bottom line. But in reality, it can mean about a 35% savings when you automate a lot of these traditional administrative processes. Back in 2002 I was working with Pixel Magic. Out of curiosity, we did our own internal study of how much time people were spending creating file names and directories and submitting renders with all the information the render farm needed. We found that 35% of our compositors’ time was spent doing those admin-type tasks. So I have been an extremely strong proponent of automation, scripting and programming as part of the post-production process. Traditionally that required a programmer. However, Adobe is one of the first to start lowering the level of knowledge needed to accomplish those tasks, as well as to have integration across an entire product line. I’m not just talking about opening a Photoshop file in any one of their products. I’m talking about embedded metadata.
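As a rough illustration of the kind of admin automation Patrick is describing, a short script can own shot naming and directory creation end to end so artists never type paths by hand. The naming scheme and folder layout below are invented for this sketch; they are not Kappa's actual conventions:

```python
import os

def make_shot_dirs(root, show, episode, shot, version=1):
    """Build a standardized shot directory tree and return the shot's base name.

    The zero-padded naming pattern (hypothetical) keeps shots sortable and
    lets downstream tools parse episode/shot/version back out of a filename.
    """
    base = f"{show}_ep{episode:03d}_sh{shot:04d}_v{version:03d}"
    for sub in ("plates", "comps", "renders", "reference"):
        # exist_ok lets the script be re-run safely when a version bumps
        os.makedirs(os.path.join(root, base, sub), exist_ok=True)
    return base

if __name__ == "__main__":
    name = make_shot_dirs("/tmp/orange_demo", "AO", 12, 40)
    print(name)  # AO_ep012_sh0040_v001
```

A scheme like this is trivial, but it is exactly the repetitive bookkeeping that the study Patrick cites found eating a third of compositors' time.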

So for us, it starts with Adobe Story. Our scripts are broken down and tagged. We tag particular items in the script as being visual effects related, production related, prop department related, etc. For me, it gives an idea of who is responsible for what.

Subsequently, we meet and talk about what the prop department should be building or whether a visual effect should be used. That allows us to focus all of our different talents appropriately. For example, we may realize very quickly that we don’t need to do makeup on Toby [Turner] that will take three hours in the middle of a production day. We might be able to accomplish a similar task with visual effects. Though it may take one artist three days [in visual effects], it’s not going to force 30 people on set to sit idle for three hours. When you start running the numbers, that doesn’t make any sense.

We use those breakdowns in Adobe Story to define every department’s tasks and responsibilities in producing that episode. In addition, there is other data we can extrapolate from the scripts. My big thing is that I want that data carried through the entire production process so you don’t have to rely on a paper trail or have an assistant who’s always grabbing files. That doesn’t make any sense. By embedding XML data into clips or sidecar files and giving editors access to that information, we create a much more conducive information pipeline. That is one of the underpinnings of the entire production process.
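A minimal sketch of the sidecar idea Patrick describes: breakdown tags from the script travel next to the media as a small XML file instead of on paper. The element names and departments below are invented for illustration, not Kappa's actual schema:

```python
import xml.etree.ElementTree as ET

def write_sidecar(clip_path, tags):
    """Write clip_path + '.xml' carrying one <tag> element per department."""
    root = ET.Element("clip", name=clip_path)
    for department, note in tags.items():
        ET.SubElement(root, "tag", department=department).text = note
    sidecar = clip_path + ".xml"
    ET.ElementTree(root).write(sidecar, encoding="utf-8", xml_declaration=True)
    return sidecar

def read_sidecar(sidecar):
    """Return the {department: note} mapping back out of a sidecar file."""
    return {el.get("department"): el.text
            for el in ET.parse(sidecar).getroot().iter("tag")}
```

An editorial-facing tool can read the same file back, so the breakdown survives from pre-production into post without anyone re-keying it.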

Patrick is always trying to find production consistencies that can be standardized and put into libraries that editors can easily slip right into an episode.

The next piece, if you look at the breakdowns, is that we know the characters are always going to be on a cart. There are always establishing shots of the cart. We know that Orange is always going to do a motorboat where he goes, “Motorboat blblblblbbl!” We know that Grapefruit always gets angry. So we always try to find consistencies, whether character based, story based or environment based. Those become areas where we want to create standardization. So we started building libraries. We have a huge, growing library that consists of the store environment, which is where most of the Orange story takes place, cart shots from every conceivable angle, as well as mouths and eyes shot on the RED camera.

For Dane Boedigheimer, we have recorded every expression known to mankind, with tracks stabilized, rotoscoped out and color corrected, ready to drop on top of our character assets.  That makes it really nice for editors, for example, to insert a cutaway shot where all the characters are reacting to something Nerville says.

The benefit is that once the editor makes those selects, those shots are complete. There is nothing else for us to do. They are part of a group of library assets that are already pre-positioned and essentially pre-composited. An editor can go ahead and select a background from all the different cart options, take the templates for the characters, drop them on top of the cart and then make selections for the expressions he wants them to have. That represents as much as 10 to 15% of a show for us on the animation side. It’s considered done.

Next we have all the dialogue, which changes from episode to episode. The nice thing about the Adobe line is that since the show is shot with RED cameras, there is no transcoding required. Prior to using RED, we’d done tests and the transcoding was prohibitive, primarily because the directors like to use long takes. Some takes are 20 minutes long. There is a lot of ad-libbing. With Toby Turner, who plays Nerville, we’ve been doing 12-minute takes on set where he just rambles on, and on and on. Trying to transcode all that material would never happen. So the ability of the Adobe tools to use RED camera footage natively has been a huge plus for us.

Recently, I’ve been looking very strongly into using Prelude, because when you have a 12-minute take, how do you identify what part the director likes? He may say he likes somewhere around minute seven. But by the time you get to post, if editors have to go through and watch all of that material, it becomes extremely time-consuming. Once again, if you can tag things with metadata, there is a huge benefit. It allows the directors to free-form and freestyle, whether in a video session or on set, but still have the structure required for post to be handled efficiently.

We use After Effects for the animation and compositing. Premiere is our offline and online editing tool. We use Dynamic Link extensively and have a whole suite of scripts, written both internally for After Effects and externally for Premiere, that manage all of our shot creation as well as getting shots and dropping them back into the edit. Those are all automated and require no human interaction whatsoever. Once my artists are done with a shot, it gets dropped into the edit automatically. They can walk over to Premiere, hit Play and watch their shot in the context of the edit. It doesn’t involve anyone; I don’t have to get an assistant to copy shots over or drop them into the edit. It’s all transparent.
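Here is a Python sketch of the idea behind that round-trip automation (Kappa's real version lives in scripts for After Effects and Premiere). When a shot finishes rendering, the newest version is copied into the folder the edit points at under a fixed name, so the editor just hits Play; the paths, naming and `.mov` extension here are assumptions:

```python
import shutil
from pathlib import Path

def sync_renders(render_dir, edit_dir):
    """Copy the latest version of each rendered shot into the edit folder."""
    edit = Path(edit_dir)
    edit.mkdir(parents=True, exist_ok=True)
    latest = {}
    for clip in sorted(Path(render_dir).glob("*_v*.mov")):
        shot = clip.stem.rsplit("_v", 1)[0]   # "sh0040_v003" -> "sh0040"
        latest[shot] = clip                   # sorted(): last seen is newest
    for shot, clip in latest.items():
        # Fixed output name means the edit never needs to be relinked when
        # a new version lands; the file underneath it just updates.
        shutil.copy2(clip, edit / f"{shot}.mov")
    return sorted(latest)
```

Run on a timer or from a render-complete hook, a loop like this is what lets a new version appear in the cut without an assistant touching anything.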

DS: As you describe it, you’ve built an extremely integrated pipeline handling many varied facets of the show’s production.

PM:  It’s all interlinked.  The big thing on the show is that although I’m the VFX Supervisor and Animation Director, I am sitting in on meetings with the writers.  The nice thing about the EPs on the show is that they want to know if something’s feasible.  Is this episode producible?  And we have the ability to influence those decisions.  My goal is not to change the story, but to facilitate it by saying, “Well, does the voice of the story stay intact if we do it this way instead?”

DS: Your pipeline carries the director’s and other key people’s inputs all along the way. Editorial decisions at that level impact every other piece of the production from that point forward.

PM: Exactly. Every decision made before you has an almost exponential impact on the people after you.

Adobe® After Effects® CS6. Image courtesy of Adobe.

DS: Right. Back to the tools you use…

PM:  Sure.  I would say that Lightwave and Cinema 4D were not part of Season 1, but they will be part of Season 2.  As far as specific tools, yeah, we use Story extensively.  I'm a huge fan of Adobe Story.  I think it’s the only product out there right now that fully integrates the scripting and screenwriting side with the scheduling side.  I have EP Movie Magic, EP Budgeting and EP Scheduling, but those are all discrete products that require exporting and importing for integration, and quite honestly I don’t have time for that.

PM: So starting with Story is really important. All key personnel from every department access the Story project. We’re now using Prelude on set for our logging and to annotate certain takes from our directors. Then we have that main media delivered to Kappa, where our editors and assistants put together rough assemblies in Premiere.

We also have all this voiceover data coming in, these eyes and mouths. All that stuff needs to be prepped for the animators. By prepped, I mean stabilized and rotoscoped out. We use mocha through After Effects. The After Effects Warp Stabilizer does a phenomenal job of keeping people’s heads still. Many of the actors and comedians gesticulate a lot in front of the camera. We want them to have that freedom because it affects their performance. But in the end their head needs to be locked down. So the Warp Stabilizer has been phenomenal. Then we use the MochaImport script in After Effects to get all our masks. We do that for every single character. For every show there are roughly 40,000 frames that have to get prepped. We have two people take care of all that. Though they scream through it, this year it’s one area I feel needs to be automated. It’s an arduous, tedious task that is begging for computational analysis and execution.

DS: It’s begging for change.

PM: Begging for change indeed. Currently we have to wait until there’s a storyboard cut before we tackle it. Otherwise they’d be doing 97,000 frames and that’s prohibitive. Right now there’s a short window between the edit and delivery to the animators. These guys have to crank out 47,000 frames of roto and stabilization on every single episode. So I would like to alleviate that stress, take those people, move them elsewhere within the production process and be able to do more. So why don’t we write something that does it for all of the raw footage before we even have to edit and wait?

I mentioned cost earlier and the idea of being able to quantify all the resources required for a show. Artists want to create content that they’re proud of. Not all artists understand that ultimately there’s a dollar sign behind how much time they spend on a particular shot. For an artist, I think that needs to be transparent. They need to feel as though they have the opportunity to do the best work possible. So by alleviating all the redundancies, the tedious little tasks, by basically trying to share information in the most efficient way, it frees up a lot of time and gives artists more breathing room. To me that’s particularly important.

Kappa Studios in Burbank, California.

DS: You’ve had an interesting career path, an interesting trajectory that’s taken you from heavy-duty feature film visual effects to Annoying Orange.

PM: You’re right. When they approached me I had this moment of pause where I thought, “Do I really want to work on this show?” I was thinking, “Man, really, do I want to go from finishing up on Prometheus to working on Annoying Orange?” Most visual effects guys are all about chasing titles. I understand that. It’s kind of cool to see your name six feet long, all luminous, at the end of the latest, greatest visual effects film. But my approach over the last five years has been: how do I get myself out of the service side of things, because that is a race to the bottom. I want to know more about the entire process of the business. I’ve always been looking for opportunities that allow me more exposure to those areas on a particular production. I get that on this show.

To me, it’s been a phenomenal opportunity to once again play around with traditional pipelines and procedures for creating content, turn them on their head a little bit so that we can do it differently in a more cost-effective way. Like I said in the beginning, the key is flexibility, and flexibility means control over the infrastructure. That’s one thing we’re doing and that to me has been a lot of fun.

DS: I would imagine your feature film production skills come in quite handy on this show.

PM: When it comes down to it, one of the best things I bring to the table is knowing when and when not to speak up. The best example is that you can be on a set and realize that a green screen or something is not perfect. It’s going to require extra time for you to key that out. But people forget it’s only one artist’s time. Maybe that person makes $20, or $50, an hour. Even if it took them a week, 40 hours, to fix the issue you’re witnessing on set, how long does it take for the whole crew to stop and make adjustments? Is it going to cause 100 people to sit around for an hour? Does that make any sense? Absolutely not.

There is some validity behind the phrase, “We’ll fix it in post.” Typically, yeah, it really is cheaper to just address it in post. However, there have been many, many times on Orange where I’ll ask for something because it would take only one minute for one guy on set to fix, and now 40 shots don’t have the problem I saw. That’s a different scenario, when I look at something and go, “Well, this should be addressed on set because it’s going to proliferate across an entire episode.” So it’s about knowing when to speak up and when not to.

Patrick stresses that the key to an efficient production pipeline is flexibility, and flexibility means control over the infrastructure.

DS: Last question. What would you say are the biggest challenges you face on this production?

PM: To be honest, it’s my artists. They’re phenomenal, they’re really good. But they sometimes have issues with an organic production process. I first saw it happening on Avatar, where people were getting really, really frustrated that the edit was constantly changing. Up until the last few years that wasn’t usually the case. You usually had a locked edit and then you’d start on the visual effects. But more and more frequently the edit is constantly changing. It requires that the visual effects be able to respond accordingly. On Orange, it’s not the edits that are changing, it’s the scripts. Or, while they’re shooting on set, they’re making decisions on the fly. That’s because one of the writers, Tom Sheppard, riffs constantly. Toby Turner will just riff off the cuff and quite literally storylines will change at a moment’s notice.

So for me, the difficulty is that sometimes my libraries aren’t relevant anymore, and then my artists react, “Oh! What are we going to do? We have to do all this now…” when in the end, they’re not spending any more time than they normally do. There needs to be more education in the visual effects and animation communities. They need to have a bit more of a worldly perspective. We’re all kind of introverts. We sit at our computers a lot. It would help us, especially given the recent situation with the visual effects union and business issues, to have a more worldly perspective on the process we’re a part of.

To answer your question in a different way, I think the other issue is that the show is organic, I mean extremely organic. I have a very good inkling that they want to be able to do these shows in about a week, from beginning to end. Very much along the lines of South Park.

DS: That’s not a lot of time for a production.

PM: We’re getting closer and closer. They want to have very topical episodes. They ultimately want to mimic the web series. Dane threw Marshmallow into one of the webisodes and got a tremendous response the following day: “We love Marshmallow!” Well, that’s how Marshmallow became a character. Same thing for Apple and same thing for Grapefruit. They want to be able to have that type of social media dialogue with the viewers.

So that means we need to truncate our production schedule even further, not just the animation but the entire thing from beginning to end. I look at that and I firmly believe it’s possible. If we can create more standardization, and have the editors able to make selects from backgrounds, grab characters, grab a more extensive library of dialogue or reaction shots or whatever the case is, and assemble that during the editorial process, I feel it’s really feasible.

Trying to always respond positively and empower the producers, that’s really why I’m here. I want to be more a part of the process of content creation as opposed to just servicing needs. I want my artists to have a great experience as well. So far we’ve been successful. Everyone who was here last season, after a four-month break, came back. They all quit other jobs to come back. That says a lot.

DS:  Right.  Do you see shorter and shorter production timeframes becoming more common for TV animation or do you just think it happens to be the nature of the Annoying Orange show?

PM:  It seems like many shows are going that direction, not just animated shows but even live action dramas.  Those typically seem to be driven just by pure cost.  I feel like Orange and some other shows are trying to do something more by interacting with their viewers. I keep waiting for “choose your own interactive adventure” television, but done the right way.   I think that’s where it’s going to go, where you have online content and social media content that is directly related to the next episode or the episode you just watched.  To me that’s particularly attractive. I feel like Orange is a good model to use to help figure out how you make all that feasible.


Dan Sarto is editor-in-chief and publisher of Animation World Network.
