Zachary Dixon and IV Studio produced four video game-style versions of top athletes using game engine technology, delivering the first in just two weeks.
For Zachary Dixon, co-founder and executive creative director of IV Studio, the opportunity to work with Nike and House of Hoops on a set of four commercials came with a firm, almost impossibly tight project schedule, driven by an upcoming shoe release date that could not be moved: first spot in two weeks, remaining spots in two more. The creative mandate, to produce four heart-pumping, video game-style versions of top Nike athletes, dubbed Nike Avatars, and get them done yesterday, lent itself to animation development using real-time game engine technology and production tools. So, Dixon decided on Unity. He discussed the project at length with AWN. You can watch a description of the project, as well as the four spots, below.
AWN: How were your creative planning and pre-production different for these four Nike spots, working with a real-time Unity workflow, than if you’d used your traditional animated commercial spot development workflow?
Zachary Dixon: The biggest difference on this project in planning was the compressed timeline. We had to jump straight from the storyboards we pitched into a full-fidelity first look in just a couple of weeks. This skipped quite a few steps from our normal process: we'd usually show an animatic and some early looks at animation to the client well before moving on to a full-fidelity pass. Working in Unity allowed us to develop the style while character animation was happening. We were then able to bring the animated characters into Unity within a few hours of our deadline and quickly render out the full spot to show the client. At the moment we'd usually be showing our pre-production pass to a client, we were able to show them the spot in its nearly final form, and work with them to make tweaks from there.
AWN: What are your main concerns, both creatively and technically, when considering animated commercial production with a real-time workflow?
ZD: Staffing can be quite challenging when working in a real-time pipeline for commercial animation. Game development and animation are still somewhat separate worlds, so it’s proven difficult to find people who can dip into both sides and bridge the gap between the two skill sets.
AWN: You produced the spots in just a few weeks. How does that compare to a normal, or traditional, schedule, producing the same spots in a similar animation style but with traditional CG tools?
ZD: This timeline was blazing fast, and honestly faster than we'd ideally like to take on again. But we had a shoe release deadline that these needed to coincide with, which was unmovable. Our first draft was due in just two weeks, with final delivery of the first spot in under four weeks. That's about a third of the time we'd normally take for a character-heavy spot.
AWN: How does the Unity real-time toolset enable your creativity on spots like this? How was it beneficial beyond simply letting you make things happen much quicker?
ZD: As a director and oftentimes compositor, Unity is such a fun creative playground. For us it acts as a central hub for all our assets. We're pulling in assets created by our team in Maya, ZBrush, C4D, Substance, and Photoshop. Once all the pieces are there, you have so much freedom to quickly iterate and experiment. As a director, there are big decisions I often wish I could go back on: "What if we played this shot in silhouette?" "What if we used a 50mm here instead of a 35mm lens?" In a traditional pipeline, these ideas are generally out of the question at a certain point. You'd have to go back too many steps and pull a bunch of people off what they're working on to make it happen. But when you're working in a real-time pipeline, you can just try stuff and see if it works. Having that immediate feedback and easy access to changes is crucial to experimentation, and that experimentation ultimately leads to a better end product.
AWN: What were the biggest challenges on the project?
ZD: The biggest challenge on this project, other than the quick turnaround, was nailing the appearance of each athlete while maintaining the stylized qualities we were aiming for with our chosen art direction. The avatar likenesses had multiple approval stages, including the athletes themselves. It was incredibly important for us and the agency to make sure the players were happy, but also that fans would be able to easily recognize the players. We were making updates to the players' appearances throughout production, sometimes right up until the day of delivery. Fortunately, our pipeline on this project allowed for those swaps to be made in just a couple of hours.
AWN: Would you recommend a similar technical / creative solution for all your future commercial clients?
ZD: We subscribe to the idea of choosing the right tools for the job. Using Unity and a real-time pipeline was definitely the right choice for this job, but it's not right for everything. For now, we'll be looking toward our real-time pipeline in these scenarios:
First, when we have a tight deadline. This one is the most obvious. There is certainly some time to be picked up when you don't have to render, but it's also not a catch-all. We still need to model, rig, texture, and animate, which all take a considerable amount of time.
Second, when the art direction can be crafted to the medium. This Nike project was a great example. The brief from the agency essentially called for us to "put these players in a video game." We love Unity's ability to get that Wind Waker-esque toon shading (via Toony Colors Pro). At this point, we try to choose an art direction that is heavily stylized and somewhat flat when working in a real-time pipeline. I do see this changing, though, as real-time ray tracing continues to mature.
Third, when we have a lot of different deliverables. Over time, the list of deliverables on our projects continues to get larger and larger. Whether it's different resolutions for various social platforms, or breaking projects out into shorter segments, we're seeing a huge need for quantity alongside quality. We're also seeing a need for greater customization among deliverables, where we may deliver different versions of videos with slightly different copy, colors, and even entire shots. Unity and other real-time platforms are set up really well for quickly creating and outputting these ever-expanding lists of resolutions and variations within projects.
And finally, when there is an interactive element. Another way we see our client requests changing is in asking for interactive experiences alongside a traditional animation project delivery. This could be a VR experience, AR stickers, or even a simple game-like interactive experience. Unity is the perfect place to craft these projects because you can create video and interactive projects in the same environment at the same time.
AWN: Does this type of animation production lend itself mostly to game-like animated characters / environments?
ZD: Not necessarily! There are some great examples, like The Heretic or Baymax Dreams, that are pushing into some really high-fidelity visuals. We've got a few projects in the works as well that push into other styles, but sadly nothing we can show just yet. I'm most excited about the potential for highly stylized animations that can come out of a real-time pipeline… I think we're just getting started.
Dan Sarto is Publisher and Editor-in-Chief of Animation World Network.