Deep Learning and AI will certainly continue making inroads into VFX production, but they are merely tools, there to be our creative servants, not our masters.
I’m not one for tattoos. At least I thought I wasn’t. While working with a few freelance flame artists over the last couple of months, we were reflecting on the similar paths of our careers. Then it struck us. Maybe we would be the only generation to spend our entire careers doing the jobs that we do. Why? Partly because anyone with a techie and artistic bent is going to be interested in working at Google, Instagram, Tesla or SpaceX, not making TV commercials. And because the robots are coming.
Instagram’s filters can 3D track and composite in real-time to make you look like a kitten, a Parisienne, Cupid or Jesus. Machine Learning has enabled the creation of Deepfake videos, so that no-one really knows what’s real anymore (which, coming from someone who has made their fair share of television commercials over the last 25 years, is saying something). 2016’s Morgan trailer was cut by IBM’s Watson computer. Adobe’s prototype audio editing and generation software, Project VoCo, debuted in 2016, promising the ability to replicate someone’s voice entirely after listening to 20 minutes of the desired target’s speech.
At the start of the digital post-production era, for example, cutting out a foreground live-action element not shot on a greenscreen could be difficult. The workstations capable of doing this kind of work were expensive and the people with the skills to achieve a seamless final result commanded high salaries. To make the process more cost-effective, post-houses started training up juniors and putting them on the night shift working on the roto.
Then the outsource roto companies arrived on the scene, which meant the roto could be sent out overnight. When it came back, it would be pretty good most of the time. Now, Machine Learning is heading towards enabling your existing workstation to create decent roto all on its own. Having analyzed thousands of hours of footage, the software knows what a human figure looks like, what is moving in the shot and at what depth. We all need mattes and, really, we don’t care how we get them; that a computer process creates them doesn’t matter to us, as long as we get the mattes.
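As a loose illustration (not any particular vendor's pipeline), the final step of an ML roto tool can be sketched as turning a segmentation model's per-pixel "person" probabilities into an alpha matte. The probability map below is a hypothetical stand-in for real model output:

```python
import numpy as np

def probabilities_to_matte(person_prob: np.ndarray, soften: float = 0.1) -> np.ndarray:
    """Convert per-pixel 'person' probabilities (0..1) from a segmentation
    model into an 8-bit alpha matte, with a soft ramp around the edges."""
    # Map a transition band around 0.5 to a 0..1 ramp so edges aren't a hard step.
    lo, hi = 0.5 - soften, 0.5 + soften
    alpha = np.clip((person_prob - lo) / (hi - lo), 0.0, 1.0)
    return (alpha * 255).astype(np.uint8)

# Toy "model output": a 4x4 probability map with a confident figure in the middle.
probs = np.array([
    [0.0, 0.1, 0.1, 0.0],
    [0.1, 0.9, 0.95, 0.1],
    [0.1, 0.92, 0.97, 0.1],
    [0.0, 0.1, 0.1, 0.0],
])
matte = probabilities_to_matte(probs)  # white where the model is sure it sees a person
```

In a real pipeline the probabilities would come from a trained segmentation network and the matte would be written out per frame; the principle, though, is just thresholding confidence into alpha.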
How does this all work? To understand it better, simply ask Siri, Google or Alexa. We are surrounded by Machine Learning. Those ads that creep you out a little bit because they advertise things Google thinks you are interested in are driven by Machine Learning. Categorization of you and your interests allows advertisers to hyper-target. The list of datasets is huge, ranging from your level of education, your age, family composition, date of birth, your connections, your political views, relationship status, industry sector, purchase history, online interactions, what kind of live events you attend, the movies you watch, the music you listen to, what books you read, what you eat, what kind of travel interests you, the vehicle you drive, what clothes you buy, favorite sports teams, your primary browser, your operating system...the list goes on. So much information goes into making up a picture of you and your lifestyle that the companies collecting and accessing that data now have a pretty good idea of what you might be interested in engaging with, and possibly purchasing.
Siri, Alexa and Google are Machine Learning. They get better over time. When they don’t understand you, that experience helps them understand you better the next time. Scale that up from answering your questions to answering all of the questions that anyone asks anywhere in the world and it becomes clear how much data becomes available for Siri to evaluate and learn from. The algorithms that drive Siri adapt and improve their performance over time as the number of samples increases.
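The "more samples, better answers" idea can be sketched with a toy experiment (nothing like Siri's actual models — every name here is illustrative): estimate the decision boundary between two speakers' feature values from a handful of samples versus hundreds, and watch the average error shrink:

```python
import numpy as np

def estimate_threshold(rng, n):
    # "Training data": n noisy 1-D feature samples per speaker.
    a = rng.normal(0.0, 0.5, n)  # speaker A clusters around 0.0
    b = rng.normal(1.0, 0.5, n)  # speaker B clusters around 1.0
    # Learned decision boundary: midpoint of the two class means.
    return (a.mean() + b.mean()) / 2.0

rng = np.random.RandomState(42)
trials = 300
# The true boundary is 0.5; measure average estimation error over many trials.
err_small = np.mean([abs(estimate_threshold(rng, 4) - 0.5) for _ in range(trials)])
err_big = np.mean([abs(estimate_threshold(rng, 400) - 0.5) for _ in range(trials)])
```

With 4 samples per speaker the learned boundary wobbles noticeably; with 400 it sits almost exactly at the true midpoint. That, in miniature, is why a system answering the whole world's questions improves faster than one answering only yours.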
But does Machine Learning technology mean the end of artistry? Will editors, designers and visual effects artists still have a place in the future? Could a machine edit a movie and successfully enhance and express the emotion of the story? Does a machine have the ability to create without any human input? Are directors and writers next? The answer to all of this is, most likely, no. The enhancement of human artistic and creative efforts by technology has been a reality since the first cave dwellers decided that drawing pictures on a wall using a stick was better than just using their fingers. Technology is there to be our servant in the creative process, not our master.
The emergence of photography didn’t eliminate painting. Television hasn’t yet killed cinema. New art forms find their own audience; that doesn’t destroy the old ones. Gaming is expected to be worth $152 billion in 2019, but that hasn’t stopped television and film from exceeding $100 billion for the first time. Machine Learning is helping the creative process by removing the labor-intensive and tedious processes that no-one really wants to do -- the things that hold up the creative flow.
And this provides a big clue as to why Machine Learning, deep learning, and artificial intelligence have been making inroads into visual effects. Computers and networks have become faster, bigger and more efficient and, at the same time, cheaper. So deep neural networks (which loosely mimic the way the brain works) can be affordably created. Then, they can be fed enough data to enable quicker pattern recognition and the creation of elements that exploit those patterns. Deep Learning and AI will certainly make inroads into the work we do, but they are tools, they are pieces of the puzzle that is visual effects. It’s very rare in VFX to do the exact same thing over and over again. Each shot has its own problems and solutions. Experience tells you which methodologies might prove helpful. There might be some new tools in your box, but the tools aren’t going to do anything by themselves.
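Just how affordable a neural network has become is easy to demonstrate. As a minimal sketch (a two-layer network in plain NumPy, nowhere near a production VFX model), here is one learning the classic XOR pattern, which no single linear layer can represent:

```python
import numpy as np

rng = np.random.RandomState(1)

# XOR: output 1 when exactly one input is 1. Linearly inseparable,
# so it needs at least one hidden layer to learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.randn(2, 4), np.zeros(4)  # input -> 4 hidden units
W2, b2 = rng.randn(4, 1), np.zeros(1)  # hidden -> output

losses, lr = [], 1.0
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)           # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)         # forward pass: output layer
    losses.append(float(np.mean((out - y) ** 2)))
    # Backpropagate the squared-error loss and step the weights.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_h); b1 -= lr * d_h.sum(0)
```

A few dozen lines, a fraction of a second on any workstation, and the loss falls as the network discovers the pattern. Scale the data and the layers up by orders of magnitude and you have the kind of network that learns what a human figure looks like.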
We all have to keep moving and developing to keep things fresh and interesting. As one of the lucky few who has happily made a living by sticking with Flame throughout their career, I decided the venerable system deserved some commitment from me. I got the tattoo.