
Key Feature Film Contenders for the 2021 VES Awards

This year’s batch of animated and VFX-driven feature films shows that despite the pandemic, artists around the world continued pushing the boundaries of visual storytelling.

The 19th Annual VES Awards are virtual this year, streaming worldwide on April 6/7 depending upon your location. With productions hampered by COVID-19 pandemic lockdowns and restrictions, VFX studios and artists from around the globe found ways to quickly pivot to safe, remote production environments as they continued pushing visual storytelling boundaries through state-of-the-art visual effects.

VES Award nominations for Jingle Jangle: A Christmas Journey, Project Power, The Midnight Sky, Da 5 Bloods, Mank, Extraction and News of the World mean that only three contenders for Outstanding Visual Effects in a Photoreal Feature and Outstanding Supporting Visual Effects in a Photoreal Feature weren’t either produced or distributed by Netflix.  Pixar Animation Studios’ Soul leads the feature nominations with five, with Onward following closely behind with four. Breaking from the tradition of honoring live-action and animated studio features is the appearance of HBO’s documentary, Welcome to Chechnya, which utilized machine learning to digitally protect the identities of its interviewees. There is no clear live-action favorite, while Soul appears to be leading the pack with award season voters. Profiled below are some of the top feature film nominees across various live-action and animation categories.


Soul

Nominations: Outstanding Visual Effects in an Animated Feature; Outstanding Animated Character in an Animated Feature; Outstanding Created Environment in an Animated Feature; Outstanding Virtual Cinematography in a CG Project; and Outstanding Effects Simulations in an Animated Feature

Pete Docter and Kemp Powers’ animated feature tackles abstract ideas like what a soul and the afterlife look like, something that the directors artfully embrace as storytelling elements.  “Terry was one of the more challenging characters to articulate,” remarks VFX supervisor Michael Fong.  “Prior to Soul our tools and workflows were mainly designed to articulate and animate characters with skin and bones.  In contrast, Terry and her fellow counselors were envisioned as a single ‘wire’ of energy that twisted and contorted to approximate a vaguely humanoid shape.  This wire could twist itself into extra arms, a paperclip hand, or even a pedestrian walk sign.  Our Character team built Terry from multiple NURBS curves with hundreds of control points.  They then needed to build a layered control system to make the animation manageable.  Another large challenge we faced was designing You Seminar.  We knew Pete Docter wanted an ethereal and impressionistic world, but it was challenging to figure out just what this meant.  It took our Sets team a lot of iteration, and a lot of head scratching, to discover that we needed to build the world from various volumetric elements and particles. These non-typical building materials allowed us to suggest forms and shapes in a way that hadn’t been seen before.  Our Compositing team then applied virtual scrims to every You Seminar shot to further push the ethereal and impressionistic feel.”
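Pixar’s rigging tools are proprietary, but the “curve with a layered control system” idea Fong describes can be roughed out in a few lines. The sketch below, purely illustrative and with hypothetical names throughout, evaluates a simple Bézier curve (standing in for a NURBS curve) whose fine control points can all be reshaped at once by a single coarse-layer offset, rather than an animator managing hundreds of points individually.

```python
def lerp(a, b, t):
    """Linear interpolation between two 2D points."""
    return ((1 - t) * a[0] + t * b[0], (1 - t) * a[1] + t * b[1])

def de_casteljau(ctrl, t):
    """Evaluate a Bezier curve (a simple stand-in for a NURBS curve)
    at parameter t by repeated linear interpolation of control points."""
    pts = list(ctrl)
    while len(pts) > 1:
        pts = [lerp(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]

# Fine layer: the individual control points that define the "wire".
fine = [(0.0, 0.0), (1.0, 2.0), (2.0, -1.0), (3.0, 1.0), (4.0, 0.0)]

# Coarse layer: one broad offset applied to every fine point, so one
# high-level control can reshape the whole wire at once.
coarse = (0.0, 0.5)
shaped = [(x + coarse[0], y + coarse[1]) for x, y in fine]

midpoint = de_casteljau(shaped, 0.5)
```

A production rig would stack many such layers (and use true rational B-splines), but the principle is the same: high-level controls drive groups of low-level points.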

More on AWN:

Pixar’s ‘Soul’ Skillfully and Honestly Embraces Ethnic and Cultural Diversity

Designing the Illusory: Souls, Counselors, and The Great Before of ‘Soul’

Pixar’s ‘Soul’ Deftly Tackles the Risky Constructs of Life, Death, and How We Enter the World


Onward

Nominations: Outstanding Visual Effects in an Animated Feature; Outstanding Animated Character in an Animated Feature; Outstanding Created Environment in an Animated Feature; and Outstanding Effects Simulations in an Animated Feature

In Dan Scanlon’s animated feature about two brothers racing against time to see their deceased father one last time, “the character of Dad is unlike any character we have brought to life before, consisting of only a pair of legs and a stuffed jacket,” shares supervising technical director Sanjay Bakshi.  “A lot of the comedy in the film derives from other characters misinterpreting Dad’s intentions based on the movements of the stuffed top.  To achieve believability for this half-depicted character, different parts of the upper body were simulated and animated at the same time. Dad’s jacket stuffing was modeled with a tetrahedral mesh and simulated.  The hand-animated performance of the legs could drive the simulation of the upper half. After the simulation was computed, animators could then edit the results to push the entertainment.  In this way, a believable but entertaining performance was depicted.  The High School Dragon was a collaboration between Rigging, Animation, and Shading to get the performance and look required of the monster.  The walls of the school were fractured procedurally and the larger broken pieces were hand animated.  Volumetric red curse gas was simulated in the core of the Dragon and ‘holds’ the pieces together.  A rigid body simulation was used to add secondary motion to the High School Dragon.  And of course, our dragon can breathe fire and fly.  Many technical artists and animators worked closely to bring the High School Dragon to life.”
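The workflow Bakshi outlines — hand animation drives a simulation, and animators then layer edits on top of the cached result — can be illustrated with a toy one-dimensional version. This is not Pixar’s simulator; it is a minimal sketch, with hypothetical names and parameters, of the drive-then-edit pattern: a damped spring makes the “stuffed top” lag behind the animated “legs,” and an artist delta is added afterward.

```python
def simulate_follow(driver, stiffness=0.3, damping=0.7):
    """Each frame, the simulated value is pulled toward the
    hand-animated driver (the legs), so the jacket stuffing
    lags believably behind Dad's lower half."""
    pos, vel = driver[0], 0.0
    out = []
    for target in driver:
        vel = damping * vel + stiffness * (target - pos)
        pos += vel
        out.append(pos)
    return out

# Hand-animated leg positions drive the sim...
legs = [0.0, 1.0, 2.0, 3.0, 4.0]
sim = simulate_follow(legs)

# ...then animators layer edits on top of the cached result
# to push the entertainment of the performance.
edits = [0.0, 0.0, 0.5, 0.0, 0.0]
final = [s + e for s, e in zip(sim, edits)]
```

The key design point is that the simulation output is data, not a black box: because it is cached per frame, it can be edited like any other animation curve.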

More on AWN:

Dan Scanlon Talks Elves, Magic, and How to Make a Film with Legs


Mulan

Nominations: Outstanding Created Environment in a Photoreal Feature; Outstanding Effects Simulations in a Photoreal Feature; and Outstanding Compositing in a Feature

In bringing the live-action adaptation of Disney’s 1998 animated feature Mulan to the big screen, VFX supervisor Sean Faden was responsible for producing 2,043 visual effects shots; his main vendors were Weta Digital, Sony Pictures Imageworks, Image Engine, and Framestore.  One of the production’s major environment builds was the Imperial City, which involved scanning the recreation of Chang’an, the capital of China during the Tang Dynasty, at Xiangyang Tangcheng Film and Television Base.  “That was the largest LiDAR scan Weta Digital has ever done,” states Weta Digital VFX supervisor Anders Langlands.  “250 square metres.  What we wanted to do was use the LiDAR and texture photography we had taken to build up a little construction kit of modular building pieces that we could then use to create a huge variety of buildings.  The R&D team did a lot of work on upgrading our instancing system to handle it more efficiently, and that became important for how we approached the city build. Everybody involved with the project had a deep respect for the history and culture of the period we were trying to tell the story in.”
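The “construction kit plus instancing” approach Langlands describes is a standard trick for huge environments: store each heavy mesh once and represent every building as a lightweight transform referencing it. The sketch below is not Weta Digital’s system, just a minimal illustration of the idea with hypothetical names.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Instance:
    """A lightweight reference to shared geometry plus a transform,
    instead of a full copy of the mesh."""
    piece: str            # key into the shared kit of building parts
    position: tuple       # world-space placement
    rotation_deg: float

# The modular kit: each piece's (heavy) geometry is stored exactly once.
kit = {"wall": "wall_mesh", "roof": "roof_mesh", "gate": "gate_mesh"}

# Thousands of buildings become thousands of cheap transforms
# referencing the same few meshes.
city = [
    Instance("wall", (x * 10.0, 0.0), 0.0) for x in range(1000)
] + [
    Instance("roof", (x * 10.0, 5.0), 0.0) for x in range(1000)
]

# However large the city grows, the renderer only ever loads the
# unique meshes actually referenced.
unique_meshes = {kit[i.piece] for i in city}
```

Memory and render time then scale with the number of unique pieces, not the number of buildings, which is what makes a city-scale build tractable.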

Over the Moon        

Nominations: Outstanding Visual Effects in an Animated Feature; Outstanding Animated Character in an Animated Feature; and Outstanding Effects Simulations in an Animated Feature

Known for designing some of the most iconic characters in Disney animation history, Glen Keane applied that expertise to his feature directorial debut, Over the Moon, for Netflix.  “Eusong Lee was doing some experimental design work and presented an approach to doing Chang’e that, when I looked at it, made me ask, ‘Can we do that?’  This is the sacred goddess of China.  Everybody knows about Chang’e on the dark side of the moon. Lee had these wild designs of her being nine feet tall with smaller hands and feet.  I said, ‘Let’s make her like a glamazon.’  Everything is going to be bigger than life for her.  All of the same emotional components that we put into Fei Fei’s face became the foundation for Chang’e.  In a lot of ways, Chang’e became what Fei Fei was going to become if she did not learn to love somebody new.”

Simulations used in producing the environment on the Moon were impacted by the story point that Lunaria and its inhabitants are made from the tears of Chang’e. “A lot of the characters and buildings look like gelatinous gel,” notes VFX supervisor David Smith.  “Many of them have these little tear-offs that work like lava lamps. It’s subtle on the Moon Lions and Space Dogs but bigger on the Lunarians.  The tractor beam has that particulate in it as well.  This is a way of showing that not everything on Lunaria is quite fully-formed.” 

More on AWN:

Glen Keane Talks Believing the Impossible in ‘Over the Moon’

Glen Keane’s ‘Over the Moon’ Artfully Illustrates Healing from Grief Through Play

The Midnight Sky

Nominations: Outstanding Visual Effects in a Photoreal Feature and Outstanding Model in a Photoreal or Animated Project

Spaceships are a sci-fi staple, and for Netflix’s The Midnight Sky, the Aether takes a starring role; its production was part of the over 1,400 visual effects shots produced by Matt Kasmir and Christopher Lawrence.  “Framestore Art Director Jonathan Opgenhaffen spent six weeks down at Shepperton with production designer Jim Bissell,” remarks Lawrence.  “We had developed these kits of things that we could use to make a spaceship and he was able to bring that along.  Jonathan was able to imbue this half-kilometre-long ship with a lot of specific small details, which helped to sell the scale; that assisted us in getting the design signed off quickly and getting people to believe in it.  Jim was talking about using present-day technology for the core of the ship and a more futuristic, assembled-in-space, 3D-printed design language for the habitation quarters.  It was fun for me because it was drawing upon current SpaceX, NASA and JPL tech and taking it to a new place with things that were plausible but not out there in reality.”

More on AWN:

Framestore Takes to the Stars in George Clooney’s ‘The Midnight Sky’


Greyhound

Nominations: Outstanding Effects Simulations in a Photoreal Feature and Outstanding Compositing in a Feature

It’s not surprising that water simulations played a huge role in the storytelling and 1,300 visual effects shots produced for the World War II naval battle featured in Greyhound. Production VFX supervisor Nathan McGuinness was extremely careful to map out the warship convoy’s journey across the North Atlantic to ensure continuity.  “I had a basic simulation of the boats at scale,” he explains. “You could come up on a bird’s eye view with one camera and see where the convoy or Greyhound was.  Then I would drop the camera back down into where it was supposed to be in that part of the movie.  I had to keep dropping back in, keeping in mind that we were using tactics to avoid the Wolf Pack.  I had to always go, ‘Where is the moon or sun now?’ We also had to keep in mind that there was a destination point.  DNEG was dependent on my previs being produced with Day for Nite. I was building the story, and everybody who needed to be part of the filmmaking process was around us.  I would refer back to marine specialist Gordon Laco, who knew the ships and U-boats of that period, and had a strong understanding of the sonar and its capabilities at the time.”


Tenet

Nomination: Outstanding Visual Effects in a Photoreal Feature

In Christopher Nolan’s reverse-action sci-fi thriller, one overriding goal was finding in-camera solutions where possible, which included blowing up a decommissioned 747.  “Early on in pre-production we would sit around and talk about what we would do for each of the events and how best to achieve that,” explains production VFX supervisor Andrew Jackson.  “The first things you would consider are things like a miniature shoot, a CG plane and set build, or CG explosions. Reading the script, you realized that we needed a plane to shoot all of the scenes leading up to that point [where it blows up].  We bought a wrecked plane at Victorville Aircraft Graveyard and filmed it there because they have a runway.  We built a set quite a long way in front of one of the real hangars.  We cleaned up the plane, built a fake carpark with cars in it, ploughed through the carpark and into this set; that was almost completely in camera.  We put in some CG trees that get sucked into the front of the engine and get blown as we pass them.”


Mank

Nomination: Outstanding Supporting Visual Effects in a Photoreal Feature

Harkening back to the classic black and white Hollywood films of the 1930s is Netflix’s biopic, Mank, which features 753 visual effects shots overseen by co-producer and VFX producer Peter Mavromates.  “The only things shot in color were people on blue as elements for a sequence that we ended up dropping.  It means that there is more manual work for rotoscoping. When you look at Marion Davies [Amanda Seyfried] when she is at the stake and the angle is shooting up, it was a clear blue sky.  We probably could have used that for keying in the sky replacement there.  Because Mank was shot monochrome, we didn’t have that blue color channel to help us tighten up a matte.  Savage built a 3D model of our location and wrapped the sky around it so the camera could be positioned.  As the camera cut around, Savage could say, ‘Now we’re seeing that part of the sky.’  They used the Unreal Engine to do that.  That’s quite a lengthy scene and every sky in there is replaced.  It was methodically planned.”

More on AWN:

Territory Studio Transports ‘Mank’ Viewers to LA’s Wilshire Blvd. in 1934

Savage VFX’s Mank reels

The One and Only Ivan

Nomination: Outstanding Animated Character in a Photoreal Feature

There is no doubt that the success of the Disney+ feature depended upon the performance of the CG title character voiced by Sam Rockwell and animated by MPC.  According to production VFX supervisor Nick Davis, “For every single shot, the animators would find clips of animals on the Internet or shoot their own, and we would go, ‘Start with that as the building block because it looks like that emotion.’ From that we would build out by being more expressive or pulling back a little bit.”  MPC built a highly complex facial rig for animating Ivan.  “We could have so much nuance around the eyes and mouth in order to give the subtlest of expressions, because so much is said in silence by Ivan,” Davis says. “As humans, we’re so in tune with translating those tiny imperceptible motions like the cock of an eyebrow or the slight squint of an eye.”  MPC has a proprietary software system known as Furtility that can create photorealistic hair, fur, feathers, vegetation, and clothing.  “Every show enables us to make it more photorealistic,” Davis shares. “A lot of it is the level of attention to detail that we put into each fur render.  There were some technically complex situations where Ivan is finger painting. We had to deal with the physics of how paint behaves as it pushes against glass and then wraps around a furry finger.”  The friendship between Ivan and his canine pal Bob (Danny DeVito) drives the narrative.  “Because Thea Sharrock was able to direct Ben Bishop, who was our body performance artist for Ivan, and a little puppeteered version of Bob, she was able to explore the dynamics of their relationship,” notes Davis. “Then it became a more technical exercise as MPC started to animate and render that to get the final performance.”

More on AWN:

MPC Film’s Digital Gorilla Delivers the Emotion in ‘The One and Only Ivan’

News of the World

Nomination: Outstanding Supporting Visual Effects in a Photoreal Feature

In Paul Greengrass’ western drama News of the World, extensive environmental work was done by Outpost VFX to recreate the American frontier of 1870. “We had the normal work of getting rid of passing cars and electrical poles,” remarks production VFX supervisor Roni Rodrigues, who was responsible for over 600 visual effects shots. “The scene where Captain Kidd is approached by a gang was shot in an open space, but Paul wanted a more menacing feeling.  A whole forest was added around the entire gang.  After a certain point you don’t see lights any more, only thick bushes and trees.  There were several shots where Paul wanted us to go the extra mile and add more bushes and trees to make sure that we were creating the right tone for the film.”  The city of San Antonio was merely a little shed surrounded by bluescreen in the studio parking lot. “The shot was supposed to have not only the whole city but also a whole market on the left-hand side, with cattle walking around and people dealing with each other,” Rodrigues explains. “We had two bluescreens and a street of dirt on the ground to simulate the road.  We had a matte painting adding some photoreal textures on top of the CG to give it a photoreal feeling. Outpost VFX was able to put together a beautiful city.”

Outpost VFX’s News of the World reels

Welcome to Chechnya

Nomination: Outstanding Supporting Visual Effects in a Photoreal Feature

Recognized for its innovative approach to protecting the identity of interviewees through the use of machine learning and face replacements is the documentary Welcome to Chechnya, David France’s film that, through interviews with LGBTQ refugees, details the country’s anti-gay purges of the late 2010s. “Filmmaker David France was connected to Johnny Han [facial capture supervisor], who knows that I like to solve problems,” recalls Ryan Laney, who served as the film’s VFX supervisor.  “We met with David and Alice Henty [producer] and they explained about wanting to hide identities so well that their parents wouldn’t recognize them.  We wouldn’t be here if David had known what he was asking was impossible.  The edit was almost locked by the time that they had found us.  It was not like a normal visual effects film where you get to plan the execution of things.  David was initially going for this Scanner Darkly effect, but rotoscoping accentuates characters.  There was a paper out of the University of Glasgow on the attractiveness of composite faces. We thought if we had a good idea of what composite faces were, we could create a transfer function from one to another.  We tried that with a warp first and then turned to deep learning style transfer ideas.”
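The film’s finished pipeline used deep learning style transfer, which is well beyond a short sketch, but the “composite face” idea from the Glasgow research Laney mentions is simple: average a set of aligned face images so that no individual’s features survive. The toy example below is purely illustrative (tiny hand-made pixel grids standing in for photographs), not the production method.

```python
def composite_face(faces):
    """Average a set of aligned grayscale face images (given as
    nested lists of pixel values) into a single composite face.
    Individual features wash out; only the shared structure remains."""
    n = len(faces)
    rows, cols = len(faces[0]), len(faces[0][0])
    return [
        [sum(f[r][c] for f in faces) / n for c in range(cols)]
        for r in range(rows)
    ]

# Three tiny 2x2 "faces" stand in for aligned photographs.
faces = [
    [[10, 20], [30, 40]],
    [[20, 30], [40, 50]],
    [[30, 40], [50, 60]],
]
avg = composite_face(faces)  # each pixel is the mean across faces
```

In practice alignment is the hard part — faces must be registered to common landmarks before averaging — and the documentary went further, transferring volunteers’ faces onto interviewees with learned models rather than simple pixel averages.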

Monster Hunter

Nomination: Outstanding Effects Simulations in a Photoreal Feature

Creatures are a big part of the principal cast for Paul W. S. Anderson’s video game adaptation, Monster Hunter, which includes 1,400 visual effects shots produced under the guidance of Dennis Berardi. Complicating matters was the need for interactive elements so that the CG characters would seamlessly blend into the live-action plates.  “The sand simulations pushed us way past every limit that we ever had in terms of simulation time,” observes Mr. X digital supervisor Ayo Burgess. “We had shots in the beginning with the sand ship where the turnaround time was almost two days to get all of the pieces of the simulation through.  On top of that we had a point replication procedure that allowed us to increase the density of the sand even more at render time.  We had Microsoft Azure cloud rendering, which was hugely beneficial for pushing shots through and getting things rendered.  But the effects caches were so big we couldn’t upload them to the cloud. We had to render them all on premises; it was a real fight to get these heavy caches going without taking over the entire studio farm throughout the movie.  It was a balancing act of getting the quality level that we needed but also getting the shots there so we could tell the story.”
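Mr. X’s point replication procedure is proprietary, but the general technique Burgess refers to — simulate a coarse point cloud, then densify it at render time by scattering extra points around each simulated one — can be sketched in a few lines. Names and parameters below are hypothetical.

```python
import random

def replicate_points(points, copies, spread, seed=0):
    """Densify a coarse sand simulation at 'render time' by
    scattering jittered duplicates around each simulated point,
    so the expensive sim can run at low resolution."""
    rng = random.Random(seed)  # seeded for repeatable renders
    dense = []
    for x, y, z in points:
        dense.append((x, y, z))  # keep the original point
        for _ in range(copies - 1):
            dense.append((
                x + rng.uniform(-spread, spread),
                y + rng.uniform(-spread, spread),
                z + rng.uniform(-spread, spread),
            ))
    return dense

coarse = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # what the sim produced
dense = replicate_points(coarse, copies=100, spread=0.05)
```

The payoff is exactly the trade-off Burgess describes: the heavy cache stores only the coarse points, while the final render sees millions of grains — at the cost of very large data at render time.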


Trevor Hogg is a freelance video editor and writer best known for composing in-depth filmmaker and movie profiles for VFX Voice, Animation Magazine, and British Cinematographer.