Dynamixyz and Pixelgun Studios Introduce Scan-Based Facial Animation

New ‘Performer’ markerless facial animation software provides facial tracking trained directly from scans as well as a direct solve onto a Maya rig.           

CESSON-SÉVIGNÉ, FRANCE – Dynamixyz, an innovator in markerless facial animation technology and products, and Pixelgun Studios, a high-end 3D scanning agency, have just released a proof of concept demonstrating facial tracking trained directly from scans, along with a direct solve onto a rig in Maya, delivering high-fidelity raw results.

This technological breakthrough allows users to track and solve to the digital double of an actor with a high level of precision. The process automatically extracts key poses from scanned data and solves them directly onto a rig, taking advantage of the rig’s intrinsic logic.

From Facial-Mocap-Specific Tracking Technology

Performer, Dynamixyz’s core markerless tracking software, relies on a specific tracking technology, based on machine learning, that requires two essential steps in its workflow (a code sketch of the two steps follows the list):

  • Building the Tracking Profile: this first step consists of manually annotating key poses of an actor (from a video source) to give the software examples. It usually takes two to three hours.
  • Building the Retargeting Profile: this second step requires the animator to adjust controllers in the 3D software to retarget those key poses to the target character. While faster than building the tracking profile, it is also done manually. This step brings flexibility, as the underlying machine-learning process makes it possible to retarget to any character and any rig.
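
To make the two-step workflow concrete, here is a minimal Python sketch. Every name in it (KeyPose, build_tracking_profile, build_retargeting_profile) is hypothetical, and the simple linear models only stand in for the machine-learning models the release describes; Performer’s actual API is not public.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class KeyPose:
    frame: np.ndarray       # video frame of the actor (H x W x 3)
    landmarks: np.ndarray   # manually annotated 2D landmarks (N x 2)


def build_tracking_profile(key_poses):
    """Step 1: learn an actor-specific tracker from annotated examples.

    A trivial stand-in: keep the examples and a mean-landmark prior.
    The real system fits a person-specific machine-learning model.
    """
    prior = np.mean([kp.landmarks for kp in key_poses], axis=0)
    return {"landmark_prior": prior, "examples": key_poses}


def build_retargeting_profile(key_poses, controller_values):
    """Step 2: map each key pose to the rig controller values the animator
    set for it, so newly tracked poses can drive the character.
    """
    X = np.stack([kp.landmarks.ravel() for kp in key_poses])  # tracked space
    Y = np.stack(controller_values)                           # rig space
    # Least-squares linear map as a stand-in for the learned retargeter.
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W


def retarget(tracked_landmarks, W):
    """Drive the rig from a newly tracked frame."""
    return tracked_landmarks.ravel() @ W
```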

Dynamixyz’s R&D team has always focused on both reducing the time spent training the system and increasing the quality of the results, saving animators time during production.

Scan-Based Tracking

Having dealt with massive amounts of scans over the years, the team realized that this data should be reused to help specialize the tracker to an actor’s morphology. While scanned data was very hard to collect just a few years ago, scanning technology has become more accessible and affordable. Studios have developed pipelines including photogrammetry or 3D scans and can now handle the process and deliver clean data efficiently.

Visual Concepts, a Dynamixyz client, introduced them to Pixelgun Studios, which provides scans for the game NBA2K, among other titles. The two companies joined forces to work on a scan-based facial mocap workflow.

The stereo head-mounted camera capture and the motion capture of the performance were conducted at the same time as the scanning session by Pixelgun Studios. The scanned data provided geometry and texture information (such as wrinkles and other distinctive facial features), enabling the retrieval of very accurate information on both morphology and appearance. Pixelgun delivered registered scan data with textures covering 80 expressions, captured with 63 cameras trained on the head for expression capture and 145 cameras trained on the subject for body capture.

As lighting conditions are key when training the tracker, manual annotation is still required for two to three frames extracted from the production shots to retrieve the illumination pattern. The scanned data the Dynamixyz team received from Pixelgun enabled them to recreate a synthetic double with illumination recovery and to extract key poses from the digital double as if they were frames recorded with head-mounted gear.
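
The release does not detail how the illumination recovery works; the following is a minimal, hypothetical stand-in that transfers per-channel color statistics from a few real head-mounted camera frames onto a frame rendered from the scans, so that synthetic frames resemble production footage.

```python
import numpy as np


def recover_illumination(hmc_frames, rendered):
    """hmc_frames: (K, H, W, 3) annotated production frames.
    rendered: (H, W, 3) frame rendered from the registered scans.
    Returns the render re-lit to match the production color statistics.
    """
    ref = hmc_frames.reshape(-1, 3).astype(np.float64)
    src = rendered.reshape(-1, 3).astype(np.float64)
    # Shift/scale each color channel of the render toward the HMC footage.
    out = (src - src.mean(0)) / (src.std(0) + 1e-8) * ref.std(0) + ref.mean(0)
    return out.reshape(rendered.shape).clip(0, 255).astype(np.uint8)
```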

The tracking was processed automatically based on geometric and texture information, almost completely replacing the annotation step.

“With these images, generated as if they were taken with a head-mounted camera, and the geometry information, we were able to build a Performer tracking profile as if it had been annotated manually,” explained Vincent Barrielle, R&D engineer and author of the scan-based solver technology. “It brings higher precision, as it has been generated from very high-quality scans. It also gives the opportunity to have a high volume of expressions.”

Tracking from scanned data reduces the error-prone and time-consuming step of manual annotation and is far more accurate, as it draws on the actor-specific facial information contained in the scans.
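
One plausible way such automatic annotation can work, given that the scans are registered (each tracking landmark corresponds to a known mesh vertex): project those vertices through the virtual head-mounted camera to obtain 2D annotations for free. The pinhole camera model below is an assumption for illustration, not Dynamixyz’s documented method.

```python
import numpy as np


def project_landmarks(vertices_3d, K, R, t):
    """vertices_3d: (N, 3) landmark vertices of the registered scan.
    K: (3, 3) camera intrinsics; R: (3, 3) rotation; t: (3,) translation.
    Returns (N, 2) pixel coordinates usable as tracking annotations.
    """
    cam = vertices_3d @ R.T + t    # scan space -> camera space
    px = cam @ K.T                 # camera space -> image plane
    return px[:, :2] / px[:, 2:3]  # perspective divide
```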

Solving Directly to the Rig, Taking Advantage of Its Logic

The Dynamixyz R&D team pushed the proof of concept further, reasoning that the truly usable outcome was to solve directly onto the controllers of the rig rather than onto the scans. To this end, the team developed a system that exports and reproduces the compute graph of the rig logic, duplicating all the node types into the Dynamixyz system in order to run simulations. This step depends on the rig logic and still needs to be tested and validated on a wide variety of rigs.
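
To illustrate the idea of duplicating the rig’s compute graph and solving on it, here is a minimal sketch. The node types (a clamp and a blendshape-style linear combination) and the least-squares optimizer are assumptions; the release does not describe Dynamixyz’s actual graph export or solver.

```python
import numpy as np
from scipy.optimize import least_squares


class RigGraph:
    """Stand-in for a rig compute graph re-created outside Maya."""

    def __init__(self, neutral, deltas):
        self.neutral = neutral  # (V, 3) neutral-pose vertices
        self.deltas = deltas    # (C, V, 3) per-controller vertex offsets

    def evaluate(self, controllers):
        # Clamp node: face-rig controllers typically live in [0, 1].
        w = np.clip(controllers, 0.0, 1.0)
        # Blendshape node: linear combination of offsets on the neutral mesh.
        return self.neutral + np.tensordot(w, self.deltas, axes=1)


def solve_to_rig(rig, target_vertices, n_controllers):
    """Find controller values whose evaluated mesh matches tracked data."""
    def residual(w):
        return (rig.evaluate(w) - target_vertices).ravel()

    res = least_squares(residual, x0=np.full(n_controllers, 0.5),
                        bounds=(0.0, 1.0))
    return res.x
```

The point the sketch captures is the one Barrielle describes below: because the duplicated graph can be evaluated freely outside Maya, the solver can explore the rig’s full range of behavior rather than a fixed set of example expressions.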

“We are able to extract the rig knowledge from the Maya scene and use it to build a custom solver that will find out how the rig works from tracking data,” Barrielle added. “It is demonstrated right now for a use case in which you want to animate a digital double, an exact clone of the actor based on scans. We can totally imagine transferring this process onto a completely different rig, another character (human or not) in the future. The real difference here is that our solving is not based on a non-exhaustive set of expressions anymore. It now knows all the possibilities of the rig and is able to navigate through them to find the right settings for a dedicated expression.”

“This technology is still at the proof-of-concept level, but we aim to make it accessible through a product scheduled to ship in 2020,” remarked Nicolas Stoiber, Dynamixyz CTO and head of R&D. “We believe other companies may have already developed such workflows in-house, as tailor-made solutions rather than as packaged software. This is really at the heart of Dynamixyz’s DNA: we make technologies accessible and usable for the whole industry. When Performer was launched, markerless tracking technology already existed in high-end studios. We simply made it available to everyone by industrializing the tech and offering it as a software product: Performer. We now aim to follow the same pattern for our scan-based tracker and solver.”

Dynamixyz is currently offering an early-access program to select key partners, allowing them to take advantage of the new technology and improve their workflows prior to the official launch next year.

Another Powerful and Promising Tool in Dynamixyz’s Facial Motion Capture Factory

Dynamixyz solutions can also be used in real time. Their Performer tracking software has proven to deliver high-quality data and to boost productivity significantly while adapting to any pipeline. Live Instant and Live Pro were launched at the end of June. Live Instant is based on a new “person-independent” tracking technology that allows users to track any face instantly, without any training of the software. Its fidelity remains below that of the specific tracking or scan-based tracking; however, it is helpful for virtual production and secondary-character animation, and it is also more affordable for smaller budgets. The scan-based tracking technology will soon be added to the available software.

Dynamixyz solutions adapt to every use, whether for real time or production, single-view or multi-view, VFX, VR, video games, or the events industry.

Major Clients in Video Games and VFX Industries

Dynamixyz technology is used by video game, VFX, and VR studios. Video game studios easily integrate Performer into their existing pipelines, saving time by batch-processing data. Rockstar Games used Performer Factory Single-View for Red Dead Redemption 2, which claims 500,000 lines of dialogue and 300,000 animations. VFX studios have also shifted toward Dynamixyz solutions. The French studio Unit Image used Performer Multi-View in 2017 for the Netflix Love, Death & Robots episode “Beyond the Aquila Rift.” Big Company turned to them to animate Miraculous Ladybug for a Disney Channel YouTube live chat last January. Framestore used Performer to animate Smart Hulk in some of the 300 VFX shots in Avengers: Endgame, released in April.

Source: Dynamixyz