The iPi Soft founder and CEO discusses the current state of markerless motion capture animation technology, including real-time tracking and live streaming.
The markerless motion capture market continues to evolve beyond the entertainment and video game world, with new sectors such as medicine and architecture finding the software increasingly useful. Michael Nikonov, founder and CEO of Moscow-based iPi Soft, developer of the leading markerless motion capture solution iPi Motion Capture, continues to stay true to the company’s motto, “Motion Capture for the Masses,” most recently delivering real-time tracking and live-streaming capabilities for popular game engines such as Unity and Unreal.
In a quick chat with AWN, Nikonov spoke about the trends he sees in markerless motion capture, and specifically what enhancements iPi Soft users can look forward to seeing in the near future.
AWN: Last year iPi Motion Capture added real-time tracking. Was it the game-changer you thought it would be?
Michael Nikonov: Adding real-time tracking was a huge developmental milestone for the company. It was a significant technical achievement just to make it happen, and it’s something we’re tremendously proud of. That said, it was only one step for us. We are now focused on other improvements, specifically to the overall motion tracking user experience of the software.
We recognize that for some users working with more than one camera, configuring and calibrating the system can be a challenge. Our development team is working to simplify this for a more user-friendly motion tracking experience.
Examples include distributed video recording, which helps minimize cable clutter; various approaches to camera synchronization; the ability to control real-time tracking and live streaming to Unity via our Automation Add-on; and other improvements.
AWN: What are the big trends you see in the motion capture industry?
MN: We see opportunities for markerless motion capture primarily in the entertainment/gaming sectors and increasingly in the biomedical/scientific research world. Architectural design firms are also using motion capture, but the majority of our users are professional and semi-professional animators and digital artists.
AWN: What specific workflow benefits does iPi Soft’s Unity and Unreal game engine integration provide to iPi Mocap customers?
MN: One of the most important development improvements we made for the Unity game engine was enabling live streaming from our software into Unity. We’re currently working on delivering the same capability for the Unreal game engine, which should be online by 2021. Live streaming integration with game engines is essential for animators because they need to see their character models in the game environment as quickly as possible to decide whether a scene works or needs to be redone or edited in some way. This eliminates the need for artists to constantly wait for their scenes to render and speeds up the creative workflow.
We also added a preset in iPi Mocap for working with Unreal’s standard bipedal characters. In recent versions of Unreal Engine, the standard skeleton is stable, so now it is as easy as selecting ‘Unreal’ from the menu of available characters and rigs.
AWN: As a huge fan of video games, have you seen anything recently that is particularly worth mentioning?
MN: I am amazed at how indie game developers can create such complex and beautiful games, with great-looking animation, with such small teams. The End of the Sun is one recent example. In their Kickstarter video, the developers explain how they brought animation from iPi Motion Capture software into the Unity game engine and describe their first experience with motion capture.
AWN: Any plans for an iPi Motion Capture app?
MN: We actually had a meeting last summer with a major cell phone manufacturer who, after watching a demonstration of our software, asked us that same question. Unfortunately, right now, the answer is no. At present, markerless motion capture is simply too computationally intensive to exist as an app. A simplified body-tracking app is quite possible, but it would not be accurate enough for animation needs, so I would not call it “motion capture.” But who knows what the future may bring.