IKinema Simplifies VR and Live Mocap Workflow with Launch of Rig Editor in LiveAction v2

Developers can create virtual production setups in Unreal Engine 4 without the use of third-party tools.

GUILDFORD, UK -- IKinema, a leader in real-time inverse kinematics, has announced the second generation of its LiveAction retargeting and solving software for VR and virtual production. LiveAction v2 features an integrated full-body rig editor that allows developers to create content and experiences directly in Unreal Engine 4 without the use of third-party tools.

Designed for live performance capture in the virtual space, IKinema LiveAction is also used in film, TV and game production. The software allows content creators to accurately retarget actors’ movements to animated avatars, regardless of their relative proportions, and to view their interaction in highly rendered virtual environments in real time. It features the same core technology as Action, IKinema’s solution for post production, providing the same degree of accuracy in the live scene. The technology also benefits consumers of VR content, allowing them to animate characters in real time as well as interact with their virtual environment.
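The proportion-independent retargeting described above can be illustrated with a minimal sketch. This is an assumption-laden simplification, not IKinema’s actual solver: local joint rotations transfer between skeletons unchanged, while root translation is rescaled by the actors’ height ratio so stride length matches the target avatar.

```python
# Hypothetical sketch of proportion-independent retargeting.
# NOT IKinema's algorithm; names and structure are illustrative only.

def retarget_pose(source_pose, source_height, target_height):
    """source_pose: dict with 'root_pos' (an (x, y, z) tuple) and
    'rotations' (joint name -> local rotation, e.g. a quaternion tuple).
    Returns the same pose adapted to a skeleton of target_height."""
    scale = target_height / source_height
    return {
        # Scale root translation so stride length fits the target's proportions.
        "root_pos": tuple(c * scale for c in source_pose["root_pos"]),
        # Local joint rotations are proportion-independent: copy them as-is.
        "rotations": dict(source_pose["rotations"]),
    }
```

In practice a full solver also enforces end-effector constraints (hands, feet) after this naive transfer, which is where the inverse-kinematics solving comes in.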

IKinema CEO Alexandre Pechev commented, “Customers tell us that LiveAction outperforms other tools due to the quality of data it delivers directly into the game engine. The result is far truer to the actors’ original performance; the detection of these nuances for realism is what sets us apart. By adding the rig editor, we’ve simplified the workflow, and developers save time by remaining in LiveAction during asset setup. No more jumping between applications; they can choose to do everything within LiveAction for Unreal Engine 4.”

In addition to the rig editor, which is also available in RunTime and RunTime-Indie, LiveAction v2’s real-time features include:

  • automatic correction of sliding feet, locking them to the ground
  • automatic correction to floors and obstacles to eliminate feet and hand penetration
  • noise reduction and correction filters that automatically clean defects in mocap during live streaming
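The filtering and floor-correction features above can be sketched in a few lines. This is a hypothetical illustration of the general techniques (exponential smoothing and ground clamping), not LiveAction’s implementation:

```python
# Illustrative only: a simple mocap clean-up pass combining noise
# smoothing with floor-penetration correction for one joint.

def clean_joint_stream(positions, ground_y=0.0, alpha=0.3):
    """positions: list of (x, y, z) samples for one joint, in stream order.
    Returns smoothed samples clamped so the joint never dips below ground_y."""
    cleaned = []
    prev = positions[0]
    for p in positions:
        # Exponential moving average damps per-frame capture jitter.
        smoothed = tuple(alpha * c + (1 - alpha) * q for c, q in zip(p, prev))
        # Clamp height to the floor plane to prevent foot/hand penetration.
        x, y, z = smoothed
        cleaned.append((x, max(y, ground_y), z))
        prev = smoothed
    return cleaned
```

Foot locking goes one step further: when a foot’s smoothed velocity falls below a threshold, its position is pinned to its last ground contact point until it visibly lifts, which eliminates sliding.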

IKinema at GDC 2016

OptiTrack (Booth 2116) and Xsens (Booth 631) will demonstrate LiveAction at the show.

“We’re very excited about showcasing a live, interactive VR experience at GDC this year, enabled by our market-leading motion capture technology and the latest IKinema character retargeting tools,” remarked OptiTrack Chief Strategy Officer Brian Nilles. “It’s all about being able to retain total accuracy between the real and animated worlds in real time, so that you remain fully immersed in the experience. You can’t miss this.”

Xsens Product Manager Hein Beute concluded, “LiveAction will save our customers a lot of time, as it is plug and play to integrate Xsens MVN with Unreal Engine 4. Real-time applications in live entertainment, virtual production and virtual reality will get an upgrade with LiveAction, as it provides the highest level of realism we have seen in the market.”

LiveAction v2 is available for download from IKinema’s website from Wednesday, 16 March 2016. Based on OpenGL, it runs on Linux, Mac and Windows.

Source: IKinema