IKinema Showcases Voice-Activated Animation for VR at SIGGRAPH 2016

INTiMATE technology demo shows animated creatures responding to voice commands in Virtual Reality and Augmented Reality environments.

FARNHAM, UK -- IKinema, a leader in real-time inverse kinematics, revealed at SIGGRAPH 2016 a technology demo showing virtual characters responding in real time to voice commands. The company is known in the film and games industries for its high-quality motion-capture solving technology and realistic procedural animation capabilities. However, IKinema chief executive Alexandre Pechev sees a mass-market opportunity for voice-activated animation in virtual reality (VR) and augmented reality (AR) environments.

“Imagine a world where we can interact with virtual characters by talking to them, just like in the real world,” said Pechev. “Real and virtual worlds are going to merge. Virtual characters will be part of everyday life. But to be credible, the virtual world needs to behave like the real world. At IKinema we are firm believers that animation needs to be procedurally generated in a very realistic way so that it is indistinguishable from the real world. Going 100% procedural is a difficult task, but a natural solution is to combine real motion with procedural adaptation, all driven by voice commands. This is project INTiMATE.”

For VR applications, the technology allows the user to interact with virtual characters by speaking to them, whether in-game, for training or simulation purposes, for virtual production or storyboarding, or as part of a VR experience.

For AR, virtual characters and worlds are superimposed on the real world. INTiMATE allows the user to communicate with, and demand actions from, those virtual characters – whether virtual pets, board game heroes, personal trainers, virtual friends or more.

Project INTiMATE – linking motion to words

Project INTiMATE takes a library of motions and converts them into a runtime rig. These motions are then linked to natural-language words so that the transition from one animation to another can be triggered by simple voice commands. Procedural blending and adaptation are then added so that the base animations are customized to the surrounding environment, with the result that everything looks as natural and believable as the real world.
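
To make the idea concrete, the minimal Python sketch below shows how voice commands might be linked to a motion library and trigger blended transitions. The clip names, command vocabulary, and crossfade step are illustrative assumptions only and do not represent IKinema's actual implementation.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Clip:
        name: str
        duration: float  # seconds

    # A small "motion library" keyed by the natural-language word each clip is linked to.
    MOTION_LIBRARY = {
        "sit": Clip("sit_idle", 2.0),
        "walk": Clip("walk_cycle", 1.2),
        "jump": Clip("jump_up", 0.8),
    }

    class VoiceDrivenRig:
        """Runtime rig that plays one base clip at a time and crossfades on transitions."""

        def __init__(self, blend_time: float = 0.3):
            self.current: Optional[Clip] = None
            self.blend_time = blend_time

        def handle_command(self, utterance: str) -> None:
            # Naive keyword spotting stands in for real natural-language processing.
            for word, clip in MOTION_LIBRARY.items():
                if word in utterance.lower():
                    self._transition_to(clip)
                    return
            print(f"No known motion for: {utterance!r}")

        def _transition_to(self, clip: Clip) -> None:
            if self.current is None:
                print(f"Playing {clip.name}")
            else:
                # A production system would add procedural blending and adaptation to
                # the surrounding environment here; this sketch only reports a crossfade.
                print(f"Crossfading {self.current.name} -> {clip.name} over {self.blend_time}s")
            self.current = clip

    rig = VoiceDrivenRig()
    rig.handle_command("Please walk over here")
    rig.handle_command("Now jump!")

In this toy version, recognized keywords simply select a base clip; the described system layers procedural adaptation on top so the resulting motion fits the character's surroundings.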

The technology concept was first shown a year ago. Since then, with the backing of the UK government’s Innovate UK program, Project INTiMATE has become a reality, with the demo showing highly realistic animated creatures responding to simple voice commands, as well as interacting with the environment.

The technology is expected to be commercially available later this year. IKinema is actively seeking partners to apply the technology across a variety of sectors and aims to make an SDK available for integration with any animation package.

Source: IKinema