Kite & Lightning, a cinematic VR company, has shared a real-time motion capture test built with the help of Unreal Engine, IKINEMA, Xsens, and an iPhone X. The striking part is that the real-time result looks like pre-rendered footage. Check out the video, and catch the team at this year’s SIGGRAPH to learn more.
We’ve gone REALTIME! By adding a few new amazing ingredients to the mix… Unreal Engine and IKINEMA LiveAction!
It’s crazy how this setup, which is capturing and rendering in real time, looks damn close to my previous tests rendered in V-Ray! Hats off to Epic & collaborators for letting all of us pillage their Digital Human scene (a ton of bleeding-edge work on their part), which still freaks me out to light and render at this level in real time! It’s kind of a lifelong dream come true.
More hats off to Joezak Steel (aka Joe & Zak) at Epic for helping me with the ARKit facial capture stuff!
I’m over the moon with how cool this setup has become and excited to show off more of the details live at SIGGRAPH Real-Time Live! in August.
Kite & Lightning
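For context on the iPhone X side of a setup like this (Kite & Lightning have not published their actual pipeline): ARKit’s face tracking exposes a set of per-frame blend shape coefficients, and a common approach is to stream those values over the network to the machine running the engine, where they drive the character’s facial animation curves. The sketch below is a minimal, hypothetical Swift example; the workstation address, port, and packet format are assumptions for illustration only.

```swift
import ARKit
import Network

/// Minimal sketch: capture ARKit face-tracking blend shapes on an iPhone X
/// and stream a few of them over UDP to a workstation running the engine.
/// The host, port, and CSV packet format are illustrative assumptions.
final class FaceCaptureStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let connection = NWConnection(
        host: "192.168.1.50",   // hypothetical workstation IP
        port: 49152,            // hypothetical port
        using: .udp
    )

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera (iPhone X or later).")
            return
        }
        connection.start(queue: .global())
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates its anchors (every tracked frame).
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let shapes = face.blendShapes   // [BlendShapeLocation: NSNumber], values 0...1

        // Pack a handful of coefficients into a simple CSV line (assumed format).
        let jawOpen = shapes[.jawOpen]?.floatValue ?? 0
        let blinkL  = shapes[.eyeBlinkLeft]?.floatValue ?? 0
        let blinkR  = shapes[.eyeBlinkRight]?.floatValue ?? 0
        let browUp  = shapes[.browInnerUp]?.floatValue ?? 0
        let packet  = "\(jawOpen),\(blinkL),\(blinkR),\(browUp)\n"

        connection.send(content: packet.data(using: .utf8),
                        completion: .contentProcessed({ _ in }))
    }
}
```

On the receiving end, a matching listener inside the engine would map those coefficients onto the facial rig; the exact bridge used in this demo has not been detailed publicly.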