Handling Facial Animation in Unity with iPhone X

It turns out you can use the iPhone X to drive facial animation in Unity. Simeon Saëns and John Smith state that the ARKit face APIs are powerful enough to produce useful animation, and that it can be captured in real time.

Face Demo

The developers managed to stream blend shape parameters live from an iPhone X into Unity to control an animation rig.

How does it work?

The demo consists of two parts: the iOS app and the Unity extension host.

iOS App

You can get it here.

The iOS app streams the blend shapes Apple provides in ARFaceAnchor.blendShapes to the Unity host through a UDP socket. Essentially, it emits a stream of messages, each carrying 50 blend shapes in the format ‘blend-shape-name:blend-shape-value’.
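As a rough illustration of that sender, here is a minimal Swift sketch using the Network framework. The class name, host address, port, and the exact wire format are assumptions based on the description above, not the project’s actual code:

```swift
import ARKit
import Network

// Hypothetical sketch of the sender: serialize ARKit blend shapes and push
// them to the Unity host as UDP datagrams. The host, port, and the
// "name:value" line format are assumptions, not the project's real code.
final class BlendShapeStreamer {
    private let connection: NWConnection

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        connection.start(queue: .global())
    }

    // Call once per ARKit frame with the face anchor's current blend shapes.
    func send(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        // One "blend-shape-name:blend-shape-value" pair per line,
        // joined into a single datagram.
        let message = blendShapes
            .map { "\($0.key.rawValue):\($0.value.floatValue)" }
            .joined(separator: "\n")
        connection.send(content: message.data(using: .utf8),
                        completion: .contentProcessed { _ in })
    }
}
```

UDP is a natural transport for this: if a datagram is dropped, the next frame’s values simply supersede it, so there is little point paying for retransmission.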

Live Demo

There are plenty of performance improvements to be made here, but it works for the purposes of a demo.

Unity Extension Host

You can get it here.

Inside the Unity host, we have an extension which opens a UDP socket to listen for the iPhone’s messages. When it receives a message, it applies each blend shape value to the corresponding blend shape on the rig.
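The actual extension is written in C# against the Unity API, but the receive-and-parse step is simple enough to sketch. The Swift version below mirrors the sender above purely for illustration; the port and message format are the same assumptions:

```swift
import Network

// Sketch of the host side. The real project implements this as a C# Unity
// extension; this Swift version only illustrates the receive-and-parse logic.
// The port and "name:value" line format are assumptions, mirroring the sender.
final class BlendShapeReceiver {
    private let listener: NWListener
    private let apply: (String, Float) -> Void

    init(port: UInt16, apply: @escaping (String, Float) -> Void) throws {
        self.apply = apply
        listener = try NWListener(using: .udp,
                                  on: NWEndpoint.Port(rawValue: port)!)
        listener.newConnectionHandler = { [weak self] connection in
            connection.start(queue: .global())
            self?.receive(on: connection)
        }
        listener.start(queue: .global())
    }

    private func receive(on connection: NWConnection) {
        connection.receiveMessage { [weak self] data, _, _, _ in
            if let data = data,
               let message = String(data: data, encoding: .utf8) {
                // Each line is one "blend-shape-name:blend-shape-value" pair.
                for line in message.split(separator: "\n") {
                    let parts = line.split(separator: ":")
                    if parts.count == 2, let value = Float(parts[1]) {
                        self?.apply(String(parts[0]), value)
                    }
                }
            }
            self?.receive(on: connection) // keep listening for the next datagram
        }
    }
}
```

The apply callback is where a real host would look up the matching blend shape on the rig and set its weight.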

How to run the project

  • Clone and open the Unity project from here.
  • Run the Unity project’s scene.
  • In the menu bar, select iPhoneMoCap -> MeshPreview.
  • Enable Mesh preview.
  • Grab the iOS app project from here.
  • Make sure your iPhone X is connected to the same Wi-Fi network, then build and run the application (don’t forget to run pod install).
  • The application should discover the Unity host and begin streaming the motion data.

Simeon Saëns and John Smith 

You can learn more about the project here.
