It turns out you can use the iPhone X to drive facial animation in Unity.
The developers managed to stream blend shape parameters live from an iPhone X into Unity to control an animation rig.
The demo consists of two parts: the iOS app and the Unity extension host.
You can get it here.
The iOS app streams the blend shapes Apple provides in ARFaceAnchor.blendShapes to the Unity host over a UDP socket, essentially emitting a stream of messages, each carrying 50 blend shapes in the format ‘blend-shape-name:blend-shape-value’.
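The wire format can be sketched roughly as follows. This is an illustration in Python rather than the app's actual Swift code, and beyond the ‘name:value’ pairs the post describes, details like the newline separator and float precision are assumptions:

```python
import socket

def encode_blend_shapes(shapes: dict) -> bytes:
    """Serialize blend shapes as newline-separated 'name:value' pairs.

    The 'name:value' format comes from the demo; the newline separator
    and 4-decimal precision are assumptions made for illustration.
    """
    return "\n".join(
        f"{name}:{value:.4f}" for name, value in shapes.items()
    ).encode("utf-8")

def send_frame(sock: socket.socket, shapes: dict,
               addr: tuple = ("127.0.0.1", 9000)) -> None:
    # One UDP datagram per frame keeps things simple: a dropped packet
    # only loses a single frame of animation, which is acceptable here.
    sock.sendto(encode_blend_shapes(shapes), addr)

# Example frame with two of ARKit's blend shape coefficients (0..1 range).
frame = {"jawOpen": 0.42, "eyeBlinkLeft": 0.05}
print(encode_blend_shapes(frame).decode())
```

Sending every frame as an independent datagram is a reasonable fit for UDP: there is no need for retransmission, since stale animation data is worthless by the time it would arrive.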
There are plenty of performance improvements to be made here, but it works for the purposes of a demo.
You can get the iOS app here.
Inside the Unity host, an extension opens a UDP socket to listen for the iPhone’s messages. When a message arrives, it applies each blend shape value to the corresponding blend shape on the rig.
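The receiving side can be sketched like this. The real extension is C# running inside Unity; this Python version only illustrates the parsing and dispatch logic, and `apply_blend_shape` is a hypothetical stand-in for the rig update:

```python
import socket

def parse_message(data: bytes) -> dict:
    """Parse a datagram of 'name:value' pairs into a name -> float map.

    The newline separator is an assumption; malformed lines are skipped
    so a corrupt packet can't crash the listener loop.
    """
    shapes = {}
    for line in data.decode("utf-8", errors="ignore").splitlines():
        name, sep, value = line.partition(":")
        if sep and name:
            try:
                shapes[name] = float(value)
            except ValueError:
                continue  # skip unparsable values
    return shapes

def apply_blend_shape(name: str, value: float) -> None:
    # Hypothetical rig hook: in the real extension this would set the
    # weight of the matching blend shape on the character's mesh.
    pass

def listen(port: int = 9000) -> None:
    # Bind a UDP socket and apply each frame's values as they arrive.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _addr = sock.recvfrom(4096)
        for name, value in parse_message(data).items():
            apply_blend_shape(name, value)
```

One detail the real extension likely has to handle: Unity's SkinnedMeshRenderer blend shape weights are conventionally expressed on a 0–100 scale, while ARKit's coefficients are 0–1, so a scale factor is presumably applied when the values are written to the rig.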
You can learn more about the project here.