We have just announced the release of the latest version of ARFoundation that works with Unity 2018.3.
For this release, we have also updated the ARFoundation samples project with examples that show off some of the new features. This post takes a closer look at the Face Tracking feature demonstrated in those samples.
The Face Tracking example scenes will currently only work on iOS devices that have a TrueDepth camera: iPhone X, XR, XS, XS Max, or iPad Pro (2018).
The Face Tracking feature is exposed through the Face Subsystem, like other subsystems: you can subscribe to an event that informs you when a new face has been detected, and that event provides basic information about the face, including its pose (position and rotation) and its TrackableId (a session-unique ID for any tracked object in the system). Using the TrackableId, you can then query the subsystem for more information about that particular face, including a description of its mesh and the blendshape coefficients that describe the expression on that face. There are also events you can subscribe to that tell you when a face has been updated or removed.
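To make the TrackableId idea concrete, here is a minimal sketch of keeping your own per-face state keyed by TrackableId. The handler methods are meant to be wired up to whichever added/updated/removed events your package version exposes (the exact event signatures have varied between preview releases), so only the TrackableId and Pose usage is shown:

```csharp
// A minimal sketch of per-face bookkeeping keyed by TrackableId.
// The three handler methods are our own; wire them to whatever
// added/updated/removed events your face subsystem version exposes.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.XR; // TrackableId in the 2018.3-era XR API

public class FaceRegistry : MonoBehaviour
{
    // One entry per face currently being tracked.
    readonly Dictionary<TrackableId, Pose> m_FacePoses = new Dictionary<TrackableId, Pose>();

    public void OnFaceAdded(TrackableId faceId, Pose pose)
    {
        m_FacePoses[faceId] = pose;
        Debug.LogFormat("Face {0} detected at {1}", faceId, pose.position);
    }

    public void OnFaceUpdated(TrackableId faceId, Pose pose)
    {
        // The same TrackableId is reported for the lifetime of the face,
        // so it can be used to update the entry created when the face was added.
        m_FacePoses[faceId] = pose;
    }

    public void OnFaceRemoved(TrackableId faceId)
    {
        m_FacePoses.Remove(faceId);
    }
}
```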
ARFoundation, as usual, provides some abstraction via the ARFaceManager component. You add this component to an ARSessionOrigin in the scene, and when it detects a face it creates a copy of the “FacePrefab” prefab and adds it to the scene as a child of the ARSessionOrigin. It also updates and removes the generated face GameObjects as the underlying face is updated or removed.
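If you prefer to configure this from script rather than in the Inspector, a minimal sketch could look like the following. It assumes the manager exposes the prefab through a facePrefab property (the serialized “Face Prefab” field); double-check the property name against your ARFoundation version:

```csharp
// A minimal sketch of setting up face tracking from script instead of the Inspector.
// Assumes ARFaceManager exposes its prefab via a "facePrefab" property.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ARSessionOrigin))]
public class FaceTrackingSetup : MonoBehaviour
{
    // The prefab with an ARFace component at its root (e.g. FacePrefab from the samples).
    public GameObject facePrefab;

    void Awake()
    {
        // Add the manager to the same GameObject as the ARSessionOrigin so that
        // the generated face GameObjects are parented correctly.
        var faceManager = gameObject.AddComponent<ARFaceManager>();
        faceManager.facePrefab = facePrefab;
    }
}
```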
You usually place an ARFace component on the root of the “FacePrefab” prefab above so that the generated GameObject can automatically update its position and orientation according to the data that is provided by the underlying subsystem.
You can see how this works in the FaceScene in ARFoundation-samples. You first create a prefab named FacePrefab that has an ARFace component on the root of its GameObject hierarchy:
You then plug this prefab reference into the ARFaceManager component that you have added to the ARSessionOrigin:
You will also need to check the “ARKit Face Tracking” checkbox in the ARKitSettings asset as explained here.
When you build this to a device that supports ARKit Face Tracking and run it, you should see the three colored axes from FacePrefab appear on your face, tracking its position and orientation as you move.
You can also make use of the face mesh in your AR app by getting the information from the ARFace component. Open up FaceMeshScene and look at the ARSessionOrigin. It is set up in the same way as before, just with a different prefab referenced by the ARFaceManager. You can take a look at the FaceMeshPrefab:
You can see that the MeshFilter on the prefab is actually empty. This is because the ARFaceMeshVisualizer script component creates a mesh and fills in its vertices and texture coordinates based on the data it gets from the ARFace component. You will also notice that the MeshRenderer contains the material this mesh will be rendered with, in this case a texture with three colored vertical bands. You can change this material to create masks, face paint, and so on. If you build this scene to a device, you should see your face rendered with the specified material.
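As a rough illustration of what the visualizer does, here is a stripped-down sketch of the mesh-filling step. How the vertex, index, and UV lists are read out of the ARFace component differs between package versions, so this sketch simply takes them as parameters; see the actual ARFaceMeshVisualizer script in ARFoundation-samples for the real implementation:

```csharp
// A simplified sketch of the mesh-filling step: the MeshFilter starts empty and
// is given a Mesh built at runtime from the face geometry. The vertex, index,
// and UV lists are passed in here; obtaining them from the ARFace component is
// version-specific and handled by ARFaceMeshVisualizer in the samples.
using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class SimpleFaceMeshBuilder : MonoBehaviour
{
    Mesh m_Mesh;

    void Awake()
    {
        // Create an empty mesh once and hand it to the MeshFilter; the
        // MeshRenderer's material (e.g. the striped texture) is applied to it.
        m_Mesh = new Mesh();
        GetComponent<MeshFilter>().sharedMesh = m_Mesh;
    }

    // Call this whenever the face data changes, e.g. from a face-updated event.
    public void UpdateFaceMesh(List<Vector3> vertices, List<int> indices, List<Vector2> uvs)
    {
        m_Mesh.Clear();
        m_Mesh.SetVertices(vertices);
        m_Mesh.SetTriangles(indices, 0);
        m_Mesh.SetUVs(0, uvs);
        m_Mesh.RecalculateNormals();
    }
}
```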
You can also make use of the blendshape coefficients that are output by ARKit Face Tracking. Open up the FaceBlendShapeScene and check out the ARSessionOrigin. It looks the same as before, except that the prefab it references is now the FaceBlendShapes prefab. You can take a look at that prefab:
In this case, there is an ARFaceARKitBlendShapeVisualizer, which references the GameObject with the SkinnedMeshRenderer in this prefab so that it can manipulate the blendshapes that exist on that component. This component gets the blendshape coefficients from the ARKit SDK and manipulates the blendshapes on our sloth asset so that it replicates your expression. Build it out to a supported device and try it out!
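As a rough sketch of that last step, the following shows how blendshape coefficients can be applied to a SkinnedMeshRenderer. How the coefficients are fetched from ARKit is version-specific (that is what the ARKit-specific visualizer handles), so this sketch takes them as a generic name-to-value mapping, and the blendshape names on the rig are placeholders:

```csharp
// A simplified sketch of driving a SkinnedMeshRenderer from blendshape
// coefficients. Fetching the coefficients from ARKit is not shown; they are
// passed in as a name-to-value mapping (e.g. "jawOpen" -> 0.7).
using System.Collections.Generic;
using UnityEngine;

public class SimpleBlendShapeDriver : MonoBehaviour
{
    // The renderer on the rigged face model (the sloth in the sample prefab).
    public SkinnedMeshRenderer skinnedMeshRenderer;

    // ARKit reports coefficients in the 0..1 range; Unity blendshape weights are 0..100.
    const float k_CoefficientScale = 100f;

    public void ApplyCoefficients(Dictionary<string, float> coefficients)
    {
        var mesh = skinnedMeshRenderer.sharedMesh;
        foreach (var pair in coefficients)
        {
            // Look up the blendshape on the rig with the same name as the
            // coefficient; skip any coefficient the rig does not have.
            int index = mesh.GetBlendShapeIndex(pair.Key);
            if (index < 0)
                continue;

            skinnedMeshRenderer.SetBlendShapeWeight(index, pair.Value * k_CoefficientScale);
        }
    }
}
```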
For more detailed coverage of the face tracking support, have a look at the docs.
This was a summary of how to create your own face tracking experiences with the new ARFoundation release. Please take some time to try it out and let us know how it works out! As always, I’m happy to post any cool videos, demos, or apps you create on x.com.