Example data for anchorData.faceGeometry

I don’t own an iPhone X yet, but I would like to do some initial research.

Can anyone give me an example of the data stored in anchorData.faceGeometry, which is based on the public struct UnityARFaceGeometry in ARFaceAnchor.cs?

According to the documentation, vertexCount, textureCoordinateCount, textureCoordinates, triangleCount and triangleIndices are always the same; only the values in vertices change. However, the documentation doesn’t give an example of the data.

Inside UnityARFaceGeometry:
What are the default values for vertexCount, triangleCount and triangleIndices that ARKit passes on to Unity?
And what values does vertices contain for a neutral face?
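
To make the layout of that struct concrete, here is a minimal sketch of how those fields would typically be fed into a Unity Mesh (roughly what UnityARFaceMeshManager.cs does). Based on my reading of the plugin source, vertices is a Vector3[], textureCoordinates a Vector2[] and triangleIndices an int[]; the namespace and event name below may differ between plugin versions, so treat those as assumptions:

using UnityEngine;
using UnityEngine.XR.iOS; // plugin namespace; may differ between plugin versions

// Sketch only: builds a renderable mesh from UnityARFaceGeometry.
// Assumes vertices is Vector3[], textureCoordinates is Vector2[] and
// triangleIndices is int[] with three indices per triangle.
public class FaceMeshSketch : MonoBehaviour
{
    Mesh faceMesh;

    void Start()
    {
        faceMesh = new Mesh();
        GetComponent<MeshFilter>().mesh = faceMesh;
        UnityARSessionNativeInterface.ARFaceAnchorAddedEvent += FaceAdded;
    }

    void FaceAdded(ARFaceAnchor anchorData)
    {
        faceMesh.vertices = anchorData.faceGeometry.vertices;          // vertexCount entries
        faceMesh.uv = anchorData.faceGeometry.textureCoordinates;      // textureCoordinateCount entries
        faceMesh.triangles = anchorData.faceGeometry.triangleIndices;  // 3 * triangleCount entries
        faceMesh.RecalculateNormals();
    }
}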

P.S. Related question: ARKit function for blendShape --> faceGeometry to modify face mesh

Could someone do the following for me?

Under UnityARKitPlugin/Examples/FaceTracking/, open FaceBlendshapeScene.
In the same folder, open UnityARFaceMeshManager.cs.

Put the following in void FaceAdded (ARFaceAnchor anchorData) (around line 51):
Debug.Log(anchorData.faceGeometry);
// Debug.Log prints only an array's type name, so join the elements to see the actual data:
Debug.Log(string.Join(", ", System.Array.ConvertAll(anchorData.faceGeometry.vertices, v => v.ToString())));
Debug.Log(anchorData.faceGeometry.vertexCount);
Debug.Log(string.Join(", ", System.Array.ConvertAll(anchorData.faceGeometry.textureCoordinates, t => t.ToString())));
Debug.Log(anchorData.faceGeometry.textureCoordinateCount);
Debug.Log(string.Join(", ", System.Array.ConvertAll(anchorData.faceGeometry.triangleIndices, i => i.ToString())));
Debug.Log(anchorData.faceGeometry.triangleCount);
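
One caveat: the console may truncate a very long line, so if the vertex dump gets cut off, writing it to a file is an alternative. This is just a sketch; the file name is made up, and you can pull the file off the device afterwards via Xcode or iTunes file sharing:

// Optional fallback inside FaceAdded: write the vertex dump to a file instead.
string dump = string.Join("\n", System.Array.ConvertAll(anchorData.faceGeometry.vertices, v => v.ToString("F4")));
System.IO.File.WriteAllText(System.IO.Path.Combine(Application.persistentDataPath, "faceGeometry.txt"), dump);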

Run the scene and detect your face with the iPhone X.

And please paste the console output here. I would be really grateful for that.


And if you’re willing to also do the following and post the output here, I would be even more grateful:
Open BlendshapePrinter.cs (same folder) and put the following code under void FaceAdded (ARFaceAnchor anchorData) (around line 53):
foreach (var blendShape in anchorData.blendShapes) // logging the dictionary itself would only print its type name
    Debug.Log(blendShape.Key + ": " + blendShape.Value);

Run the scene, detect your face with the iPhone X, and paste the console output here. Thank you.
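
P.S. Regarding the related blendShape --> faceGeometry thread above: for anyone experimenting, here is a rough, untested sketch of how those coefficients could drive a rig's blend shapes. It assumes the rig's blend shape names match ARKit's key strings, which is purely an assumption on my part:

// Hypothetical, inside FaceAdded(ARFaceAnchor anchorData):
// map the ARKit coefficients onto a SkinnedMeshRenderer on this GameObject.
var smr = GetComponent<SkinnedMeshRenderer>();
foreach (var blendShape in anchorData.blendShapes)
{
    int index = smr.sharedMesh.GetBlendShapeIndex(blendShape.Key);
    if (index >= 0)
        smr.SetBlendShapeWeight(index, blendShape.Value * 100f); // ARKit coefficients are 0-1, Unity weights are 0-100
}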