I’m doing motion capture in my app using ARKit. The blendshapes work fine, but I also want to read the head rotation relative to the camera. Currently, I do this by just reading out the rotation of the ARFace GameObject after one has been detected/created:
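Roughly, what I do now looks like this (a sketch with illustrative names; the `FaceRotationReader` class and the wiring that assigns `face` when `ARFaceManager` detects one are omitted/simplified):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class FaceRotationReader : MonoBehaviour {
    [SerializeField] ARFace face = null; // assigned once a face is detected

    void Update() {
        if (face == null) return;
        // World-space rotation of the face anchor. On ARKit this
        // includes the session-space rotation, which is my problem.
        Vector3 headAngles = face.transform.rotation.eulerAngles;
        Debug.Log($"Head angles: {headAngles}");
    }
}
```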

In ARCore, this works fine and as expected. In ARKit, however, the ARFace rotation also includes the session-space rotation. In other words, when I hold the iPhone in front of my face and spin in place, the face y-angle (yaw) spins with me. I’m pointing the phone straight at my face the whole time, so I’d expect all three angles to stay at or near zero.

I’ve tried calculating angles between the camera GameObject and ARFace, but nothing seems to work so far.
Has anyone faced this issue before or can point me in the right direction? Any help would be greatly appreciated!

If needed, I can provide some videos to illustrate the issue.

Here is a quick example I wrote with the help of my plugin. It calculates the face rotation relative to the camera. I would suggest doing the math in quaternions rather than in Euler angles, because Euler angles don’t behave meaningfully near the 0/360 boundary: for example, -10 degrees of rotation is reported as 350 degrees.

Please attach the script to the face prefab and populate the references in the Inspector.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class FaceAngleRelativeToCamera : MonoBehaviour {
    [SerializeField] ARFace face = null;
    [SerializeField] ARSessionOrigin origin = null;

    [Header("Debug")]
    [SerializeField] Vector3 eulerAnglesRelativeToCamera;

    void Update() {
        // This quaternion represents the face rotation relative to the camera.
        var rotationRelativeToCamera = Quaternion.Inverse(origin.camera.transform.rotation) * face.transform.rotation;
        // Converting to Euler angles only for display; using Euler angles for math is almost always a bad idea.
        eulerAnglesRelativeToCamera = rotationRelativeToCamera.eulerAngles;
        Debug.Log($"eulerAnglesRelativeToCamera: {eulerAnglesRelativeToCamera}");
    }
}

Calculating the rotation difference and converting it to Euler angles afterwards is something I had already tried.
It behaves a little strangely, but I’m not too deep into quaternion math, so maybe you can make sense of it.

Basically, after converting the difference to Euler angles, only the y-value (yaw, rotating the head left/right) was always as expected. The x-value (pitch, head up/down rotation) and z-value (roll, leaning the head to the sides) seem to be linked somehow. For example, when I lean my head to the side and then rotate it down, the z-value changes strongly as well. No such thing happens for y; that component always seems to be independent, which is exactly what I want/need.

What I currently do is this:

// Get the current face angles. (Unity has no QuaternionToEuler();
// the Quaternion.eulerAngles property is the right call.)
var faceAngle = faceTransform.rotation.eulerAngles;
// For the y-component, use the angle difference between camera and face.
// Note: this multiplies in the opposite order (face * camera⁻¹) from your
// snippet above (camera⁻¹ * face); quaternion multiplication is not
// commutative, so the two give different results.
Quaternion differenceRotation = faceTransform.rotation * Quaternion.Inverse(cameraTransform.rotation);
Vector3 differenceAngles = differenceRotation.eulerAngles;
faceAngle.y = differenceAngles.y;
Taking the x and z components from the regular face angles is fine, as they don’t seem to be affected by me walking around or spinning in place with the phone. Only the y-component has that problem, so taking just the y-component from the difference quaternion is a good enough solution for me.
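Put together as a component, the hybrid approach I’m converging on looks roughly like this (a sketch only, reusing the face/origin references from your script above and its camera⁻¹ * face multiplication order; I haven’t verified it beyond the behavior I described):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class HybridFaceAngles : MonoBehaviour {
    [SerializeField] ARFace face = null;
    [SerializeField] ARSessionOrigin origin = null;

    void Update() {
        Transform cam = origin.camera.transform;
        // x (pitch) and z (roll) straight from the face anchor...
        Vector3 angles = face.transform.rotation.eulerAngles;
        // ...but y (yaw) from the camera-relative rotation, so that
        // spinning in place with the phone doesn't drag the yaw along.
        Quaternion relative = Quaternion.Inverse(cam.rotation) * face.transform.rotation;
        angles.y = relative.eulerAngles.y;
        Debug.Log($"Hybrid face angles: {angles}");
    }
}
```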

Does this make sense to you at all? I’ll be playing around a bit more with it this weekend.

Also, your plugin looks amazing, I’ll definitely take a look. Wish I knew about that a couple months ago, would’ve saved me a lot of time.