I wanted to see if I could get eye tracking on my (old) Vive Pro Eye working with just the OpenXR loader, so without the ViveSR plugin. Just a dummy scene with an XR Rig, plus a sphere that should follow my gaze. This runs in my Update():
// Usings for reference:
// using System.Collections.Generic;
// using UnityEngine;
// using UnityEngine.XR;
// using UnityEngine.XR.OpenXR.Input;  // EyeTrackingUsages ships with the OpenXR Plugin

private static readonly List<InputDevice> InputDeviceList = new List<InputDevice>();
private InputDevice eyeTrackingDevice = default(InputDevice);
private bool wasEyeTrackingValidLastFrame;

private void Update()
{
    if (!eyeTrackingDevice.isValid)
    {
        // No valid device yet (or it got disconnected): try to find one.
        InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking, InputDeviceList);
        if (InputDeviceList.Count > 0)
        {
            eyeTrackingDevice = InputDeviceList[0];
        }
        else
        {
            Debug.Log("No eye trackers found!");
            return;
        }
    }
    else
    {
        wasEyeTrackingValidLastFrame = true;
        bool hasData = eyeTrackingDevice.TryGetFeatureValue(CommonUsages.isTracked, out bool isTracked);
        hasData &= eyeTrackingDevice.TryGetFeatureValue(EyeTrackingUsages.gazePosition, out Vector3 position);
        hasData &= eyeTrackingDevice.TryGetFeatureValue(EyeTrackingUsages.gazeRotation, out Quaternion rotation);
        if (isTracked && hasData)
        {
            // Place the sphere one unit in front of the mid-eye point, along the gaze direction.
            transform.localPosition = position + (rotation * Vector3.forward);
            transform.localRotation = rotation;
        }
    }
}
It finds the tracker and returns data that looks sort of okay, but is off: the little sphere matches my eye movements, but it's offset and the range of angles is too small.
Now, as I understand it, position is the mid-eye position as seen by the tracker and rotation is the combined look direction of both eyes. Combining the two should give a ray starting at the mid-eye point and pointing in the gaze direction.
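For what it's worth, this is how I picture that combination in code. A minimal sketch, assuming the gaze pose is reported in the rig's tracking space; the xrOrigin field is my own naming for a reference to the XR Rig's root transform, not something from the snippet above:

// Sketch: turn the device-space gaze pose into a world-space ray.
// Assumes the pose is in the XR rig's tracking space.
[SerializeField] private Transform xrOrigin; // hypothetical reference to the rig root

private Ray GazeRayInWorldSpace(Vector3 gazePosition, Quaternion gazeRotation)
{
    Vector3 origin = xrOrigin.TransformPoint(gazePosition);                               // mid-eye point in world space
    Vector3 direction = xrOrigin.TransformDirection(gazeRotation * Vector3.forward);      // combined look direction
    return new Ray(origin, direction);
}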
I've also tried adding a Physics.Raycast, but looking at those results I see it is also off.
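The raycast test was roughly this (a sketch using the GazeRayInWorldSpace helper from above; the max distance and the debug drawing are just for testing):

// Sketch of the raycast test: cast along the world-space gaze ray and mark the hit.
private void TestGazeRaycast(Vector3 gazePosition, Quaternion gazeRotation)
{
    Ray ray = GazeRayInWorldSpace(gazePosition, gazeRotation);
    if (Physics.Raycast(ray, out RaycastHit hit, 100f))
    {
        Debug.DrawLine(ray.origin, hit.point, Color.green); // visualize where the gaze lands
    }
}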
Anyone know why it does work but the results are … off?