In my application, I want to use eye-tracking to determine whether a user is looking at a certain GameObject. From the eye-tracker I get positions in "normalized screen space", which is comparable to viewport positions in Unity: a vector in the range [0, 0] to [1, 1].
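For context, the check I want to end up with looks roughly like this; GazeCheck, GetGazePosition and the threshold are just placeholders for my tracker's API, not real code:

using UnityEngine;

public class GazeCheck : MonoBehaviour
{
    // Placeholder for the eye-tracker API: returns the gaze position in
    // normalized screen space, (0, 0) bottom-left to (1, 1) top-right.
    Vector2 GetGazePosition() { /* tracker-specific */ return Vector2.zero; }

    // True if the gaze is close enough to the target's viewport position.
    bool IsLookedAt(Transform target, float threshold = 0.05f)
    {
        Vector3 vp = Camera.main.WorldToViewportPoint(target.position);
        if (vp.z <= 0f) return false; // target is behind the camera
        return Vector2.Distance(GetGazePosition(), new Vector2(vp.x, vp.y)) < threshold;
    }
}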
To determine the position of the GameObject on the screen, I use Camera.WorldToViewportPoint. This works fine as long as VR is disabled. As soon as I enable VR, the calculated viewport position is wrong (except when the GameObject is in the center); it looks like a nonlinear distortion.
Do I need to apply another conversion after WorldToViewportPoint?
Here is an example that demonstrates the issue: it displays the viewport position and overlays a GUI element at the screen position. I modified the code from this question and attached it to the cube:
void OnGUI()
{
    // Viewport position: (0, 0) bottom-left to (1, 1) top-right.
    Vector3 viewportPos = Camera.main.WorldToViewportPoint(transform.position);
    // Screen position in pixels, origin at the bottom-left.
    Vector3 screenPos = Camera.main.WorldToScreenPoint(transform.position);
    GUI.Label(new Rect(30, 30, 100, 30), viewportPos.ToString());
    // GUI coordinates have their origin at the top-left, so flip the y-axis.
    GUI.Box(new Rect(screenPos.x - 10, Screen.height - (screenPos.y + 10), 20, 20), "X");
}
The correct result (VR disabled):
The incorrect result (VR enabled):
As you can see, no GUI element is displayed (the calculated screen position is out of bounds), and the calculated viewport position is wrong as well. I understand that the screen positions can be wrong if the pixel values are calculated for the display in the HMD and then used to position GUI objects on the mirror window on the monitor, which has different dimensions than the HMD.
But since viewport positions are normalized, shouldn't they still be correct, or am I missing something?
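To see whether some extra transformation is applied in VR, I thought about projecting the point manually through the per-eye matrices and comparing the result against WorldToViewportPoint. A rough, untested sketch (assuming GetStereoViewMatrix/GetStereoProjectionMatrix return the matrices actually used for rendering):

using UnityEngine;

public static class StereoProjection
{
    // Manually project a world point through the left eye's matrices and
    // remap from NDC [-1, 1] to viewport [0, 1], mimicking WorldToViewportPoint.
    public static Vector3 LeftEyeViewportPoint(Camera cam, Vector3 worldPos)
    {
        Matrix4x4 view = cam.GetStereoViewMatrix(Camera.StereoscopicEye.Left);
        Matrix4x4 proj = cam.GetStereoProjectionMatrix(Camera.StereoscopicEye.Left);
        Vector4 clip = proj * view * new Vector4(worldPos.x, worldPos.y, worldPos.z, 1f);
        Vector3 ndc = clip / clip.w; // perspective divide
        return new Vector3(ndc.x * 0.5f + 0.5f, ndc.y * 0.5f + 0.5f, clip.w);
    }
}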
Many thanks!
For VR, I tried both the standard Unity Camera and the HTC Vive Camera Prefab; the effect is the same with both.
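In case it is relevant: I also noticed that WorldToViewportPoint has a per-eye overload (taking a Camera.MonoOrStereoscopicEye, available since around Unity 2017.2 as far as I know), so comparing the variants might narrow it down. Another untested sketch:

using UnityEngine;

public class ViewportComparison : MonoBehaviour
{
    // Log the default result next to the explicit per-eye results,
    // to see which one the plain overload corresponds to in VR.
    void Update()
    {
        Camera cam = Camera.main;
        Debug.Log("Default: " + cam.WorldToViewportPoint(transform.position));
        Debug.Log("Left:    " + cam.WorldToViewportPoint(transform.position, Camera.MonoOrStereoscopicEye.Left));
        Debug.Log("Right:   " + cam.WorldToViewportPoint(transform.position, Camera.MonoOrStereoscopicEye.Right));
    }
}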