Using the native OpenVR integration, is there any way to get what the position and rotation would be for each eye?
I’m trying to achieve a portal effect. In 5.3, the SteamVR plugin set the camera’s position before rendering each eye, so I could get a portal effect by matching my portal camera’s projection matrix to the VR eye camera’s and, just before rendering it into a RenderTexture, setting its localPosition and localRotation to the VR eye camera’s. That way the RenderTexture was rendered twice each frame, once per eye.
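To show what I mean, here’s roughly what that looked like in 5.3 (a simplified sketch from memory; the component, camera, and texture names are all my own, not anything from the plugin):

```csharp
using UnityEngine;

// Lived on the SteamVR eye camera, so OnPreRender fired once per eye,
// after the plugin had already moved the camera to that eye's pose.
public class PortalEyeCopy : MonoBehaviour
{
    public Camera vrEyeCamera;         // the camera SteamVR repositions per eye
    public Camera portalCamera;        // my camera on the other side of the portal (kept disabled)
    public RenderTexture portalTexture;

    void OnPreRender()
    {
        // Match the portal camera to the current eye's projection and pose.
        portalCamera.projectionMatrix = vrEyeCamera.projectionMatrix;
        portalCamera.transform.localPosition = vrEyeCamera.transform.localPosition;
        portalCamera.transform.localRotation = vrEyeCamera.transform.localRotation;

        // Render the portal view into the RenderTexture for this eye.
        // Because this runs per eye, the texture gets rendered twice a frame.
        portalCamera.targetTexture = portalTexture;
        portalCamera.Render();
    }
}
```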
With the native OpenVR implementation, I’m not sure how to achieve the same effect. I’m thinking I could set up two portal cameras, one per eye, render each into its own RenderTexture, and then add an extra VR eye camera that renders only the right-eye portal camera’s RenderTexture.
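Something like this is what I have in mind (a rough sketch; every object name here is mine, and I’m only assuming Camera.stereoTargetEye is the right way to limit the extra camera to one eye):

```csharp
using UnityEngine;

// Two portal cameras, one per eye, each rendering into its own texture.
// The commented-out part is exactly what I can't figure out with native
// OpenVR: how to get each eye's pose to drive the portal cameras.
public class PortalRigNativeVR : MonoBehaviour
{
    public Camera leftPortalCamera;
    public Camera rightPortalCamera;
    public RenderTexture leftPortalTexture;
    public RenderTexture rightPortalTexture;

    public Camera rightEyeOnlyCamera;  // extra camera that only shows the right-eye portal texture

    void Start()
    {
        leftPortalCamera.targetTexture = leftPortalTexture;
        rightPortalCamera.targetTexture = rightPortalTexture;

        // Restrict the extra camera to the right eye so each eye ends up
        // seeing its own portal texture on the portal surface (the normal
        // camera would show the left-eye texture).
        rightEyeOnlyCamera.stereoTargetEye = StereoTargetEyeMask.Right;
    }

    void LateUpdate()
    {
        // The missing piece: what do I put here so each portal camera
        // matches the corresponding eye's pose and projection this frame?
        // leftPortalCamera.transform.localPosition  = <left eye position>;
        // leftPortalCamera.transform.localRotation  = <left eye rotation>;
        // rightPortalCamera.transform.localPosition = <right eye position>;
        // rightPortalCamera.transform.localRotation = <right eye rotation>;
    }
}
```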
The trouble is I haven’t been able to work out how to properly get each eye’s position and rotation. At least not in a way that doesn’t make my eyes hurt.
Or maybe there’s a better solution now for creating a portal effect?