Does single-pass stereo rendering work when using camera.Render()?

We have a portal in our game and render images into it using the camera.Render() method.
Right now we use two separate cameras to generate an image for each eye.
Is it possible to use the single-pass stereo technique to save performance on visionOS? If so, how?
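For context, the two-camera approach described above might look something like this minimal sketch. All class, field, and texture names here are illustrative assumptions, not the poster's actual code:

```csharp
using UnityEngine;

// Hypothetical sketch of a two-camera portal: one camera per eye, each
// rendering into its own RenderTexture that the portal material samples.
public class PortalTwoCameraRender : MonoBehaviour
{
    public Camera leftEyeCamera;    // positioned at the left-eye pose
    public Camera rightEyeCamera;   // positioned at the right-eye pose
    public RenderTexture leftEyeTexture;
    public RenderTexture rightEyeTexture;

    void LateUpdate()
    {
        // Two separate render passes: this is the per-eye cost that
        // single-pass stereo rendering is meant to avoid.
        leftEyeCamera.targetTexture = leftEyeTexture;
        leftEyeCamera.Render();

        rightEyeCamera.targetTexture = rightEyeTexture;
        rightEyeCamera.Render();
    }
}
```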


Building for visionOS natively uses single pass, if we can trust what is written in XR Plug-in Management.


Thanks, yeah, it is also mentioned in the documentation:

Leverage Single-Pass Instanced Rendering - Unity’s Single-Pass Instanced Rendering now supports the Metal Graphics API and will be enabled by default. This reduces the overhead of certain parts of the rendering pipeline like culling and shadows, and also helps to reduce CPU overhead when rendering your scenes in stereo.

However, is this limited to rendering on the device itself (i.e., from the player's eye camera)?

I’m struggling to make Unity cameras render in single-pass stereo using camera.Render(). They are set up to render for both eyes, but they don’t: at runtime, camera.stereoEnabled returns false and all XR-related settings are disabled. I ran this code:

Debug.Log("XR is device active: " + UnityEngine.XR.XRSettings.isDeviceActive);
Debug.Log("XR texture desc width: " + UnityEngine.XR.XRSettings.eyeTextureDesc.width);
Debug.Log("XR texture desc height: " + UnityEngine.XR.XRSettings.eyeTextureDesc.height);
Debug.Log("XR texture width: " + UnityEngine.XR.XRSettings.eyeTextureWidth);
Debug.Log("XR texture height: " + UnityEngine.XR.XRSettings.eyeTextureHeight);
Debug.Log("XR texture resolution scale: " + UnityEngine.XR.XRSettings.eyeTextureResolutionScale);

The output confirmed that XR was reported as inactive, which made me question whether this is supported.
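One way to make the portal code robust to this is to check the stereo state before choosing a render path. The sketch below uses real Unity APIs (Camera.stereoTargetEye, Camera.stereoEnabled, XRSettings.isDeviceActive), but the class name and fallback structure are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative guard: request stereo rendering for a manually driven
// camera, and detect at runtime whether single-pass stereo is actually
// available, falling back to two mono renders if it is not.
public class PortalStereoCheck : MonoBehaviour
{
    public Camera portalCamera;

    void Start()
    {
        // Ask Unity to render both eyes; with single-pass stereo active,
        // a single Render() call would then cover both.
        portalCamera.stereoTargetEye = StereoTargetEyeMask.Both;

        if (XRSettings.isDeviceActive && portalCamera.stereoEnabled)
        {
            Debug.Log("Stereo rendering is active for this camera.");
        }
        else
        {
            // This is the situation described above: stereoEnabled is
            // false, so each eye must be rendered in a separate pass.
            Debug.Log("Stereo inactive; falling back to per-eye renders.");
        }
    }
}
```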


I’ve also had problems rendering stereo images. I was trying to create a portal effect with stereo render textures, but I had to abandon the idea and find another way to do it, because I have the impression that PolySpatial isn’t ready for it yet.