I’m trying to add VR support to an existing custom render pipeline using single-pass instanced rendering. Does anyone know of any examples, tutorials, or documentation for this?
This is what I’ve done so far, mostly through guesswork and poking around in the URP code:
- Added the XR Plugin Management package and set it to initialize on startup
- Added the Oculus package and set it to Single Pass Instanced
- Changed my call to SetupCameraProperties() to pass true for the stereoSetup parameter
-
- Set up instancing like this:
  InitCommandBuffer.EnableShaderKeyword("UNITY_STEREO_INSTANCING_ENABLED");
  InitCommandBuffer.EnableShaderKeyword("STEREO_INSTANCING_ON");
  InitCommandBuffer.SetInstanceMultiplier(2);
- Set my test mesh to use a simple shader that has instancing support via the UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO and UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX macros
- Getting the XRDisplaySubsystem and using it to fill in the culling parameters instead of using the camera
- Rendering to the XRRenderPass renderTarget instead of the camera’s render target
- Calling renderContext.StartMultiEye() and StopMultiEye() around draw calls that should be rendered in stereo
- Calling renderContext.StereoEndRender(camera) at the end of rendering, although I’m not sure whether it should go before or after the call to Submit()
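Pulling those steps together, this is roughly the shape of the per-camera loop I’ve ended up with. It’s a sketch from memory rather than my exact code: DrawScene() is a placeholder for the pipeline’s own draw calls, and I may have some of the culling/render-target details slightly off.

```csharp
// Rough sketch of the per-camera loop described above (error handling omitted).
var displays = new List<XRDisplaySubsystem>();
SubsystemManager.GetInstances(displays);
XRDisplaySubsystem display = displays[0];

context.SetupCameraProperties(camera, stereoSetup: true);

for (int passIndex = 0; passIndex < display.GetRenderPassCount(); passIndex++)
{
    display.GetRenderPass(passIndex, out XRDisplaySubsystem.XRRenderPass renderPass);

    // Fill culling parameters from the display subsystem, not the camera.
    display.GetCullingParameters(camera, renderPass.cullingPassIndex,
        out ScriptableCullingParameters cullingParams);
    CullingResults cullResults = context.Cull(ref cullingParams);

    var cmd = new CommandBuffer();
    cmd.EnableShaderKeyword("UNITY_STEREO_INSTANCING_ENABLED");
    cmd.EnableShaderKeyword("STEREO_INSTANCING_ON");
    cmd.SetInstanceMultiplier(2);
    // renderPass.renderTarget should be a Texture2DArray with one slice per eye.
    cmd.SetRenderTarget(renderPass.renderTarget);
    context.ExecuteCommandBuffer(cmd);
    cmd.Release();

    context.StartMultiEye(camera);
    DrawScene(context, cullResults); // placeholder for the actual draw calls
    context.StopMultiEye(camera);
}

context.StereoEndRender(camera); // not sure if this belongs before or after Submit()
context.Submit();
```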
It renders to the headset, but only in the left eye. The “Both Eyes” view in the editor’s Game view shows the same thing: the left eye looks correct, the right eye is black.
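For debugging, I’ve been dumping what the display subsystem reports per pass, to check that it actually hands back two eyes for a single pass. This is a hypothetical helper (the names DumpXRPasses, display, and camera are just my own), so treat it as a sketch:

```csharp
// Hypothetical debug dump: with single-pass instanced I'd expect one render
// pass whose render target has a render parameter (eye) per texture array slice.
void DumpXRPasses(XRDisplaySubsystem display, Camera camera)
{
    for (int i = 0; i < display.GetRenderPassCount(); i++)
    {
        display.GetRenderPass(i, out XRDisplaySubsystem.XRRenderPass renderPass);
        int eyeCount = renderPass.GetRenderParameterCount(camera);
        Debug.Log($"Pass {i}: {eyeCount} render parameter(s)");
        for (int eye = 0; eye < eyeCount; eye++)
        {
            renderPass.GetRenderParameter(camera, eye,
                out XRDisplaySubsystem.XRRenderParameter p);
            Debug.Log($"  eye {eye}: textureArraySlice={p.textureArraySlice}, " +
                      $"viewport={p.viewport}");
        }
    }
}
```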