I want to show different canvases to the left and right eyes (displays) of the Oculus Quest 2. I created two identical canvases called left canvas (which I want shown in the left eye) and right canvas (which I want shown in the right eye); the only difference is that the left one has a text saying it is the left canvas. Then I added an XR Origin and duplicated it so the two origins are in exactly the same place, just like the canvases. I then created two layers, one for the left eye and one for the right eye, and set the layers of the canvases and the culling masks of the cameras accordingly. But I couldn’t find the left and right Target Eye options, and currently I can only see the left canvas on the Oculus Quest 2 in Unity’s play mode. Here are the camera settings when I select the Target Eye option for both the left and right cameras:
This is the left camera:
And this is the right camera:
Here are my left canvas settings:
Here are my right canvas settings:
After setting up the canvases and cameras and testing them in play mode, I wrote some code that calls Debug.Log() for both cameras, and here are the results for both cameras:
Is it impossible to show different things to each eye using OpenXR? Can I use a single camera with Target Eye set to “Both” and render the left and right eyes at the same time into the left and right halves of the camera view?
Unity does not support 1 camera per eye anymore (I think since 2020 or 2021).
In Shader Graph you might be able to use the eye index to make a UI shader: Eye Index Node | Shader Graph | 12.0.0
I just tried to open the project with 2018.1.6f1 and Unity crashed. What about the shader approach? Does a shader affect the entire canvas, or should I apply the shader to each GUI element? (I’m asking since I’m a complete beginner with shaders.)
Thanks. I just checked the Oculus Integration. Is it true that it can work without any issues alongside the Oculus XR Plugin, OpenXR Plugin, XR Interaction Toolkit, and XR Interaction Manager? And that the only thing I need to do is swap the XR Origin for an OVRCameraRig? (My concern was whether I would have to change everything about input when using OVR instead of OpenXR.)
Thanks for the link you sent; I just removed URP from my project and it worked. Now I can select the Target Eye for both of my cameras (even though I stayed on XR and didn’t switch to OVR) and see different canvases without even setting up and assigning layers, in 2021.3.25f1 (LTS). Thanks again for all the advice and suggestions you have offered.
If you would prefer to use URP, this is something I played with a little in a personal project. In my example, I had a custom shader graph that used an input texture where the red channel was the multiplier for the left eye and the green channel was the multiplier for the right eye. I used this as a custom node in the shader graph:
The important bit is that you can do something like set the output color alpha to zero when the eye ID (unity_StereoEyeIndex) doesn’t match the eye you want. I believe 0 is typically the left eye and 1 is typically the right.
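As a rough illustration only (not the exact node from that project), a Custom Function node along these lines could do the per-eye masking. The function and input names here are made up, and it assumes a stereo rendering mode where unity_StereoEyeIndex is defined:

```hlsl
// Hypothetical Custom Function node body for Shader Graph (names are illustrative).
// MaskColor is a sample of the mask texture: R masks the left eye, G masks the right eye.
// BaseColor is whatever the rest of the graph produced for this fragment.
void PerEyeMask_float(float4 BaseColor, float4 MaskColor, out float4 Out)
{
    // unity_StereoEyeIndex is 0 for the left eye and 1 for the right eye.
    float eye = (float)unity_StereoEyeIndex;

    // Pick the red channel in the left eye and the green channel in the right eye.
    float mask = lerp(MaskColor.r, MaskColor.g, eye);

    // Multiplying the alpha by the mask makes the output fully transparent
    // in the eye where the mask is zero.
    Out = float4(BaseColor.rgb, BaseColor.a * mask);
}
```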
Thanks for this. Following another thread, I have a shader meant to render one texture to each eye using the Eye Index node. Here’s a screenshot of how I set it up - I’m new to Shader Graph and would appreciate any feedback.
Do I have it right that this will only work in 2018.1.6f1 and earlier? I’m on 2022.3.2f1, so that’s perhaps my real issue. Thank you for the ideas.
Hey, could you explain this a bit? I’m new to Shader Graph. I was expecting something like an input texture as a variable, then some conditional logic to render part of that image to the left eye and part to the right. In particular, side-by-side stereoscopic images are what I’m trying to make work.
“leftEye” and “rightEye” are input textures. The Eye Index node will return 0 for the left eye and 1 for the right eye. The Lerp could also be replaced with a Branch node; it’s just there to draw the “leftEye” texture to the left eye.
The left eye and the right eye are drawn to different textures, so there is no UV math to stitch the left image to the left half of the texture and the right image to the right half of the texture.
This is for drawing to an object the HMD is seeing, not for recording what the HMD is seeing.
Put this material on a cube and fill the two texture slots with different textures, and you will see the cube textured differently in your left and right eyes.
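In plain HLSL terms, the graph boils down to something like the sketch below. The names are illustrative stand-ins for the “leftEye” and “rightEye” inputs, and it assumes a stereo mode where unity_StereoEyeIndex is available:

```hlsl
// Rough HLSL equivalent of the graph: pick one of two textures per eye.
// _LeftEye / _RightEye stand in for the "leftEye" and "rightEye" inputs above.
Texture2D _LeftEye;
Texture2D _RightEye;
SamplerState sampler_LeftEye;

float4 SamplePerEye(float2 uv)
{
    float4 left  = _LeftEye.Sample(sampler_LeftEye, uv);
    float4 right = _RightEye.Sample(sampler_LeftEye, uv);

    // unity_StereoEyeIndex is 0 for the left eye and 1 for the right eye,
    // so this lerp selects the left texture in the left eye and the right
    // texture in the right eye. A branch would work just as well.
    return lerp(left, right, (float)unity_StereoEyeIndex);
}
```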