I’m currently fiddling around with the Google Cardboard SDK and am trying to find out how to render different images to the left and right eye. A multi-camera setup did not work, as it seems to be unsupported, and judging by the GitHub issues page, the only way at the moment seems to be to use the stereo eye index in some way.
Is it possible to use this index in Shader Graph in any way to control the shader behaviour? Apparently there are keywords like UNITY_VERTEX_OUTPUT_STEREO that relate to the eye index (?) and allow one to use it in a hand-written shader. But how do I access this from inside Shader Graph? Maybe with a custom shader?
EDIT: To my understanding, “unity_StereoEyeIndex” is the index I’m looking for. Is there a way to use this in Shader Graph somehow?
I appreciate any helpful response, thanks in advance!
OK everyone, I found a solution to my issue, and I hope this helps anyone out there who is trying to solve this as well.
I gave it a shot and simply tried returning unity_StereoEyeIndex from a Custom Function node in Shader Graph. And it worked! I found a few snippets which made me think that “unity_StereoEyeIndex” is already used internally by Shader Graph, so I don’t need to use fancy keywords or anything like that at all.
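In case it helps, the entire function body can be this small (a minimal sketch; the function name here is made up, and the file is hooked up via a Custom Function node with Source set to File):
void EyeIndex_float(out float Out)
{
    // 0 while the left eye renders, 1 for the right eye.
    Out = unity_StereoEyeIndex;
}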
Using the index, some extremely simple math (just subtract, or do whatever suits your needs) and setting a Vector1 from code gave me a way to render one object in different colors on the left and right eye (and not only colors, you can customize the object’s whole behaviour with this value) without having to set up multiple cameras.
And here is the code you need to add as a component to the object that you want to look different for each eye (wrapped in a class so it compiles as-is; the class name is arbitrary):
using UnityEngine;

public class EyeSelector : MonoBehaviour // class name is arbitrary
{
    [SerializeField] MeshRenderer meshRenderer;
    [SerializeField] bool right = true;

    void Start()
    {
        // Fall back to the renderer on this object if none was assigned in the Inspector.
        if (meshRenderer == null) meshRenderer = GetComponent<MeshRenderer>();
        // "_rightEye" is the Reference name of the exposed Vector1 property on the graph.
        meshRenderer.material.SetInt("_rightEye", right ? 1 : 0);
    }
}
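To make the “simple math” concrete, here is one way it could look written out as a custom function (just a sketch; in the graph itself it’s the custom function’s output, a Subtract node, and the exposed _rightEye property feeding the RightEye input):
void EyeTint_float(float RightEye, float4 Color, out float4 Out)
{
    // Subtract: 0 when this eye is the configured one, otherwise +/-1.
    float diff = unity_StereoEyeIndex - RightEye;
    // Show the tint only on the matching eye.
    Out = Color * (1.0 - abs(diff));
}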
Hi Desoxi,
I am working on a similar project to display two different images, one per eye. I am new to Unity and cannot work this out from your steps above. Would you mind sharing a simple sample on GitHub or somewhere?
Thanks!
Great work! Does anyone have a screenshot of this set up in Unity 2022 with a URP Shader Graph, please?
The closest I have is this screenshot, but I still see the texture in both eyes in VR.
For anyone else who is stuck: I managed to solve this using a Custom Function node inside the Shader Graph to render the stereo video textures.
Here is the StereoEyes.cginc code I used for the custom function
// Selects between the Left and Right inputs depending on which eye is
// currently being rendered: unity_StereoEyeIndex is 0 for the left eye
// and 1 for the right. The WorldCameraPosition, WorldCameraRight and
// Force inputs are unused here; they are only kept so the node's ports
// stay the same.
void StereoEyes_half(float3 WorldCameraPosition, float3 WorldCameraRight, float Force, float4 Left, float4 Right, out float4 Out)
{
    Out = (unity_StereoEyeIndex == 1) ? Right : Left;
}

// Shader Graph calls the variant matching the node's precision, so the
// same body exists with both the _half and _float suffix.
void StereoEyes_float(float3 WorldCameraPosition, float3 WorldCameraRight, float Force, float4 Left, float4 Right, out float4 Out)
{
    Out = (unity_StereoEyeIndex == 1) ? Right : Left;
}
I can’t tell you for sure, because some performance issues are still unsolved in Unity 2021 and 2022, so I don’t use versions other than 2020 when targeting the Oculus Quest. I believe that from 2021 on, Shader Graph includes a dedicated node for this, called the Eye Index Node.
I am using Unity 2020.3, but I am not getting the right eye without enabling Multiview rendering.
Can you suggest what could be wrong? I am using the same graph you shared above. I have a Quest 2, on which I only see one eye.
Thanks in advance.
RYG81
Is it possible for this whole thing to be exported so that I can use it? This is exactly what I’m looking for. I’m struggling to reconstruct it myself because I’m new to Shader Graph, but if I had a working copy I could inspect it and learn from it.
For everybody who joined this thread late: there is no need to write your own code for this. Shader Graph’s Eye Index Node works just fine in all modern Unity versions. BUT it apparently does not support Multi Pass.
So if you’re working with a Quest 3 headset, for example, you need to go into Project Settings > XR Plug-in Management > Oculus and set the Stereo Rendering Mode to Single Pass Instanced for the Desktop platform and to Multiview for Android. After changing this, it worked just fine for me.
Hi, this is my first time using Shader Graph in Unity. How do you get your image into each Sample Texture node? I can’t seem to be able to use mine. I’m trying to have a different video (MP4 file) projected into each eye.
Note: it is better to fuse the two videos into one to be sure they stay in sync when played; it is hard to keep them at the exact same timing. (In my example it was just two photos.)
If you want to use a video, you can use “Out = unity_StereoEyeIndex;” to shift the video by 0.5 to the right on the right eye (if your video is in “Left Right” format).
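To spell that out, a sketch of the UV math (the function name is made up; plug the output into the UV input of your Sample Texture 2D node):
void SideBySideUV_float(float2 UV, out float2 Out)
{
    // Side-by-side (left|right) video: squeeze U into the left half,
    // then shift it by 0.5 on the right eye so each eye samples its own half.
    Out = float2(UV.x * 0.5 + unity_StereoEyeIndex * 0.5, UV.y);
}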
If you need to keep the two videos separate, you can copy what I did but use a RenderTexture in the shader’s material, then play each video into its RenderTexture.
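Something along these lines on the C# side (just a sketch; the field names are mine, and the two RenderTextures are the ones the graph’s texture properties point at):
using UnityEngine;
using UnityEngine.Video;

public class StereoVideoFeeder : MonoBehaviour
{
    [SerializeField] VideoPlayer leftPlayer;
    [SerializeField] VideoPlayer rightPlayer;
    [SerializeField] RenderTexture leftTexture;
    [SerializeField] RenderTexture rightTexture;

    void Start()
    {
        // Route each VideoPlayer into its own RenderTexture.
        leftPlayer.renderMode = VideoRenderMode.RenderTexture;
        rightPlayer.renderMode = VideoRenderMode.RenderTexture;
        leftPlayer.targetTexture = leftTexture;
        rightPlayer.targetTexture = rightTexture;
        leftPlayer.Play();
        rightPlayer.Play();
    }
}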
This worked in my case, thanks; on Unity 6000.0.34f1, unity_StereoEyeIndex was always 0 in my shader files, but making a Shader Graph with the Eye Index node instead, with Single Pass Instanced, worked.