Getting Screen Coordinates for Single Pass Stereo VR?

I’m trying to make a basic depth shader for VR using LWRP, which means I’m required to use single pass stereo. I’m aware that this means my depth texture is twice the width of each eye.

I can’t seem to find a way to correctly sample this depth texture so that it appears the same in stereo as it does in mono. Each eye gets the same screen coordinates, so the double-wide depth texture looks like this.

I would like to ultimately have the left eye go from 0 to 0.5 and the right to go from 0.5 to 1, which would accurately sample the depth texture. But all of the screen position nodes seem to give the same coordinates for both eyes.

Is this currently possible in shader graph? Or should I find a workaround?

Thanks!

Follow-up: I’ve upgraded from 5.7 to 5.13 to take advantage of the Custom Node.

I was able to manually adjust the UV coordinate by checking whether unity_StereoEyeIndex is greater than 0, which is true for the right eye. This was sufficient to properly sample the depth texture, and it will probably be useful for refraction shaders as well.
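
A minimal sketch of that adjustment in HLSL, assuming a double-wide single pass stereo target (the function name and signature are illustrative, not from the original post):

```hlsl
// Remap a mono screen UV into the current eye's half of a
// double-wide single pass stereo texture. Assumes uv.x is in [0, 1].
float2 RemapStereoUV(float2 uv)
{
    // unity_StereoEyeIndex is 0 for the left eye and 1 for the right,
    // so the left eye lands in [0, 0.5] and the right in [0.5, 1]
    uv.x = uv.x * 0.5 + 0.5 * unity_StereoEyeIndex;
    return uv;
}
```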

It would be convenient if this were integrated into the Screen Position node somehow.

Hi, can you share the shader graph or an image of it? I’m facing a similar issue.

Sure!

OUT = IN;
float4 scaleOffset = unity_StereoScaleOffset[unity_StereoEyeIndex];

// Check that single pass (double-wide) stereo is enabled
if (scaleOffset.x > 0) {

    // Squeeze X into this eye's half of the double-wide texture...
    OUT.x /= 2;
    // ...then shift it: scaleOffset.z is 0 for the left eye, 0.5 for the right
    OUT.x += scaleOffset.z;

}

I’m using the 2019.1 custom shader node instead of the CodeFunctionBlocks; OUT and IN are both Vector4s. Use the “default” Screen Position as input, and the output is good to plug into the Scene Color or Scene Depth nodes. This also won’t mess up the appearance in the Scene view.

Sorry, but I’m not very good with coding. Can you explain where to use this code? In a Shader Graph custom function node?

Yup! Make a custom node. Give it a Vector4 input named IN and a Vector4 output named OUT. Name the function whatever you want, and paste this code into the text box.

Side note: I recommend you put the custom node in a sub graph. Otherwise, if you want to use this node more than once, you’ll have to change the function’s name each time, and if you ever edit the code, you’ll have to remember to make the same edits in the other copies.
But if you use a sub graph, you can reuse it as much as you want without worry.

@dansitnick
How do you get unity_StereoEyeIndex?
Actually, in my case I have a video in side-by-side format that I want to project onto a plane. I want to crop it in half and, depending on unity_StereoEyeIndex, put the left half of the video on the plane for the left eye and the right half for the right eye.

Set up the custom node the same way, but input the plane’s UVs instead of the Screen Position. Output that into the UV input of a Sample Texture 2D node, which samples a Render Texture. Make sure your video player is outputting to that Render Texture as well.
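
As a sketch, the custom node body for that case could look like this, assuming the video is packed left|right and IN is the plane’s UV (names follow the earlier snippet, not the attached graphs):

```hlsl
// Sample the correct half of a side-by-side (left|right) video.
// IN is the plane's UV; OUT feeds the UV input of Sample Texture 2D.
OUT = IN;
// Left eye (index 0) reads x in [0, 0.5]; right eye (index 1) reads [0.5, 1]
OUT.x = IN.x * 0.5 + 0.5 * unity_StereoEyeIndex;
```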

Here are my two tries; unfortunately, neither is working for me.
They’re attached in the zip file.
I have been trying for many days with no luck. Please help with this if you can.

4551166–422443–two shaders.zip (4.06 KB)

Got it working. Thanks.

Do you know how to make a custom node for this in Unity 2018.3, where there is no custom node inside Shader Graph?

Would it be possible to see a screenshot of the graph using this node? Figuring out where exactly to plug it in is proving to be more of a task than I assumed it would be.

You need to connect the custom node’s output to the UV input of your Sample Texture 2D node.

What did you do to fix it? Could you please share the shader graph?

This one is very useful and easy: