Hello,
I am trying to build a full-screen effect using the Fullscreen material target in Shader Graph, driven by a Full Screen Pass Renderer Feature in URP and rendered with single-pass instanced stereo on the headset. The device under test is a Meta Quest 3, connected to the editor over a wired Meta Quest Link.
The rendering workflow itself works and the shader output shows up in both eyes, but any shader work that involves the UV or Screen Position input nodes (or manipulations of their data) produces output on the device that never comes into focus. The two eyes render the data as if they were two independent screens rather than a stereo pair with parallax.
This image is the Game View with both eyes shown. As you can see, the cube and the ground grid have proper parallax, but the post effect (shown as the rainbow box) sits dead center in both eyes, which is what produces the unfocused result when viewed in stereo in the headset.
That makes sense: since the effect lands at identical coordinates in each eye in the Game View, there is zero disparity between the eyes, so it is bound to look wrong on the device when viewed in stereo.
Just to give you an idea of what the problem looks like on the device: in the headset, only the inner third or so of the two boxes overlaps.
Other effects that use UV or Screen Position data as input also suffer from the same issue.
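To make it concrete, the kind of screen-space math involved boils down to roughly this (written as Custom Function node HLSL; the function name and box size are placeholders, not my actual graph):

```hlsl
// Rough HLSL equivalent of the graph logic: a mask that is 1 inside a box
// centred on the Screen Position node's (0.5, 0.5) and 0 outside it.
// Because Screen Position is normalised per eye, (0.5, 0.5) is the centre
// of *each* eye, so the box ends up with zero disparity between the eyes.
void CenterBoxMask_float(float2 ScreenUV, float2 HalfSize, out float Mask)
{
    float2 d = abs(ScreenUV - 0.5);
    Mask = (d.x < HalfSize.x && d.y < HalfSize.y) ? 1.0 : 0.0;
}
```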
My Questions:
Is the Fullscreen target just not compatible with stereo rendering?
Can I adjust the UV or Screen Position input data per eye to compensate? (See the sketch after these questions for the sort of adjustment I mean.)
Can I apply the material some other way so that the output gets proper stereo rendering?
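For the second question, this is the sort of per-eye adjustment I have in mind, written as a Custom Function node body. The preprocessor guard is there so the node still compiles outside XR (unity_StereoEyeIndex only exists when stereo instancing or multiview is active); the EyeShift amount and the shift direction are things I would have to expose and tune myself, not something Unity provides:

```hlsl
// Sketch of a per-eye UV compensation for a fullscreen pass.
void PerEyeOffsetUV_float(float2 UV, float EyeShift, out float2 Out)
{
#if defined(UNITY_STEREO_INSTANCING_ENABLED) || defined(UNITY_STEREO_MULTIVIEW_ENABLED)
    // Push each eye's screen UV in opposite directions to reintroduce
    // disparity. Sign and amount would need tuning per effect.
    float dir = (unity_StereoEyeIndex == 0) ? -1.0 : 1.0;
    Out = UV + float2(dir * EyeShift, 0.0);
#else
    Out = UV;
#endif
}
```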
Thanks in advance for any wisdom!