I wrote an edge detection fullscreen shader graph, and the only remaining issue is that it looks slightly different between DX11 and OpenGL.
I am reading from the normal and depth buffers in a custom function node, so I assume this is where the behavior diverges between the two graphics APIs. Are there Unity shader calls that normalize these differences, or are there known per-platform differences that can be handled with shader defines?
(URP 2022 LTS)
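
For context, the depth/normal sampling in my custom function is roughly this (a simplified sketch, not my exact code; the function name and graph wiring are placeholders):

```hlsl
// Simplified custom function body (URP 14 / 2022 LTS).
// SampleSceneDepth / SampleSceneNormals come from the URP shader library.
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareNormalsTexture.hlsl"

void EdgeSample_float(float2 uv, out float depth01, out float3 normalWS)
{
    // Raw depth is platform-dependent: reversed-Z (1..0) on D3D11,
    // conventional 0..1 on OpenGL.
    float rawDepth = SampleSceneDepth(uv);
    // Linear01Depth uses _ZBufferParams, which Unity sets per platform,
    // so I'd expect this value to already be API-independent.
    depth01 = Linear01Depth(rawDepth, _ZBufferParams);
    normalWS = SampleSceneNormals(uv);
}
```

I've seen the `UNITY_REVERSED_Z` and `UNITY_UV_STARTS_AT_TOP` defines mentioned in the platform-differences docs, but I'm not sure whether the fullscreen pass already accounts for them or whether I need to branch on them myself.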