Fullscreen effect differences between DX11 and OpenGL

I wrote an edge-detection fullscreen shader graph, and the only issue is that it looks slightly different between DX11 and OpenGL.
I am reading from the normal and depth buffers in a Custom Function node, so I assume this is where the two APIs diverge. Are there Unity shader calls that negate such differences, or are there known differences that can be handled with shader defines?

(URP 2022 LTS)
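
The most likely culprit is the depth-buffer convention: on D3D-style platforms Unity uses a reversed depth buffer (1.0 at the near plane, 0.0 at the far plane), while OpenGL stores 0.0 at the near plane. Unity exposes this via the `UNITY_REVERSED_Z` define. A minimal sketch of compensating for it by hand (`NormalizedDeviceDepth` is a hypothetical helper name, not a Unity API):

```hlsl
// Hypothetical helper: normalize a raw depth sample so that
// 0 = near and 1 = far on every graphics API.
float NormalizedDeviceDepth(float rawDepth)
{
#if UNITY_REVERSED_Z
    // D3D11/D3D12/Metal/Vulkan: depth buffer is reversed (1 = near).
    return 1.0 - rawDepth;
#else
    // OpenGL/GLES: depth already runs 0 = near to 1 = far.
    return rawDepth;
#endif
}
```

In practice you rarely need this by hand, because Unity's depth-linearization helpers already bake the platform convention into `_ZBufferParams`.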

Well, it appears I should have read Unity - Manual: Writing shaders for different graphics APIs (unity3d.com)
and used:

  • Linear01Depth(float z)
  • LinearEyeDepth(float z)

However, these are not defined within Shader Graph land, so…
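
That said, the URP core shader library provides two-argument equivalents (`Linear01Depth(depth, zBufferParam)` and `LinearEyeDepth(depth, zBufferParam)`) that do work inside a Custom Function node once you pull in the URP includes. A sketch of a file-mode custom function body, assuming the standard URP includes shown (`EdgeDepth_float` is a hypothetical function name matching Shader Graph's float-precision naming convention):

```hlsl
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"

void EdgeDepth_float(float2 uv, out float linearDepth)
{
    // Raw, non-linear depth from the camera depth texture.
    float rawDepth = SampleSceneDepth(uv);

    // The core-RP versions of the manual's helpers take _ZBufferParams
    // explicitly and handle UNITY_REVERSED_Z per platform, so the result
    // should match across DX11 and OpenGL.
    linearDepth = Linear01Depth(rawDepth, _ZBufferParams);

    // Alternatively, for view-space distance from the camera:
    // linearDepth = LinearEyeDepth(rawDepth, _ZBufferParams);
}
```

Wiring this up as a Custom Function node (File mode) and comparing edges against the linearized depth should give consistent results between the two APIs.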