Depth texture normals distorted

Hi all,

I’ve been writing some image effects that use the camera’s generated depth and normals textures.
One thing I’ve noticed is that the camera’s field of view seems to distort the depth values at the sides of the screen…
If anyone wants to see what I mean, load up Unity’s built-in global fog shader with some random geometry in the scene and watch the left and right sides of the screen as you turn the camera left and right…
It’s a bit frustrating, as my world-space positions, which are derived from the depth texture, change with the camera, and that’s throwing off my lighting calculations…
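
For reference, the reconstruction I’m talking about is the usual frustum-ray approach (the same idea GlobalFog uses). This is only a simplified sketch rather than my exact code, and it assumes the ray is the interpolated camera-to-frustum-corner vector set up in the vertex shader:

    // Rough sketch of the frustum-ray reconstruction (not my exact code).
    // Needs #include "UnityCG.cginc" for SAMPLE_DEPTH_TEXTURE / Linear01Depth.
    sampler2D _CameraDepthTexture;

    struct v2f
    {
        float4 pos : SV_POSITION;
        float2 uv  : TEXCOORD0;
        float3 ray : TEXCOORD1; // camera-to-far-plane vector for this pixel,
                                // interpolated from the four frustum corners (world space)
    };

    fixed4 frag (v2f i) : SV_Target
    {
        float rawDepth  = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv);
        float depth01   = Linear01Depth(rawDepth);   // 0 at the camera, 1 at the far plane
        float3 worldPos = _WorldSpaceCameraPos + i.ray * depth01;
        // ...lighting calculations based on worldPos...
        return fixed4(worldPos, 1);
    }

If the corner rays are set up correctly, worldPos should stay put as the camera rotates, which is exactly what I’m not seeing at the edges of the screen.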

I’ve included a little demo scene with a plane to highlight the problem. It’s particularly noticeable at the right corner… as you rotate the camera around slowly, you’ll see the corner become more ‘red’ instead of staying the same color.
And like I mentioned, since my world-space positions are based off this… it screws things up!
I’m inclined to think it’s a problem on my end, but Unity’s built-in global fog shader has the same problem, so I don’t know…
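
For anyone reading along without grabbing the package: the depth/normals sampling I’m describing is just the standard UnityCG decode, something along these lines (again a simplified sketch, not the exact demo code):

    // Rough sketch of reading the camera's depth+normals texture (same v2f as above).
    sampler2D _CameraDepthNormalsTexture;

    fixed4 frag (v2f i) : SV_Target
    {
        float  depth;       // linear 0..1 depth
        float3 viewNormal;  // decoded normal (this comes back in view space)
        DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, i.uv), depth, viewNormal);

        // Debug view: show the decoded normal as a color.
        return fixed4(viewNormal * 0.5 + 0.5, 1);
    }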

Attachment: Depth Problem.unitypackage (12.2 KB)

Any ideas, guys? I would very much like to know whether this is the ‘correct’ behavior.
Is there a more consistent method to obtain world positions and normals for use in image effects?