_CameraDepthTexture broken in 5.3 **SOLVED**

Our game depends on _CameraDepthTexture in a shader being the depth of the last rendered camera, but in 5.3 the data I get when I sample it makes no sense to me. This has completely broken our post-process pipeline: we save off the depth textures of several cameras via Blits during the OnPreRender() phase, which worked fine until this release.
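
For reference, here is a minimal sketch of the kind of setup I mean (component and field names here are illustrative, not our actual code): a per-camera component that blits through a material whose shader reads _CameraDepthTexture, stashing the previous camera's depth before this one renders.

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class SaveCameraDepth : MonoBehaviour
{
    public Material copyDepthMaterial;   // shader that samples _CameraDepthTexture
    public RenderTexture savedDepth;     // where the previous camera's depth is stashed

    void OnEnable()
    {
        // Make sure this camera generates a depth texture at all.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
    }

    void OnPreRender()
    {
        // Blit through a shader that reads _CameraDepthTexture, saving
        // whatever depth is currently bound before this camera renders.
        Graphics.Blit(null, savedDepth, copyDepthMaterial);
    }
}
```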

I would really like to know if anyone else has seen this, and if they know of any workarounds.

I’d suggest filing a bug report. Sounds like a bug to me.

Bug submitted: https://fogbugz.unity3d.com/default.asp?751897_u82es88h7a8ld79g

You can see it there if anyone is interested.

Does it have anything to do with this update?

  • Shaders: _CameraDepthTexture is now preserved across calls to RenderWithShader()

It does! Unity got back to me. If anyone else runs into this, the fix is that you now use _LastCameraDepthTexture if you want the depth texture of the most recently rendered camera. They say they will be updating the upgrade guide and docs.
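
In practice the change is just the sampler name on the shader side. Here is a minimal image-effect sketch showing the renamed texture; the vert/frag boilerplate is the stock Unity pattern from UnityCG.cginc, not our actual post-process shader.

```csharp
// Shader "Hidden/ShowLastCameraDepth" -- minimal sketch, ShaderLab/Cg
Shader "Hidden/ShowLastCameraDepth"
{
    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            // Renamed in 5.3: use _LastCameraDepthTexture when you want the
            // depth of the most recently rendered camera.
            sampler2D _LastCameraDepthTexture;

            fixed4 frag (v2f_img i) : SV_Target
            {
                float rawDepth = SAMPLE_DEPTH_TEXTURE(_LastCameraDepthTexture, i.uv);
                float d = Linear01Depth(rawDepth);
                return fixed4(d, d, d, 1); // visualize linear 0..1 depth
            }
            ENDCG
        }
    }
}
```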

Thanks to everyone who responded.