Our game has been relying on _CameraDepthTexture in a shader holding the depth of the last rendered camera, but in 5.3 the data I get when I sample it makes no sense to me. This has completely broken our post-process pipeline: we save off the depth textures of several cameras via Blits during OnPreRender(), and that worked fine until this release.
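For reference, here is a minimal sketch of the kind of shader usage I mean. The shader name and the depth visualisation are just illustrative; the part we depend on is the _CameraDepthTexture sampling:

```
// Illustrative only: a trivial post-process shader that samples the built-in
// _CameraDepthTexture global, assuming it holds the depth of the last
// rendered camera (the pre-5.3 behaviour described above).
Shader "Hidden/DepthSampleSketch"
{
    SubShader
    {
        Pass
        {
            ZTest Always Cull Off ZWrite Off

            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            // Global depth texture that Unity binds when a camera renders depth.
            sampler2D _CameraDepthTexture;

            fixed4 frag (v2f_img i) : SV_Target
            {
                // Raw hardware depth, converted to linear 0..1 depth for display.
                float raw = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv);
                float d = Linear01Depth(raw);
                return fixed4(d, d, d, 1);
            }
            ENDCG
        }
    }
}
```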
I would really like to know if anyone else has seen this, and if they know of any workarounds.
It does! Unity got back to me. If anyone else runs into this, the fix is to use _LastCameraDepthTexture when you want the depth texture of the most recently rendered camera; apparently _CameraDepthTexture now refers only to the depth texture of the camera that is currently rendering. They say they will be updating the upgrade guide and docs.
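In case it helps to see it concretely, in the sketch from my first post the fix amounts to renaming the sampler; nothing else about the sampling changes (this goes inside the same CGPROGRAM block as before):

```
// 5.3 behaviour: sample _LastCameraDepthTexture to get the depth texture of
// the most recently rendered camera (was _CameraDepthTexture in our code).
sampler2D _LastCameraDepthTexture;

fixed4 frag (v2f_img i) : SV_Target
{
    float raw = SAMPLE_DEPTH_TEXTURE(_LastCameraDepthTexture, i.uv);
    float d = Linear01Depth(raw);
    return fixed4(d, d, d, 1);
}
```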