I am stuck trying to get the depth information of a frame and really need some help. I have searched similar problems and tried to understand the concept, but I lost my way, so I will go step by step to make my confusion clearer. Thanks in advance.
First, what I am trying to do is access the depth information of any frame via the depth buffer.
A render texture is needed to use the depth buffer (referring to this answer).
I followed RenderDepth.js as an example script, which can be found on the ShaderReplacement page; the following lines of code are also taken from that script.
- I allocate a render texture in depth format with
renderTexture = RenderTexture.GetTemporary(camera.pixelWidth, camera.pixelHeight, 24, RenderTextureFormat.Depth);
- I create a second camera and set the allocated render texture as the camera's target texture with
cam.targetTexture = renderTexture;
- I render my camera with RenderDepth.shader (the shader is assigned to the script, by the way) with
cam.RenderWithShader(depthShader, "RenderType");
RenderDepth.shader can be found here.
What I am expecting is that, after rendering the camera with the shader, the camera's target texture will hold the depth buffer of the visible scene. Then I can save this target texture into a Texture2D and see a standard depth buffer (white-grey-black colors for far and close objects).
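For completeness, this is roughly how I try to copy the target texture into a Texture2D after rendering (the method name and the TextureFormat choice are my own guesses; reading from a Depth-format render texture this way may be exactly where I go wrong):

```csharp
using UnityEngine;

public class DepthReader : MonoBehaviour
{
    // My attempt: copy the camera's target RenderTexture into a readable Texture2D.
    // I call this right after cam.RenderWithShader(depthShader, "RenderType").
    Texture2D ReadTargetTexture(RenderTexture rt)
    {
        // Remember and override the active render texture so ReadPixels reads from rt.
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = rt;

        // I am not sure RGBA32 is the right format to receive depth values --
        // this is part of my confusion about where the depth buffer actually lives.
        Texture2D tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
        tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        tex.Apply();

        RenderTexture.active = previous;
        return tex;
    }
}
```

With this I do get a Texture2D back, but I cannot tell whether its pixels contain the depth values I am after.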
However, I could not figure out which part holds the depth buffer or how I can access the information in that buffer. What I am trying to get are per-pixel values ranging from 0 to 1 with a nonlinear distribution.
Any help would be greatly appreciated, thanks!