I don’t know if it is a bug, but if I want a RenderTexture with a transparent background, it seems that Unity uses the depth buffer as the alpha channel instead of the frame buffer’s alpha.
So for example, if I set up the camera with the usual overlay setup (clear flags set to Depth Only), I get a different result in the RenderTexture than what I see from the camera. This is because a shader that writes to the frame buffer but not to the depth buffer doesn’t show up: its depth is zero, so the alpha in the texture is zero.
So right now I have some alpha-blended trees that don’t appear in the texture. Is there any way to solve this?
Is it a bug or a known limitation?
This has to do with the alpha being applied twice: once when rendering to the RenderTexture, and again when rendering the RenderTexture as an overlay. It has nothing to do with Unity accidentally using the depth buffer as alpha.
The easiest solution is to assume premultiplied alpha when rendering the RenderTexture as the overlay, so blend with:
Blend One OneMinusSrcAlpha
No change is required to the rendering into the RenderTexture, and no separate blend states are required for the alpha channel.
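For reference, here is a minimal sketch of what the overlay side could look like, assuming the RenderTexture is assigned to _MainTex on whatever quad or blit draws the overlay. The shader name and setup are just my own placeholders, not something specific to your project; vert_img and v2f_img are the stock fullscreen helpers from UnityCG.cginc.

Shader "Custom/RenderTextureOverlay"
{
    Properties
    {
        _MainTex ("RenderTexture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue" = "Overlay" "RenderType" = "Transparent" }
        ZWrite Off
        ZTest Always
        // Treat the RenderTexture as premultiplied: its colour was already
        // multiplied by alpha when it was rendered, so don't multiply again.
        Blend One OneMinusSrcAlpha

        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            fixed4 frag (v2f_img i) : SV_Target
            {
                // Output the texture as-is; the blend state above handles the compositing.
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}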
You will need keepalpha and ColorMask RGBA though, otherwise the pass just gets replaced with ColorMask RGB and the alpha is completely wiped out.
The separate alpha blend is optional.
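For the objects that get rendered into the RenderTexture (the alpha-blended trees), a rough hand-written sketch with an explicit ColorMask RGBA and the optional separate alpha blend could look like this. The shader and property names are placeholders I picked for the example, not anything from the original project.

Shader "Custom/AlphaBlendedIntoRenderTexture"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
        ZWrite Off
        // Make sure alpha actually gets written into the RenderTexture.
        ColorMask RGBA
        // Colour blend, then the optional separate alpha blend after the comma,
        // so the texture's alpha accumulates coverage instead of being faded.
        Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}

If you are using a surface shader instead, the equivalent is the keepalpha directive on the #pragma surface line, which is what stops Unity from swapping in ColorMask RGB as mentioned above.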