I’ve been learning some basic post-processing through YouTube and just came across something very strange. I set up a camera that outputs to a RenderTexture, then attached a script to the camera that does a simple Blit in OnRenderImage using a Shader I made. I also created a 3D plane with a MeshRenderer and put the RenderTexture on its material so I could see what the effect was doing.

It works fine as long as the unlit ShaderGraph’s output is set to Opaque, but as soon as I change it (and the RenderTexture’s material) to support transparency (Unlit/Transparent), the resulting RenderTexture comes out completely transparent no matter what I do. The color channels are set just fine, but the alpha is being forced to transparent somehow, even when I make my shader output a single color and explicitly set the alpha to 1. However, when I remove the RenderTexture as the camera’s output, everything appears just fine on the screen.
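In case it helps, the wiring is roughly this (a minimal sketch of what I described above; the class and field names are just illustrative, not my actual project code):

using UnityEngine;

// Rough sketch of the setup: the camera renders into a RenderTexture,
// and a plane's material displays that texture so I can inspect the result.
public class RenderTextureSetup : MonoBehaviour
{
    public Camera captureCamera;        // the camera that also has the OnRenderImage script
    public RenderTexture outputTexture; // the RenderTexture assigned as the camera's output
    public Renderer previewPlane;       // the 3D plane whose material shows the RenderTexture

    void Start()
    {
        // Send the camera's output into the RenderTexture instead of the screen.
        captureCamera.targetTexture = outputTexture;

        // Have the plane's (Unlit/Transparent) material sample the same RenderTexture.
        previewPlane.material.mainTexture = outputTexture;
    }
}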
The Blit is nothing special, just the following:
void OnRenderImage(RenderTexture source, RenderTexture destination)
{
    // Grab a temporary RenderTexture matching the source so the effect can render into it.
    RenderTexture renderTexture = RenderTexture.GetTemporary(source.width, source.height, 0, source.format);

    // Run the post-effect shader (pass 0), then copy the result to the destination.
    Graphics.Blit(source, renderTexture, postEffectsMaterial, 0);
    Graphics.Blit(renderTexture, destination);

    RenderTexture.ReleaseTemporary(renderTexture);
}
Any ideas? This is something so simple, I feel like I’m missing something obvious.
Help Obi Wan Kenobi… you’re my only hope!