I’m getting a Render Texture from a second camera and using it in a shader to find some alpha values in the current camera view. In OnRenderImage, I convert the RenderTexture to a simple Texture2D and set the material’s texture to this newly created Texture2D. This works perfectly and I get the effect I want.
However, it's really expensive to convert a RenderTexture to a Texture2D every single frame, so I would like to just set the material's texture to the RenderTexture itself. This, however, does not give the effect I want, and I have no idea why.
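For reference, the direct version I mean is roughly this (a minimal sketch; the field names paintRenderTexture and mat are placeholders for the second camera's target RenderTexture and the material whose shader samples _RenderPaintTexture):

using UnityEngine;

public class DirectBindingSketch : MonoBehaviour
{
    // Placeholder references: the RenderTexture the second camera renders into,
    // and the material whose shader reads _RenderPaintTexture.
    public RenderTexture paintRenderTexture;
    public Material mat;

    void Start()
    {
        // Bind the RenderTexture directly instead of copying it into a Texture2D
        // every frame. This is the version that does not give the expected result.
        mat.SetTexture("_RenderPaintTexture", paintRenderTexture);
    }
}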
What is so different between the RenderTexture's pixels and the Texture2D's pixels? Is there a better way to convert the RenderTexture into a Texture2D?
EDIT: Here’s how I’m converting it now:
void OnRenderImage(RenderTexture source, RenderTexture dest)
{
    if (mat.HasProperty("_RenderPaintTexture"))
    {
        // Read the source RenderTexture back from the GPU into a new Texture2D.
        RenderTexture.active = source;
        Texture2D texture = new Texture2D(source.width, source.height, TextureFormat.ARGB32, false);
        texture.ReadPixels(new Rect(0, 0, source.width, source.height), 0, 0);
        texture.Apply();

        mat.SetTexture("_RenderPaintTexture", texture);
    }
}