Hello,
I’m using the function from the Unity Manual (Unity - Scripting API: Camera.Render) to take a screenshot of a portion of the screen rendered by a second camera. The camera renders into a Render Texture.
The function always returns a plain white texture, but only in the WebGL build; in the Editor it works as expected.
The WebGL template already sets config['webglContextAttributes'] = {"preserveDrawingBuffer": true};
public class Example : MonoBehaviour
{
    // Take a "screenshot" of a camera's Render Texture.
    Texture2D RTImage(Camera camera)
    {
        // The Render Texture in RenderTexture.active is the one
        // that will be read by ReadPixels.
        var currentRT = RenderTexture.active;
        RenderTexture.active = camera.targetTexture;

        // Render the camera's view.
        camera.Render();

        // Make a new texture and read the active Render Texture into it.
        Texture2D image = new Texture2D(camera.targetTexture.width, camera.targetTexture.height);
        image.ReadPixels(new Rect(0, 0, camera.targetTexture.width, camera.targetTexture.height), 0, 0);
        image.Apply();

        // Restore the previously active Render Texture.
        RenderTexture.active = currentRT;
        return image;
    }
}
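For what it’s worth, I was also considering a variant that passes an explicit texture format to the Texture2D constructor, in case the default format isn’t readable by ReadPixels on WebGL. The explicit TextureFormat.RGBA32 and the disabled mip chain below are my own guesses, not something the Manual specifies:

    // Variant with an explicit texture format (RGBA32 and mipChain = false
    // are assumptions on my part, not from the Manual sample).
    Texture2D RTImageExplicitFormat(Camera camera)
    {
        var currentRT = RenderTexture.active;
        RenderTexture.active = camera.targetTexture;
        camera.Render();

        // Explicit RGBA32 with no mip chain, since ReadPixels needs a
        // CPU-readable format it can write into on the target platform.
        Texture2D image = new Texture2D(camera.targetTexture.width,
                                        camera.targetTexture.height,
                                        TextureFormat.RGBA32, false);
        image.ReadPixels(new Rect(0, 0, camera.targetTexture.width, camera.targetTexture.height), 0, 0);
        image.Apply();

        RenderTexture.active = currentRT;
        return image;
    }

I haven’t been able to tell whether that should behave any differently in a WebGL build.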
What could the problem be?
Thanks