I would like to understand how exactly ScreenCapture.CaptureScreenshotIntoRenderTexture works under the hood, because it seems to be tied entirely to the main display. It works great when your Unity application only has one display, but when you are working with multiple displays, the behavior is different. In the Editor, it merges the information from multiple Game views that target different displays. And once you build a standalone Windows application, the behavior, as far as I have seen, is that no matter the size you set for the render texture, the capture is always clamped to the size of the main display.
So, in other words: if you have a Full HD display as your main display and you connect a 4K display as a secondary display, the code below takes the screenshot of the secondary display but clamps it to its upper-left corner. You get only 1/4 of the screenshot; the rest is a black image. I assume this is because a Full HD frame covers 1/4 of the area of a 4K frame. If I set both the main display and the secondary display to 4K, the screenshot of the secondary display is taken correctly.
This raises several questions. First, why is this taking a screenshot of the secondary display at all? (Even though this is what I need, I never specified which display should be used.) Can you please suggest a solution or workaround? I need to take a screenshot of a given display and obtain a render texture from it, without adding a new camera with a render texture as its target. That approach is of no use to me, because display 2 sometimes renders the output of two cameras (one per eye), so capturing only one camera would lose the 3D side-by-side feature of my app. Please see the following example (it is a copy-paste from the Unity documentation; I only added the display resolution being read on demand by display index):
using UnityEngine;
using System.Collections;
using UnityEngine.Rendering;

public class ScreenCaptureIntoRenderTexture : MonoBehaviour
{
    private RenderTexture renderTexture;

    IEnumerator Start()
    {
        // Capture must happen after the frame has finished rendering.
        yield return new WaitForEndOfFrame();
        Vector2 secondDisplayScreenDimensions = GetSizeOfDisplay(1);
        renderTexture = new RenderTexture(
            (int)secondDisplayScreenDimensions.x,
            (int)secondDisplayScreenDimensions.y,
            0);
        ScreenCapture.CaptureScreenshotIntoRenderTexture(renderTexture);
        AsyncGPUReadback.Request(renderTexture, 0, TextureFormat.RGBA32, ReadbackCompleted);
    }

    void ReadbackCompleted(AsyncGPUReadbackRequest request)
    {
        // Render texture no longer needed, it has been read back.
        DestroyImmediate(renderTexture);

        using (var imageBytes = request.GetData<byte>())
        {
            // Do something with the pixel data.
        }
    }

    Vector2 GetSizeOfDisplay(int index)
    {
        // Fall back to the main display if the requested index is out of range.
        int displayCount = Display.displays.Length;
        index = (index >= displayCount) ? 0 : index;
        return new Vector2(
            Display.displays[index].systemWidth,
            Display.displays[index].systemHeight);
    }
}
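For what it is worth, one workaround I am considering is to skip ScreenCapture entirely and read the target display's backbuffer directly with Texture2D.ReadPixels, after binding that display's color buffer as the active render target. This is only a sketch, not something I have verified on the standalone player; in particular I am assuming that Display.colorBuffer and Display.depthBuffer of an activated secondary display can be bound with Graphics.SetRenderTarget like this, and SecondaryDisplayCapture is just a name I made up:

    using System.Collections;
    using UnityEngine;

    public class SecondaryDisplayCapture : MonoBehaviour
    {
        IEnumerator CaptureDisplay(int index)
        {
            // ReadPixels must run after the frame has been fully rendered.
            yield return new WaitForEndOfFrame();

            Display display = Display.displays[index];
            int width = display.systemWidth;
            int height = display.systemHeight;

            // Assumption: the backbuffer of an activated display can be bound
            // as the current render target in the standalone player.
            Graphics.SetRenderTarget(display.colorBuffer, display.depthBuffer);

            // Copy the bound backbuffer into a CPU-readable texture. It should
            // contain the composited output of every camera that renders to
            // this display (both eye cameras in the side-by-side case).
            var tex = new Texture2D(width, height, TextureFormat.RGBA32, false);
            tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
            tex.Apply();
            Graphics.SetRenderTarget(null);

            // Do something with tex, or Graphics.Blit it into a RenderTexture.
        }
    }

If that does not work, the other idea I have is to give both eye cameras the same RenderTexture as targetTexture and use their normalized viewport rects (Camera.rect) to place each eye in its own half, which would preserve the side-by-side layout without going through ScreenCapture at all.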