EDIT 2: From Unity 5.2 onwards, you can no longer call glReadPixels from Unity's main thread and expect the correct RenderTexture/framebuffer to be active. You have to move the call into a low-level native rendering plugin and bind the framebuffer manually in the plugin before reading the pixels back. For more information on this, see the newest comments to this answer.
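A minimal sketch of the C# side of that approach, assuming the plugin exports a render-event callback through a function named GetRenderEventFunc (both the plugin name and that export are placeholders; the callback itself would bind the framebuffer and call glReadPixels on the render thread):

using System;
using System.Collections;
using System.Runtime.InteropServices;
using UnityEngine;

public class NativePixelReadback : MonoBehaviour
{
    // Hypothetical plugin export that returns the UnityRenderingEvent callback.
    [DllImport("NameOfYourDLL")]
    private static extern IntPtr GetRenderEventFunc();

    IEnumerator Start()
    {
        while (true)
        {
            yield return new WaitForEndOfFrame();
            // Schedules the plugin callback on Unity's render thread,
            // where the correct GL context and framebuffer are current.
            GL.IssuePluginEvent(GetRenderEventFunc(), 1);
        }
    }
}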
EDIT: (Only works for Unity 5.1 and older) All of the below is still correct, but an easier way to implement it is to import the OpenGL library directly in C# (no need to build an extra C++ DLL).
// Requires "using System;" and "using System.Runtime.InteropServices;" at the top of the script.
[DllImport("opengl32")]
public static extern void glReadPixels (int x, int y, int width, int height, int format, int type, IntPtr buffer);
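A hedged usage sketch, assuming it is called from OnPostRender() of the camera that renders into the RenderTexture so the right framebuffer is bound; GL_RGBA and GL_UNSIGNED_BYTE are standard OpenGL enum values, and width/height are placeholders for the texture size:

const int GL_RGBA = 0x1908;            // standard OpenGL constants
const int GL_UNSIGNED_BYTE = 0x1401;

int size = width * height * 4;                    // RGBA, 4 bytes per pixel
IntPtr buffer = Marshal.AllocHGlobal(size);       // unmanaged scratch buffer
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

byte[] pixels = new byte[size];
Marshal.Copy(buffer, pixels, 0, size);            // copy back into managed memory
Marshal.FreeHGlobal(buffer);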
This might be a little late, but could be useful for someone else.
If the script is attached to the camera that renders into the RenderTexture, you can put your ReadPixels() call into
void OnPostRender () {}
Then the required RenderTexture is already active and you don’t need to spend time changing it.
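A minimal sketch of that setup, assuming the script is attached to a camera whose targetTexture is the RenderTexture in question (class name and texture format are placeholders):

using UnityEngine;

public class RenderTextureReader : MonoBehaviour
{
    private Texture2D tex;

    void Start()
    {
        var rt = GetComponent<Camera>().targetTexture;
        tex = new Texture2D(rt.width, rt.height, TextureFormat.RGB24, false);
    }

    void OnPostRender()
    {
        // The camera's target RenderTexture is already active here,
        // so no RenderTexture.active switch is needed.
        tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
        tex.Apply();
    }
}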
Because Unity probably wants a copy of all textures in CPU memory, ReadPixels can be very slow - especially when dealing with large textures. To get around this, the only solution I could find was to run Unity in OpenGL mode (add "-force-opengl" to the target path of Unity's shortcut) and then write a plugin (DLL) that calls the OpenGL function glReadPixels. That function transfers data directly from the currently active framebuffer (the RenderTexture) into CPU memory. With this solution, fetching a 1920x1080 RenderTexture into CPU memory drops from 54 ms to 12 ms.
The C/C++ DLL source (place glext.h in the source folder):
#include "stdafx.h"
#include <gl\GL.h>
#include <gl\GLU.h>
#include <stdlib.h>
#include "glext.h"
#pragma comment(lib, "opengl32.lib")
using namespace std;
extern "C" __declspec(dllexport) int GetPixels(void* buffer, int x, int y, int width, int height) {
if (glGetError())
return -1;
glReadPixels(x, y, width, height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, buffer);
if (glGetError())
return -2;
return 0;
}
Add the function to Unity (C#) like this (place the DLL in Assets\Plugins\):
[DllImport ("NameOfYourDLL")]
private static extern int GetPixels(IntPtr buffer, int x, int y, int width, int height);
And use it like any other function. The IntPtr buffer is a pointer to where you want to store your image. I get the pointer from another DLL - you could also get it like this:
byte[] pixels = new byte[width * height * 4];   // BGRA, 4 bytes per pixel

unsafe
{
    fixed (byte* ptr = pixels)   // pin the array so the GC can't move it
    {
        IntPtr buffer = (IntPtr)ptr;
        GetPixels (buffer, ...);
    }
}
However, that requires you to allow unsafe code. Alternatively, you could pin a byte array and get a pointer to it through the marshalling API, as in the sketch below.
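For example, a sketch of that alternative using GCHandle from System.Runtime.InteropServices, which avoids unsafe code (the resolution values are just the ones used above):

byte[] pixels = new byte[1920 * 1080 * 4];   // BGRA, 4 bytes per pixel
GCHandle handle = GCHandle.Alloc(pixels, GCHandleType.Pinned);
try
{
    int result = GetPixels(handle.AddrOfPinnedObject(), 0, 0, 1920, 1080);
}
finally
{
    handle.Free();   // always unpin so the GC can move/collect the array again
}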
I am using this to fetch two full-HD renders each frame at ~24 fps and then keying those onto a 3G-SDI video stream using two DeckLink hardware keyers. The bottleneck is fetching the data from the GPU, since the read-back stalls the rendering pipeline. If anyone finds a faster (real-time) solution for this, please let us know =)