glReadPixels from a RenderTexture

Hi.

I have a camera that renders to a RenderTexture.

I would like to read that data back with glReadPixels. My native plugin gets called via GL.IssuePluginEvent in the OnPostRender method of a script attached to that specific camera.

Unfortunately, the pixels I am getting back are from my main camera view, even if I set RenderTexture.active = myRenderTexture before my native call.

The documentation at Unity - Manual: Low-level native plug-in interface, which discusses plugin callbacks on the rendering thread, uses the words “the camera”. Either they actually mean “all the cameras”, or I am doing something wrong.

Has anyone been able to call glReadPixels from a RenderTexture? There are many similar questions on the site, but all of the ones I’ve found either predate multithreaded rendering or are unanswered.

Please help me.
Thanks!

I call it in Update, but you should be able to call it from any function. Note that you won’t always get the data back immediately, so you should not try to read into a function’s local variables. All I did to test it was download the native plugin example, modify the SetTextureFromUnity and OnRenderEvent functions, and add a ReadPixels function (and delete the code for APIs other than OpenGL, etc.), which should make it easy to replicate.

This is the modified SetTextureFromUnity from RenderingPlugin.cpp:

// --------------------------------------------------------------------------
// SetTextureFromUnity, an example function we export which is called by one of the scripts.

static void* g_DataHandle = 0;
static int   g_TextureWidth  = 0;
static int   g_TextureHeight = 0;

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API SetTextureFromUnity(void* dataHandle, int w, int h)
{
	g_DataHandle = dataHandle;
	g_TextureWidth = w;
	g_TextureHeight = h;
}

This is the modified OnRenderEvent from RenderingPlugin.cpp:

static void UNITY_INTERFACE_API OnRenderEvent(int eventID)
{
	// Unknown / unsupported graphics device type? Do nothing.
	if (s_CurrentAPI == NULL)
		return;

	s_CurrentAPI->ReadPixels(g_DataHandle, g_TextureWidth, g_TextureHeight);
}

This is the actual ReadPixels implementation in RenderAPI_OpenGLCoreES.cpp:

void RenderAPI_OpenGLCoreES::ReadPixels(void* data, int textureWidth, int textureHeight)
{
	GLint currentFBORead;
	GLint currentFBOWrite;
	glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &currentFBORead);
	glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &currentFBOWrite);

	// Read from the FBO that is currently bound for drawing.
	glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBOWrite);

	glReadPixels(0, 0, textureWidth, textureHeight, GL_RGBA, GL_FLOAT, data);

	// Restore the previous read framebuffer.
	glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBORead);

	// You can uncomment this and pass in a pointer to a Texture2D ("textureHandle")
	// to copy the data directly into it here:
	//
	// GLuint texResult = (GLuint)(size_t)(textureHandle);
	// glBindTexture(GL_TEXTURE_2D, texResult);
	// glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, textureWidth, textureHeight, GL_RGBA, GL_FLOAT, data);
}

I don’t know if you can read directly into a Unity Texture2D, so I added code that copies the data (a float array) into one.

My test Unity script.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System.Runtime.InteropServices;
using System;

public class Test : MonoBehaviour {

	[DllImport("RenderingPlugin")]
	private static extern void SetTextureFromUnity(float[] data, int w, int h);
	[DllImport("RenderingPlugin")]
	private static extern IntPtr GetRenderEventFunc();

	public RenderTexture rTex;
	public Texture2D tex;

	float[] data = new float[0];
	int size = 1024;

	// Use this for initialization
	void Start () {

		tex = new Texture2D (size, size, TextureFormat.RGBAFloat, false);
		tex.Apply ();
		rTex = new RenderTexture (size, size, 0, RenderTextureFormat.ARGBFloat);

		data = new float[size * size * 4];
			
	}
	
	// Update is called once per frame
	void Update () {

		ReadBack ();

	}

	void ReadBack () {

		RenderTexture.active = rTex;
		SetTextureFromUnity (data, size, size);
		GL.IssuePluginEvent (GetRenderEventFunc (), 1);

		Color[] colors = new Color[data.Length/4];
		for (int i = 0; i < data.Length; i+=4) {

			colors [i / 4] = new Color (data [i], data [i + 1], data [i + 2], data [i + 3]);
		}

		tex.SetPixels (colors);
		tex.Apply ();
	}

	void OnGUI () {

		GUI.DrawTexture (new Rect (Vector2.zero, Vector2.one * size), rTex);
		GUI.DrawTexture (new Rect (Vector2.right * size, Vector2.one * size), tex);

	}

	void OnRenderImage (RenderTexture src, RenderTexture dest) {

		Graphics.Blit (src, rTex);
		Graphics.Blit (src, dest);

	}

}
Although I’m basically only reading the framebuffer here, you can invert the colors of rTex (or use a different RenderTexture, etc.) to see that it is actually reading from the one set active in ReadBack(). There is also code in there that turns the data array into a Texture2D, but that is just to show that it works.

It seems the FBO that Unity binds to GL_READ_FRAMEBUFFER in OnPostRender is not the same as the one being written to.

When I try to bind the read framebuffer to the same FBO as the draw framebuffer, Unity crashes:

static void UNITY_INTERFACE_API OnGLReadPixelEvent(int eventID)
{
	GLint currentFBORead;
	GLint currentFBOWrite;
	glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &currentFBORead);
	glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &currentFBOWrite);

	// Set the read frame buffer 
	glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBOWrite);

	// Read the pixels
	glReadPixels(readPixelsX, readPixelsY, readPixelsWidth, readPixelsHeight, readPixelsFormat, readPixelsType, readPixelsDestination);

	// Restore the read frame buffer
	glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBORead);

}

Is it at all possible to use glReadPixels? It seems RenderTextures need to be copied into a texture before we can read them. Texture2D does not support reading ARGBInt, and even if it did, glReadPixels should be a lot faster.

Am I using glBindFramebuffer correctly? Is there a way to get Unity to bind the read framebuffer as well?
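
One thing I might try (an untested sketch; it assumes I can fetch the RenderTexture’s GL texture id on the C# side with rTex.GetNativeTexturePtr() and pass it through to the plugin): attach that texture to a private FBO and read from that, instead of relying on whatever FBO Unity has bound at the time.

static GLuint g_ReadFBO = 0;

static void ReadPixelsFromTexture(GLuint texId, int width, int height, void* data)
{
	// Lazily create a private FBO used only for reading back.
	if (g_ReadFBO == 0)
		glGenFramebuffers(1, &g_ReadFBO);

	GLint prevRead;
	glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &prevRead);

	// Attach the RenderTexture's color buffer to our FBO and read from that.
	glBindFramebuffer(GL_READ_FRAMEBUFFER, g_ReadFBO);
	glFramebufferTexture2D(GL_READ_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texId, 0);

	if (glCheckFramebufferStatus(GL_READ_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE)
		glReadPixels(0, 0, width, height, GL_RGBA, GL_FLOAT, data);

	// Restore the framebuffer Unity had bound for reading.
	glBindFramebuffer(GL_READ_FRAMEBUFFER, prevRead);
}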

Thanks!

void RenderAPI_D3D9::ReadPixels(void* bufferHandle, int textureWidth, int textureHeight)
{
	// @TODO: how?
}

void RenderAPI_D3D12::ReadPixels(void* bufferHandle, int textureWidth, int textureHeight)
{
	// @TODO: how?
}

void RenderAPI_D3D11::ReadPixels(void* bufferHandle, int textureWidth, int textureHeight)
{
	// @TODO: how?
}