Blit, CopyTexture, ReadPixels, LoadRawTextureData confusion

I need to read pixel data from a RenderTexture slice so I can send it over the network. I normally use Blit to read/write to these, but now I need to get the data off the GPU and convert it to a byte[] array to send it. It’s not quite working, and while trying to find out why, I’m getting different answers for the texture content depending on how I read it, which I’m hoping someone can explain. Here’s how my RenderTextures are set up:

RenderTexture alphaTextures=new RenderTexture(256,256,0,RenderTextureFormat.RHalf,RenderTextureReadWrite.Linear);
alphaTextures.enableRandomWrite=true;
alphaTextures.dimension=UnityEngine.Rendering.TextureDimension.Tex2DArray;
alphaTextures.volumeDepth=8;
alphaTextures.Create();

When I Blit a RenderTexture to a slice in alphaTextures, it works just fine. But say I have a byte[] array of size 256*256*2 (RHalf = 2 bytes per pixel) and I want to put it into slice index of alphaTextures; then the following should work:

int texSize=256*256*2;
byte[] byteTex=new byte[texSize];
for (int i=0; i<texSize; ++i) {
  byteTex[i]=100; //some test value
} //for
Texture2D texture=new Texture2D(256,256,TextureFormat.RHalf,false,true);
texture.LoadRawTextureData(byteTex);
texture.Apply(false);
Graphics.Blit(texture,alphaTextures,0,index);

However, no matter what I set byteTex to, the result in-game is always that the slice seems all 1s (without the Blit, it’s unchanged, so it is really that line). When I try to see the contents of the slice, however, I get different results for each of these, and neither of them matches what I see in-game:

//Option 1
Graphics.CopyTexture(alphaTextures,index,texture,0);
something=texture.GetRawTextureData();
//Option 2
Graphics.SetRenderTarget(alphaTextures,0,CubemapFace.Unknown,index);
texture.ReadPixels(new Rect(0,0,256,256),0,0,false);
texture.Apply(false);
something=texture.GetRawTextureData();

Can anyone tell me how to properly set a RenderTexture slice through a byte[ ] array?

Have not done it in a while but I found it easier to use compute buffers to read render texture data back. Have working code if you want it.

Sure, that could be helpful! Have you also used them to write data to render textures?

EDIT: I’ve been looking into it and one problem I foresee is that my rendertexture is in RHalf format, but as far as I understand half-precision floats are not supported in compute shaders. I see posts on how to solve this for StructuredBuffers (packing a few values into a uint) but I don’t think there’s a solution for the render texture itself, is there? RWTexture2D gets interpreted as a float I believe.
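In the meantime, for sanity-checking what a byte[] of RHalf data actually contains, I can at least decode it on the CPU. This is just a sketch and assumes Mathf.HalfToFloat (which takes the raw 16-bit half pattern) and little-endian byte order:

float[] DecodeRHalf(byte[] raw) {
  //Each RHalf pixel is 2 bytes; reassemble the 16-bit pattern and convert.
  float[] result=new float[raw.Length/2];
  for (int i=0; i<result.Length; ++i) {
    ushort bits=(ushort)(raw[2*i] | (raw[2*i+1]<<8));
    result[i]=Mathf.HalfToFloat(bits);
  } //for
  return result;
} //DecodeRHalf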

I’ve been experimenting some more with the code I posted, and found that Graphics.Blit when called without a material always fills the target render texture with 1s, regardless of what the source texture is! This is unexpected because I thought it would just copy the texture. Instead I’m now using Graphics.CopyTexture.

Unfortunately that still doesn’t solve all my problems because I’m still having trouble reading byte data from the render texture in the first place. Regardless of the contents of my render texture slice, when I use read option 1 (see OP) every byte has value 205, and when I use read option 2 every byte has value 0. Does anyone know why?

EDIT: Found a way that works, though I don’t know why this works while other things do not. If I Blit the slice I want from the render texture array to a temporary render texture and then make that the render target via RenderTexture.active, then I can read from it successfully. Setting a slice as an active render target with Graphics.SetRenderTarget should also work though, so not sure what I’m doing wrong there!
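For reference, here’s a condensed sketch of the path that works for me, assuming the same 256x256 RHalf setup as in my first post (alphaTextures and index as before):

RenderTexture prev=RenderTexture.active;
RenderTexture temp=RenderTexture.GetTemporary(256,256,0,RenderTextureFormat.RHalf,RenderTextureReadWrite.Linear);
//Copy the slice I want out of the array into the temporary render texture.
Graphics.Blit(alphaTextures,temp,index,0);
//Make the temporary texture the active render target and read it back.
RenderTexture.active=temp;
Texture2D texture=new Texture2D(256,256,TextureFormat.RHalf,false,true);
texture.ReadPixels(new Rect(0,0,256,256),0,0,false);
texture.Apply(false);
byte[] bytes=texture.GetRawTextureData();
RenderTexture.active=prev;
RenderTexture.ReleaseTemporary(temp);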

The test script below creates a render texture and copies some data from an array into it.
The data is then read back from the render texture into another array.

I only have shaders for float format render textures with 1-4 channels, but it’s possible to do the same thing with half, int, uint and byte formats if I remember correctly.

These are the two utility scripts and two compute shaders needed.

ReadFloat compute shader
WriteFloat compute shader
CBRead utility script
CBWrite utility script

And here’s an example of writing some data into a render texture and then reading it back out.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

using Common.Unity.Utility;

public class Test : MonoBehaviour
{

    //These are the Read and Write compute shaders attached to the forum post.
    public ComputeShader readShader, writeShader;

    void Start()
    {
        int width = 8;
        int height = 8;
        int channels = 4;
        int size = width * height;
        int stride = sizeof(float) * channels;
        RenderTextureFormat format = RenderTextureFormat.ARGBFloat;

        //Create a render texture to write the data into.
        var tex = new RenderTexture(width, height, 0, format, RenderTextureReadWrite.Linear);
        tex.enableRandomWrite = true;
        tex.Create();

        //Create a buffer to take the test data from the cpu and transfer it to the gpu.
        var write_buffer = new ComputeBuffer(size, stride);
        write_buffer.SetData(CreateTestData(size * channels));

        //Run the shader that will copy the data into the renderer texture.
        //CBWrite is a utility script that is attached to forum post.
        CBWrite.IntoRenderTexture(tex, channels, write_buffer, writeShader);

        //Test data is now in the render texture.

        //Create a new empty buffer to copy the contents of the render texture into.
        //You could reuse the write_buffer, but I want to keep the test clear.
        var read_buffer = new ComputeBuffer(size, stride);

        //Run the shader that will copy the data from the render texture into the compute buffer.
        //CBRead is a utility script that is attached to forum post.
        CBRead.FromRenderTexture(tex, channels, read_buffer, readShader);

        //Create an empty array to get the data back from the compute buffer.
        //The buffer holds channels floats per pixel, so size the array to match.
        float[] data = new float[size * channels];
        //Transfer the data back from the gpu to the cpu and into the array.
        read_buffer.GetData(data);

        //Do whatever you want with the data.
        for (int i = 0; i < data.Length; i++)
            Debug.Log(data[i]);

    }

    float[] CreateTestData(int size)
    {
        float[] data = new float[size];

        for (int i = 0; i < size; i++)
            data[i] = i;

        return data;
    }

}

I’m not sure about the other issues you are having with the slicing.


Thanks @scrawk, I had not thought of using GetData as you suggest to confirm the 3D render texture has all the correct data! :slight_smile: