I need to read pixel data from a RenderTexture slice so I can send it over the network. I normally use Blit to read/write to these, but here I need to get the data out of the GPU and convert it to a byte[] array to send. It's not quite working, and when I try to find out why, I get different answers for the texture's contents depending on how I read it, which I'm hoping someone can explain. Here's how my RenderTextures are set up:
RenderTexture alphaTextures = new RenderTexture(256, 256, 0, RenderTextureFormat.RHalf, RenderTextureReadWrite.Linear);
alphaTextures.enableRandomWrite = true;
alphaTextures.dimension = UnityEngine.Rendering.TextureDimension.Tex2DArray;
alphaTextures.volumeDepth = 8;
alphaTextures.Create();
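For context, the Blit-into-a-slice path that does work for me is just the overload that takes depth slices (srcRT here is a placeholder for whatever single 256x256 RHalf RenderTexture I'm copying from):

```
// Copy srcRT (a plain 256x256 RHalf RenderTexture) into one layer of the array.
// The last two ints are sourceDepthSlice and destinationDepthSlice.
Graphics.Blit(srcRT, alphaTextures, 0, sliceIndex);
```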
When I Blit a RenderTexture to a slice in alphaTextures, it works just fine. But say I have a byte[] array of size 256*256*2 (RHalf = 2 bytes per pixel) and I want to put it into slice index of alphaTextures; then the following should work:
int texSize = 256 * 256 * 2;
byte[] byteTex = new byte[texSize];
for (int i = 0; i < texSize; ++i) {
    byteTex[i] = 100; // some test value
}
Texture2D texture = new Texture2D(256, 256, TextureFormat.RHalf, false, true);
texture.LoadRawTextureData(byteTex);
texture.Apply(false);
Graphics.Blit(texture, alphaTextures, 0, index);
However, no matter what I set byteTex to, the in-game result is always that the slice appears to be all 1s (without the Blit it's unchanged, so it really is that line). When I try to inspect the contents of the slice, though, I get different results from each of the following two options, and neither matches what I see in-game:
// Option 1
Graphics.CopyTexture(alphaTextures, index, texture, 0);
something = texture.GetRawTextureData();

// Option 2
Graphics.SetRenderTarget(alphaTextures, 0, CubemapFace.Unknown, index);
texture.ReadPixels(new Rect(0, 0, 256, 256), 0, 0, false);
texture.Apply(false);
something = texture.GetRawTextureData();
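For what it's worth, I believe there's also a third readback path I could sketch (untested on my end; it assumes Unity 2018.2+, where AsyncGPUReadback exists, and that GetData's layer argument selects the array slice):

```
using Unity.Collections;
using UnityEngine.Rendering;

AsyncGPUReadback.Request(alphaTextures, 0, request => {
    if (request.hasError) { Debug.LogError("GPU readback failed"); return; }
    // For a Tex2DArray, GetData takes the layer index to fetch.
    NativeArray<byte> raw = request.GetData<byte>(index);
    byte[] bytes = raw.ToArray(); // should be 256*256*2 bytes for RHalf
});
```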
Can anyone tell me how to properly set a RenderTexture slice through a byte[] array?
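Edit: one thing I noticed while writing this up: setting every byte to 100 makes each RHalf texel the bit pattern 0x6464, which as a half-float is 2^10 * (1 + 100/1024) = 1124, not 100, so my test data may not mean what I assumed. Filling the buffer with an actual half-precision value would look something like this sketch using Mathf.FloatToHalf (0.5f is just an example value):

```
// Mathf.FloatToHalf returns the 16-bit half representation as a ushort.
ushort half = Mathf.FloatToHalf(0.5f); // the value I really want per texel
byte[] byteTex = new byte[256 * 256 * 2];
for (int i = 0; i < byteTex.Length; i += 2) {
    byteTex[i]     = (byte)(half & 0xFF); // low byte first (little-endian)
    byteTex[i + 1] = (byte)(half >> 8);   // high byte
}
```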