Hello, I’m working with a depth camera and want to recreate a point cloud remotely on another device. To do that I send the depth information (2 textures) over the network. The problem is that when I do this, the textures change their format (from RGBAFloat to RGBA32), and as a consequence my shader is unable to render the entire point cloud. I tried the approach below to set the format back on the texture, but it didn’t work. Any ideas?
XYZTexture = SetFormat(tmpXYZ, TextureFormat.RGBAFloat);

Texture2D SetFormat(Texture2D source, TextureFormat newFormat)
{
    Texture2D newTexture = new Texture2D(source.width, source.height, newFormat, false);
    newTexture.filterMode = FilterMode.Point;
    newTexture.wrapMode = TextureWrapMode.Clamp;
    newTexture.SetPixels(source.GetPixels());
    newTexture.Apply();
    return newTexture;
}
The GetPixels/SetPixels family of functions is a bit outdated and only works on the RGBA32, RGB24, or Alpha8 texture formats. It’s probably doing an implicit conversion to RGBA32 when you call GetPixels.
GetRawTextureData and LoadRawTextureData should work better with RGBAFloat.
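Something like this sketch should keep the RGBAFloat data intact end to end (FloatTextureTransfer, Pack/Unpack, and however you ship the bytes plus the width/height over the network are placeholders; the important part is that the receiver recreates the texture with the same size and format before loading the raw bytes back in):

using UnityEngine;

public static class FloatTextureTransfer
{
    // Sender: grab the raw bytes of the RGBAFloat texture; no conversion to RGBA32 happens here.
    public static byte[] Pack(Texture2D source)
    {
        return source.GetRawTextureData();
    }

    // Receiver: recreate the texture with the same size and RGBAFloat format,
    // load the raw bytes back in, and upload them to the GPU with Apply().
    public static Texture2D Unpack(byte[] data, int width, int height)
    {
        Texture2D texture = new Texture2D(width, height, TextureFormat.RGBAFloat, false);
        texture.filterMode = FilterMode.Point;
        texture.wrapMode = TextureWrapMode.Clamp;
        texture.LoadRawTextureData(data);
        texture.Apply();
        return texture;
    }
}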
Thanks a lot for the response, but in the end I got the same results. This made me think the problem is on the server, because the texture I use on the server is copied via a RenderTexture and ReadPixels to make it readable (the original lives on the GPU); maybe that copy is where the damage is done.
I did this on the server:
byte[] Colorbytes = colorTexture.GetRawTextureData(); // Getting the raw data from the original texture
Texture2D tmp = new Texture2D(colorTexture.width, colorTexture.height, colorTexture.format, false);
tmp.LoadRawTextureData(Colorbytes);
rawImage.texture = tmp;
and realized that GetRawTextureData() wasn’t working with these textures, because the RawImage doesn’t display anything.
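One thing to check in the snippet above: LoadRawTextureData only updates the CPU-side copy, so without a tmp.Apply() call afterwards the data never reaches the GPU and the RawImage stays blank even if GetRawTextureData returned valid bytes. And if the ReadPixels readback on the server is where the format gets lost, a sketch along these lines (assuming the depth data sits in a float RenderTexture such as RenderTextureFormat.ARGBFloat, and that the platform supports float readback) keeps the destination texture in RGBAFloat:

using UnityEngine;

public static class DepthReadback
{
    // Assumes the source is a float RenderTexture (e.g. RenderTextureFormat.ARGBFloat).
    // Creating the destination Texture2D as RGBAFloat keeps ReadPixels from
    // quantizing the data down to 8 bits per channel.
    public static Texture2D ReadBackFloatTexture(RenderTexture source)
    {
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = source;

        Texture2D result = new Texture2D(source.width, source.height, TextureFormat.RGBAFloat, false);
        result.ReadPixels(new Rect(0, 0, source.width, source.height), 0, 0);
        result.Apply();

        RenderTexture.active = previous;
        return result;
    }
}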