Made an empty project, added a new camera with a RenderTexture, and tried to save it to disk. Either get a completely black image or an image with a black line as attached below. The image sometimes shows completely black and sometimes as below randomly, with the black area changing in size.
The code is simple and straightforward, and I've tried multiple approaches; here's an example of the current one:
async void RunReadPixel(RenderTexture mainTex)
{
    RenderTexture.active = mainTex;
    // Explicitly set to R8_UNorm on the RenderTexture and here, as mainTex.graphicsFormat gives an incorrect format.
    // I have also tried multiple other formats, and mainTex.graphicsFormat with the default RGBA32.
    var bridge = new Texture2D(mainTex.width, mainTex.height, GraphicsFormat.R8_UNorm, TextureCreationFlags.None);
    bridge.ReadPixels(new Rect(0, 0, mainTex.width, mainTex.height), 0, 0);
    RenderTexture.active = null;
    var bytes = bridge.EncodeToJPG();
    await File.WriteAllBytesAsync(SOME_PATH_HERE, bytes);
}
I believe there has been an issue with ReadPixels on Quest/Vulkan for quite a while. Looks like there is already a bug report. Couldn’t hurt to vote on it. For now, the only workaround might be to use OpenGL.
Yes, I made this issue. Unity closed it today saying it's a texture format mismatch, but I've tried explicitly setting the same format on both the RenderTexture and the Texture2D, as well as multiple other formats. It does not work.
That’s too bad. If you want to post a more complete test script (i.e. exactly where and how you are getting the RenderTexture from), I can try it out in my project.
I made a default one through New > RenderTexture in the Assets folder, and also tried creating one at runtime so I could get the platform-specific format. Both had similar results. The code above is essentially the entire script; the RenderTexture settings are attached below.
I took a look at this and ran into the same issues. Rendering into the render texture worked, saving a texture worked, but the ReadPixels call never retrieves valid data; I only seem to get black results. No combination of RenderTexture creation method or texture format seems to work (either setting it explicitly or getting it from the RenderTexture).
I would like to be able to switch to Vulkan on Quest and this is a nice feature to have so I will look into it a bit more. Let me know if you find a way to get this working and I will do the same.
Yep, all actions on the GPU work fine (Blitting etc), but as soon as you try to get the data to the CPU then you’re out of luck.
Some things I've tried:
- Moving it to a coroutine and waiting a frame shows the half-black image more often.
- WaitForEndOfFrame almost always shows a fully black image.
- Storing the Opaque/Depth RenderTexture in the URP settings occasionally makes GetRawTextureData return the full image, although the objects appear transparent/incorrect.
- Load/store actions also didn't seem to have an effect.
- Late-acquiring the swapchain image didn't do anything, and neither did decreasing the swapchain buffers to 2. I'm not sure which buffer Unity reads from, or how it's stored, without looking at the source.
Another point that might be beneficial: I have made a custom Render Feature that culls and renders overlay objects, since the default overlay render feature (unsurprisingly) doesn't work at all. Its execution timeframe is set to "After Rendering", and the objects on these layers usually show up correctly in the images while the rest of the image is black. Hence I'm quite certain it's not format-related.
I didn't have any luck with ReadPixels() specifically; waiting a frame did sometimes capture some of the image, but it was always partially corrupted, as if it had been grabbed mid-rendering. I did find a way to write out a render texture by manually rendering a camera (disabled in the scene) with a CommandBuffer:
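For anyone following along, here is a rough sketch of that general approach, reconstructed from the thread rather than the exact script from this post: manually render a disabled camera into the RenderTexture (the sketch uses Camera.Render() as a simpler stand-in for the CommandBuffer path), then pull the pixels back with AsyncGPUReadback instead of ReadPixels. Names like `captureCamera` and `savePath` are placeholders you'd set yourself.

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Experimental.Rendering;

public class RenderTextureSaver : MonoBehaviour
{
    public Camera captureCamera;        // camera disabled in the scene
    public RenderTexture targetTexture; // the RenderTexture to capture
    public string savePath;             // placeholder output path

    public void Capture()
    {
        captureCamera.targetTexture = targetTexture;
        captureCamera.Render();         // manual render of the disabled camera

        // Read back asynchronously instead of ReadPixels.
        AsyncGPUReadback.Request(targetTexture, 0, TextureFormat.RGBA32, OnReadback);
    }

    void OnReadback(AsyncGPUReadbackRequest request)
    {
        if (request.hasError)
        {
            Debug.LogError("GPU readback failed");
            return;
        }

        // Copy the data into a CPU-side texture and encode it.
        var tex = new Texture2D(targetTexture.width, targetTexture.height,
                                TextureFormat.RGBA32, false);
        tex.LoadRawTextureData(request.GetData<byte>());
        tex.Apply();
        File.WriteAllBytes(savePath, tex.EncodeToJPG());
        Destroy(tex);
    }
}
```

This is only a proof-of-concept shape; whether the readback returns valid data on Quest/Vulkan is exactly what's under discussion here.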
I used this code and it worked perfectly. Thank you for providing it. I am getting three red errors when I use it though:
AsyncGPUReadback - NativeArray should not be undisposable
AsyncGPUReadback - NativeArray does not have read/write access
UnityException: No texture data provided to LoadRawTextureData
The code works fine and I can live with the errors if necessary; I just wondered if you knew a way to resolve them. From my understanding, the first error is because you are creating a persistent allocation for "data" and then not deallocating it in the same scope (it is deallocated inside the readbackRequest). The other two errors I have no idea about.
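Untested speculation on those three errors: since they all revolve around the caller-owned NativeArray, one way I'd expect them to disappear is to not pass your own persistent NativeArray into the request at all, and instead let the readback own its buffer, copying out of `request.GetData<byte>()` inside the callback (it's only valid there). A minimal sketch, with `rt` standing in for your RenderTexture:

```csharp
// Instead of pre-allocating a NativeArray and passing it in,
// let AsyncGPUReadback allocate and manage its own buffer:
AsyncGPUReadback.Request(rt, 0, TextureFormat.RGBA32, request =>
{
    if (request.hasError)
        return; // also guards LoadRawTextureData against empty data

    // GetData is only valid inside this callback; consume it immediately.
    var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
    tex.LoadRawTextureData(request.GetData<byte>());
    tex.Apply();
    // ... encode/save here, then Destroy(tex)
});
```

If you do need a caller-owned buffer (e.g. RequestIntoNativeArray), it would have to be allocated with Allocator.Persistent and disposed only after the request completes, which matches your reading of the first error.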
No, sorry, I haven't looked at this since. It was simply a proof-of-concept for the general method, so I stopped once I saw actual textures being written to file on the headset. It can take some experimentation to find the way Unity expects you to interact with some APIs, as the documentation itself is pretty skeletal.
I'm not sure if this is related, but I'm having massive problems getting Vulkan to capture a camera output to a RenderTexture on Meta Quest. Simple setup: camera outputting to a render texture, drag the RenderTexture onto a plane, Unity creates a standard material. Works fine on OpenGL / DX11. On Vulkan for Android / Meta it always shows a spectrum of blue-to-red colors representing what looks like motion or depth. I can't get the normal camera output.