Raw pixel rendering to screen in Quest2?

Hello there, I'm working on a ray-tracing project.

I'm facing a problem: drawing the ray-tracing output onto a mesh is nearly impossible, so I thought I could use Graphics.Blit() to output the RenderTexture my ComputeShader is generating. This works just fine on PC and mobile, but when I tried to port it to the Oculus Quest 2 something broke, so I think the problem is the render pass?
I'm just guessing, but maybe the problem comes from calling Graphics.Blit() inside the OnRenderImage method, because only half of the screen is painted with the ComputeShader output.
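For reference, the pattern described presumably looks something like this (names are illustrative, not from the original post). Plain Graphics.Blit is not stereo-aware, so on a single-pass stereo target (double-wide texture or texture array) it can end up writing to only one eye's half of the render target:

```csharp
using UnityEngine;

// Attached to the camera; "rayTracingOutput" stands in for whatever
// RenderTexture the compute shader dispatch fills each frame.
public class RayTracingBlit : MonoBehaviour
{
    public RenderTexture rayTracingOutput;

    void OnRenderImage(RenderTexture src, RenderTexture dst)
    {
        // Works on a mono (PC/mobile) target, but is not stereo-aware:
        // on Quest 2 with single-pass stereo only half the target is covered.
        Graphics.Blit(rayTracingOutput, dst);
    }
}
```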

So my question is:

Is there a way to draw pixels directly to the graphics API of the Quest 2 using Unity?
Or is there a way to use the Graphics.Blit method to render to both eyes of the device?

I believe what you're looking for is a Custom Pass. Also, if you read the OnRenderImage documentation, it says:

OnRenderImage is not supported in the Scriptable Render Pipeline. To create custom fullscreen effects in the Universal Render Pipeline (URP), use the ScriptableRenderPass API. To create custom fullscreen effects in the High Definition Render Pipeline (HDRP), use a Fullscreen Custom Pass.
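A minimal sketch of what that looks like in URP, assuming URP 12-era API names (`cameraColorTarget` was later replaced by `cameraColorTargetHandle`); `rayTracingOutput` is a placeholder for the compute shader's RenderTexture, and on Quest the blit still relies on the pipeline's stereo-aware blit path rather than OnRenderImage:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Pass that copies the compute shader output over the camera target.
class RayTracingBlitPass : ScriptableRenderPass
{
    public RenderTexture rayTracingOutput;

    public RayTracingBlitPass()
    {
        renderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        if (rayTracingOutput == null) return;
        CommandBuffer cmd = CommandBufferPool.Get("RayTracingBlit");
        // CommandBuffer.Blit goes through the render pipeline's target
        // handling, unlike Graphics.Blit inside OnRenderImage (which URP
        // does not support at all).
        cmd.Blit(rayTracingOutput, renderingData.cameraData.renderer.cameraColorTarget);
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
}

// Feature that injects the pass; add it to the active URP Renderer asset.
class RayTracingBlitFeature : ScriptableRendererFeature
{
    RayTracingBlitPass pass;

    public override void Create() => pass = new RayTracingBlitPass();

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(pass);
    }
}
```

This is a sketch of the ScriptableRenderPass route the documentation points to, not a drop-in fix: for both eyes to get correct output on Quest 2, the shader used for the final blit (and the compute output itself) still has to account for the stereo rendering mode configured in the XR settings.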