Capturing the processed frame from ARKit / ARFoundation

Hey there, I want to get the camera data from my ARCamera in order to stream out what my AR view looks like.

Here is what I have tested so far:

using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ARFrameCapture : MonoBehaviour
{
    // Assigned in the Inspector.
    [SerializeField]
    ARCameraManager cameraManager;

    void OnEnable()
    {
        cameraManager.frameReceived += OnCameraFrameReceived;
    }

    void OnDisable()
    {
        cameraManager.frameReceived -= OnCameraFrameReceived;
    }

    private void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    {
        CaptureARBuffer();
    }

    // Get the image from the AR camera and extract the raw data from it.
    private unsafe void CaptureARBuffer()
    {
        // Acquire the latest CPU image inside the frameReceived callback.
        XRCpuImage image;
        if (!cameraManager.TryAcquireLatestCpuImage(out image))
        {
            Debug.LogWarning("Capture AR Buffer returned nothing!");
            return;
        }

        var conversionParams = new XRCpuImage.ConversionParams
        {
            // Get the full image.
            inputRect = new RectInt(0, 0, image.width, image.height),

            // Keep the full resolution (no downsampling).
            outputDimensions = new Vector2Int(image.width, image.height),

            // Color image format.
            outputFormat = TextureFormat.RGBA32,

            // Mirror across the X axis.
            transformation = XRCpuImage.Transformation.MirrorX
        };

        // See how many bytes we need to store the converted image.
        int size = image.GetConvertedDataSize(conversionParams);

        Debug.Log("OnCameraFrameReceived, size == " + size + " w: " + image.width + " h: " + image.height + " planes = " + image.planeCount);

        // Allocate a buffer to store the converted image. Persistent, because
        // it is disposed later from the coroutine's completion callback and
        // Allocator.Temp memory must not outlive the frame.
        var buffer = new NativeArray<byte>(size, Allocator.Persistent);

        // Extract the image data into the buffer.
        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);

        // The image was converted to RGBA32 format and written into the buffer,
        // so we can dispose of the XRCpuImage once the frame has been pushed.
        // We must do this or it will leak resources.
        byte[] bytes = buffer.ToArray();

        // PushFrame is my own streaming coroutine (omitted here).
        StartCoroutine(PushFrame(bytes, image.width, image.height,
            () => { image.Dispose(); buffer.Dispose(); }));
    }
}

This gives me the CPU image, which is essentially the raw camera view, but I want to retrieve the processed frame instead: the background plus all of the AR effects, like human occlusion and meshing, composited into the frame I am retrieving.

Happy to provide more detail, images, and the use case. Thanks in advance for any responses.

I have found a snippet provided by Unity here. Where do I put this code: in Update, or in a callback for AsyncGPUReadback? Also, will this provide me with what I need? Thanks.

https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.1/manual/index.html
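
For context, as I understand it, the snippet in that manual copies the camera background into a RenderTexture along these lines (my paraphrase; arCameraBackground and renderTexture are placeholders for my own fields):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class CameraBackgroundCopy : MonoBehaviour
{
    // Placeholders for my own setup, assigned in the Inspector.
    public ARCameraBackground arCameraBackground;
    public RenderTexture renderTexture;

    void Update()
    {
        // Copy the GPU camera background into our own RenderTexture;
        // the pixels can then be read back (ReadPixels or AsyncGPUReadback).
        Graphics.Blit(null, renderTexture, arCameraBackground.material);
    }
}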

The screenshot you posted will simply copy the GPU ARFoundation background image as it would appear on the device, without any AR content (i.e., fitted to the display, but before opaques are rendered).

If I am reading this correctly, what you want is the final composited image. If that's the case, then you can simply do your GPU readback in MonoBehaviour.OnPostRender, issuing an AsyncGPUReadback.Request on the AR camera's targetTexture.
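
Something along these lines should work (an untested sketch; it assumes the AR camera has a targetTexture assigned and that this component sits on the camera's GameObject):

using Unity.Collections;
using UnityEngine;
using UnityEngine.Rendering;

public class FinalFrameReadback : MonoBehaviour
{
    Camera arCamera;

    void Awake()
    {
        arCamera = GetComponent<Camera>();
    }

    // Fires after this camera finishes rendering (built-in pipeline only).
    void OnPostRender()
    {
        if (arCamera.targetTexture == null)
            return;

        AsyncGPUReadback.Request(arCamera.targetTexture, 0, TextureFormat.RGBA32, OnReadback);
    }

    void OnReadback(AsyncGPUReadbackRequest request)
    {
        if (request.hasError)
        {
            Debug.LogWarning("GPU readback failed.");
            return;
        }

        // Raw RGBA32 bytes of the final rendered frame, AR content included.
        NativeArray<byte> data = request.GetData<byte>();
        // ... hand these off to the streaming code ...
    }
}

Note that OnPostRender is only invoked by the built-in render pipeline; with a scriptable render pipeline you would hook RenderPipelineManager.endCameraRendering instead.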

Thank you so much for your response.

You are reading it correctly; that is exactly what I need to implement / capture.

So I am using the Universal Render Pipeline (URP); am I right in expecting this to work: