AR Camera Background - Forcing 16x9 ratio


I need to be able to record a 16x9 or 9x16 aspect ratio video without any distortion, independently of the mobile phone's resolution.

Right now, on phones like the iPhone XR, whose aspect ratio is taller (or wider, depending on the screen orientation) than 16x9/9x16, the recorded video is squished either vertically or horizontally.

My app can record in both horizontal and vertical orientations, so the solution needs to work for both.

How would I go about doing that?

My guess would be to create my own ARCameraBackground script with some modifications, but I'm not sure what they would be, and I don't want to fork the ARFoundation repository...

Here's the app Snaappy, which does this by adding a constraint to the AR camera/background so it always fits a 9x16 ratio (the black zones at the top and at the bottom).

How can I do the same thing in Unity with ARFoundation?


You can crop the camera video to the desired aspect ratio by making a copy of ARKitBackground.shader. The field responsible for crop/zoom/rotation is _UnityDisplayTransform:

// Remap the texture coordinates based on the device rotation.
float2 texcoord = mul(float3(v.texcoord, 1.0f), _UnityDisplayTransform).xy;

Then, apply your custom camera material to ARCameraBackground.
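To build intuition for what that shader line does, here's a small sketch in plain Python (not shader code) of how `mul(float3(v.texcoord, 1.0f), _UnityDisplayTransform).xy` treats the UV pair as a row vector going through an affine transform. The example matrix here is purely an assumption for illustration; the real values are supplied by the platform every frame.

```python
def apply_display_transform(u, v, m):
    """Row-vector * matrix, like HLSL's mul(float3(uv, 1), M).xy.
    m is a list of three rows: two linear rows plus a translation row."""
    x = u * m[0][0] + v * m[1][0] + m[2][0]
    y = u * m[0][1] + v * m[1][1] + m[2][1]
    return x, y

# Hypothetical transform that only flips the Y axis (translation row (0, 1)):
flip_y = [[1, 0],
          [0, -1],
          [0, 1]]
print(apply_display_transform(0.2, 0.3, flip_y))  # (0.2, 0.7)
```

Because the translation row rides along with the linear part, the same multiply can express rotation, flips, and crop/zoom scaling at once, which is why a custom crop can be folded into this step.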

Thanks for the quick answer @KyryloKuzyk! (Big fan haha! I do own your asset "AR Foundation Editor Remote" for testing :) ha!)

Hum... I realize that you basically gave me 99% of the answer, but I'm definitely in way over my head here and can't figure out the proper way to get the 16x9 aspect ratio I want...

What I understand so far:

==> _UnityDisplayTransform is the camera matrix provided by the AR Foundation camera background renderer
==> The call to mul multiplies the texture coordinates (a float3) by that matrix, which stretches/rotates the texture to match the camera view, I believe
==> Then, at the end, only the x and y components of the result are taken and used as the texture coordinates

I'm really not sure how to properly change the texcoord to get that 16x9 or 9x16 ratio based on the ScreenOrientation... My shader skills are almost non-existent.

Also, I need this to work on Android too, so I made a copy of the ARCoreBackground.shader file.

Would I need to change line 55 for Android?

// Remap the texture coordinates based on the device rotation.
textureCoord = (_UnityDisplayTransform * vec4(gl_MultiTexCoord0.x, 1.0f - gl_MultiTexCoord0.y, 1.0f, 0.0f)).xy;

I'm setting (in Awake) the iOS or Android modified material on the ARCameraBackground customMaterial property based on the device OS, so that's easy enough.


Yes. It’s basically using the Camera Matrix to transform the UVs for the texture (v.texcoord).

I’m not 100% sure, but I believe that’s rotating the UVs for the device rotation (and flipping the Y-axis to conform with Unity’s Y-axis). You’ll probably not want to replace it, but add your code to it instead (remap to your desired aspect ratio either before or after the rotation).
In order to get the desired aspect ratio: check what your screen size is, then remap the UVs to fit your ratio (UVs are always in the 0-1 range for the entire screen).
Thus: if your screen is 4:3 (e.g. 1280x960) and you want the image to take up the full width (resizing the height), you’ll want to remap the V-coordinate to match your ratio. You’ll also want to offset it so that it is centered on the screen (normally (0,0) is the bottom-left of the screen and (1,1) is the top-right, in UV-space).
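As a concrete sketch of that remap (plain Python standing in for the shader math; the function name and the centered-letterbox formula are my own illustration), scaling the V coordinate about the screen centre by screenAspect/targetAspect maps the visible band to the full 0-1 texture range, and anything pushed outside 0-1 falls in the black bars:

```python
def remap_v(v, screen_w, screen_h, target_aspect=16 / 9):
    """Remap a screen V coordinate so the camera image keeps target_aspect,
    full width, centered vertically. Results outside [0, 1] are letterbox."""
    screen_aspect = screen_w / screen_h
    frac = screen_aspect / target_aspect  # fraction of screen height the image uses
    return (v - 0.5) / frac + 0.5

# 4:3 screen (1280x960): the 16:9 image occupies screen V in [0.125, 0.875]
print(remap_v(0.125, 1280, 960))  # ~0.0, top edge of the camera image
print(remap_v(0.875, 1280, 960))  # ~1.0, bottom edge
print(remap_v(0.0, 1280, 960))    # negative -> outside the image: draw black
```

The same formula handles 9x16 in portrait if you swap which axis gets scaled, based on the screen orientation.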


I posted about the same issue. As mentioned in the answer above, I have confirmed that the aspect ratio of the background image can be changed by customizing ARKitBackground.shader. However, the FOV of the camera seems to be managed by ARCameraManager. In other words, the background image and the 3D model are displayed out of alignment.

I checked the ARCameraManager script. The following code is the Update function of ARCameraManager. It references Screen.width and Screen.height. I think this is the reason why I can't change the aspect ratio, but is there any way to deal with it?

        void Update()
        {
            if (subsystem == null)
                return;

            m_FacingDirection = subsystem.requestedCamera.ToCameraFacingDirection();
            m_LightEstimation = subsystem.requestedLightEstimation.ToLightEstimation();
            m_AutoFocus = subsystem.autoFocusRequested;

            var cameraParams = new XRCameraParams
            {
                zNear = m_Camera.nearClipPlane,
                zFar = m_Camera.farClipPlane,
                screenWidth = Screen.width,
                screenHeight = Screen.height,
                screenOrientation = Screen.orientation
            };

            XRCameraFrame frame;
            if (subsystem.TryGetLatestFrame(cameraParams, out frame))
            {
                // ...
                if (frameReceived != null)
                {
                    // ...
                }
            }
        }
What that is doing is setting the Resolution for the RenderTexture that holds the Device Camera's output. I don't believe Unity has access to the zoom/fov of that camera.
In ARCameraBackground.OnCameraFrameReceived, the ProjectionMatrix from the Device-Camera is set to the ARCamera in your scene. Overriding this (e.g. in a LateUpdate/OnBeforeRender) would be your best bet.


Thank you for the reply.
I want to try your idea, but I don’t understand the details of it.

What should I override?
In LateUpdate, ARCameraFrameEventArgs and projectionMatrix do not exist.
Is your suggestion to overwrite the camera’s fieldOfView?

In ARCameraBackground.OnCameraFrameReceived the ProjectionMatrix from the Device-Camera is applied to the ARCamera in your scene.

DeviceCamera: The actual physical camera that is on your device
ARCamera: The ‘AR Camera’-GameObject that is in your scene

Unfortunately there is no way to disable this without modifying the ARFoundation-code yourself.
What you can do however, is to override the ProjectionMatrix on your ARCamera with a custom matrix.
As the ProjectionMatrix is overridden by OnCameraFrameReceived, you’d need to do this every frame. A LateUpdate or OnBeforeRender method would thus be where you apply this matrix to your ARCamera (as it runs after OnCameraFrameReceived, but before the actual rendering occurs).
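For reference, the matrix you’d be overriding has the standard perspective shape. Here is a hedged Python sketch (mirroring the math behind Unity’s Matrix4x4.Perspective, assuming OpenGL-style clip coordinates) showing where the aspect ratio lives, namely only in the [0][0] entry:

```python
import math

def perspective(fov_deg, aspect, near, far):
    """Perspective projection from a vertical field of view, in the same
    shape Matrix4x4.Perspective produces (rows of a 4x4 matrix)."""
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)  # cot of half the FOV
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

# Forcing a 16:9 frustum regardless of the real screen size keeps the 3D
# content aligned with a letterboxed 16:9 camera image (assuming you keep
# the vertical FOV and letterbox top/bottom).
m = perspective(60.0, 16.0 / 9.0, 0.1, 100.0)
```

The takeaway: overriding only means feeding a different aspect (or rescaling one entry) each frame, not rebuilding the whole camera setup.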


See the image below.

The selected bit of code is where the ProjectionMatrix is applied to your ARCamera.
You'll want to override this. You can either do that by modifying the ARFoundation-Code (which can be done by adding ARFoundation as a 'local' package, but breaks updating & corruption-checks), or by adding a method that does the overriding AFTER this method runs.

If you have a fully custom matrix, you can just override it. Otherwise you can grab the current matrix from the ARCamera and modify it (based on your needs), then override the current value.
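A minimal sketch of the "grab the current matrix and modify it" route (plain Python for the matrix math; the helper name and sample values are mine): with a standard perspective matrix, forcing the aspect only requires recomputing entry [0][0] from [1][1], since [1][1] encodes the vertical FOV and [0][0] is that value divided by the aspect.

```python
def force_aspect(proj, target_aspect):
    """Modify a 4x4 projection (list of rows) in place so its horizontal
    scale matches target_aspect while the vertical FOV stays unchanged."""
    proj[0][0] = proj[1][1] / target_aspect
    return proj

# Hypothetical current matrix: aspect ~19.5:9 (an iPhone XR-like screen),
# vertical scale 1.732 (a 60-degree vertical FOV).
proj = [[1.732 / (19.5 / 9), 0, 0, 0],
        [0, 1.732, 0, 0],
        [0, 0, -1.002, -0.2002],
        [0, 0, -1, 0]]
force_aspect(proj, 16 / 9)  # proj[0][0] now corresponds to a 16:9 frustum
```

In Unity you would read the ARCamera's current projectionMatrix, apply this kind of change, and write it back every frame after the ARFoundation override runs.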


Thank you for the reply.
Customizing ARKitBackground.shader and your advice are the perfect solution for me.

For reference: Below is the code I tried.

using UnityEngine;

namespace SoftCube
{
    public class ARCameraAspectManager : MonoBehaviour
    {
        [SerializeField]
        private Camera arCamera = default;  // assign the AR Camera in the Inspector

        private void LateUpdate()
        {
            // Re-apply a projection matrix every frame, after ARCameraBackground
            // has overwritten it in OnCameraFrameReceived.
            arCamera.projectionMatrix = Matrix4x4.Perspective(
                arCamera.fieldOfView, arCamera.aspect,
                arCamera.nearClipPlane, arCamera.farClipPlane);
        }
    }
}

I believe that modifying the projectionMatrix is not the correct path and will not do the trick.
The camera video is drawn by a shader that simply stretches the camera texture over the full screen (and applies the _UnityDisplayTransform transformation), regardless of other camera settings.

I think this is the correct line of thought, although someone should invest time to write a math expression for this approach :)
