Display stretched/distorted when running on device; looks fine in Simulator

We are building a Fully Immersive visionOS app using Unity.

The app runs as expected when we test in the visionOS Simulator in Xcode. However, when we build and run on an actual Vision Pro, the output is stretched and distorted.

The camera tracks the headset's movement as expected, but the scene looks as though it's being rendered through a heavy fisheye lens, which is quite nauseating.

Our buttons can sometimes be tapped with the look-and-pinch gesture, but only when a button is perfectly centered in the view.

Simulator vs Device screenshot comparison, looking at the same rectangular canvas menu:

The scene was initially created by referencing the VR Sample in Unity's "Apple visionOS XR Plugin" package. I've also studied other sample scenes and copied their XR Origin, Camera, and Player Settings. Those sample scenes display correctly on the Vision Pro, but no matter which settings I adjust, my project still renders distorted.
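In case it helps with diagnosis, here's a small script we can attach to the scene to log the XR display state at startup, so the values can be compared between a Simulator run and a device run in the Xcode console. The class name `XRDisplayDebug` is just a placeholder; it only assumes the standard `UnityEngine.XR` APIs.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Placeholder diagnostic component: logs the active XR display subsystem state
// and main-camera projection values once at startup.
public class XRDisplayDebug : MonoBehaviour
{
    void Start()
    {
        // Enumerate whatever XR display subsystems are currently loaded.
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);
        foreach (var display in displays)
        {
            Debug.Log($"XR display running={display.running}");
        }

        // Log the per-eye render target size reported by the XR stack.
        Debug.Log($"Eye texture: {XRSettings.eyeTextureWidth}x{XRSettings.eyeTextureHeight}, " +
                  $"renderViewportScale={XRSettings.renderViewportScale}");

        // Log the main camera's projection parameters for comparison.
        var cam = Camera.main;
        if (cam != null)
        {
            Debug.Log($"Main camera FOV={cam.fieldOfView}, aspect={cam.aspect}");
        }
    }
}
```

If the eye texture dimensions or camera aspect differ noticeably between the Simulator and the device, that would at least narrow down where the distortion is being introduced.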

Would greatly appreciate any leads on what might be causing this! Thanks!