Unity Meta App Build Issue

I’m currently working on a project for the Meta Quest 3 headset, using Unity to build an application with an immersive main menu. The menu is designed to be navigated with the controllers, using a laser pointer system I scripted for scene selection. When I test the project in the Unity editor on my PC with the headset connected, everything works as expected: the laser pointers emit from the controllers and I can select scenes without issue.

However, the issue arises when I build the app and run it standalone. After building the project for Android and installing it on the headset through the Meta Quest Developer Hub, the lasers simply fail to appear, making scene selection impossible. The app installs and launches successfully, so there is clearly a discrepancy between the project’s behavior in the Unity editor and the built app’s behavior on the device.

  • The build settings are set for Android

  • I have lasers scripted

  • I am using OVRCameraRig

  • The laser script is attached to the OVRControllerPrefab, which is a child of RightHandAnchor (a simplified sketch of this setup follows the list)

  • The difference in behavior between the Unity editor tests and the actual device execution hints at a deeper issue, possibly related to how the build process handles VR-specific input or rendering settings.

  • I’ve attached images showing the current OVR setup and the Project Settings fix recommendations
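
To make the setup concrete, the laser logic is conceptually along these lines. This is a simplified sketch rather than my actual code; ControllerLaser, SceneButton, and the sceneName field are placeholder names, not part of the Meta SDK:

    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Simplified sketch of the laser: it sits on a child of RightHandAnchor
    // (the OVRControllerPrefab), raycasts forward, draws a LineRenderer,
    // and loads a scene on trigger press.
    [RequireComponent(typeof(LineRenderer))]
    public class ControllerLaser : MonoBehaviour
    {
        public float maxDistance = 10f;
        private LineRenderer line;

        void Awake()
        {
            line = GetComponent<LineRenderer>();
            line.positionCount = 2;
        }

        void Update()
        {
            Vector3 origin = transform.position;
            Vector3 end = origin + transform.forward * maxDistance;

            // Look for a menu target in front of the controller.
            if (Physics.Raycast(origin, transform.forward, out RaycastHit hit, maxDistance))
            {
                end = hit.point;

                // Right-controller trigger selects the scene the laser is hitting.
                if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch)
                    && hit.collider.TryGetComponent(out SceneButton button))
                {
                    SceneManager.LoadScene(button.sceneName);
                }
            }

            // Draw the visible laser from the controller to the hit point (or max range).
            line.SetPosition(0, origin);
            line.SetPosition(1, end);
        }
    }

    // Placeholder marker component on each menu button's collider.
    public class SceneButton : MonoBehaviour
    {
        public string sceneName;
    }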

I’m reaching out to the community for insights or advice on addressing this issue, hoping that someone with experience in Unity VR development for Meta Quest devices can shed light on this perplexing problem and suggest potential fixes.

Hi, I have a similar problem where the build behaves differently from the editor. When I run the project directly in the editor everything works fine, but after Build and Run the game builds successfully, yet none of my code runs and I just get an empty scene in the VR headset. Were you able to solve your problem? Thank you!

I don’t recommend using the Meta XR plugin: the SDK is constantly changing and lacks proper documentation, which makes it really confusing to use.

The OpenXR workflow (XR Interaction Toolkit plus XR Plug-in Management) is easier and has been stable over the years. You can import the Starter Assets sample from the XR Interaction Toolkit, which gives you a functional scene with raycast-based interactions.
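
For example, once the Starter Assets ray interactor can point at a world-space menu Canvas, scene selection can be a plain UI Button wired to a small loader like the sketch below (SceneLoader is a placeholder name, and the scene must be added to the Build Settings scene list):

    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Minimal sketch: hook LoadScene up to a world-space UI Button's OnClick
    // (or to an interactable's Select Entered event) in the Inspector.
    public class SceneLoader : MonoBehaviour
    {
        public void LoadScene(string sceneName)
        {
            // The scene must be included in the Build Settings scene list.
            SceneManager.LoadScene(sceneName);
        }
    }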