I’m currently working on a project for the Meta Quest 3 headset, using Unity to develop an application with an immersive main-menu UI. The menu is designed to be navigated with the controllers, using a laser-pointer system for scene selection; I scripted the lasers myself. When I test the project directly on my PC with the headset connected, everything works as expected: the laser pointers emit from the controllers and scene selection is seamless.
However, an issue arises in the built app. After building the project for the Android target and installing it on the headset through the Meta Quest Developer Hub, the lasers fail to appear, making scene selection impossible. The app installs and launches successfully, so the problem points to a discrepancy between the Unity editor's behavior and the built app's behavior on the device.
- The build settings are set to Android
- The lasers are scripted
- I am using OVRCameraRig
- The laser script is on OVRControllerPrefab, which is a child of RightHandAnchor
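For reference, here is a minimal sketch of the kind of laser script I mean, attached to the controller anchor. This is illustrative, not my exact code: the class name, the serialized `LineRenderer`, and the `uiLayerMask` field are assumptions, and it presumes the Oculus/Meta XR plug-in is active so `OVRInput` works on device.

```csharp
using UnityEngine;

// Illustrative laser pointer: draws a beam from the controller anchor and
// fires a selection message when the right trigger is pressed.
public class LaserPointer : MonoBehaviour
{
    [SerializeField] private LineRenderer line;      // assigned in the Inspector
    [SerializeField] private float maxDistance = 10f;
    [SerializeField] private LayerMask uiLayerMask;  // layer the menu buttons live on

    private void Update()
    {
        Vector3 origin = transform.position;
        Vector3 direction = transform.forward;
        line.SetPosition(0, origin);

        // Cast from the controller anchor; shorten the beam if it hits UI.
        if (Physics.Raycast(origin, direction, out RaycastHit hit, maxDistance, uiLayerMask))
        {
            line.SetPosition(1, hit.point);

            // OVRInput only reports controller input on device when OVRManager
            // is in the scene and the XR plug-in is enabled for Android.
            if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            {
                hit.collider.SendMessage("OnLaserSelect", SendMessageOptions.DontRequireReceiver);
            }
        }
        else
        {
            line.SetPosition(1, origin + direction * maxDistance);
        }
    }
}
```

One thing worth checking in a setup like this: the `LineRenderer`'s material, since a shader that renders fine in the editor can be stripped or unsupported in an Android build, which would make the beam invisible on device even though the script runs.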
The difference in behavior between the Unity editor tests and the actual device execution hints at a deeper issue, possibly related to how the build process handles VR-specific input or rendering settings.
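One way to narrow down an editor-vs-device discrepancy like this is to stream the headset's Unity log over ADB while the app runs; errors about missing shaders, stripped components, or XR initialization failures show up there even when the app launches normally. This assumes ADB is installed and the headset is connected with developer mode enabled.

```shell
# Clear old log output, then stream only Unity-tagged messages from the headset
adb logcat -c
adb logcat -s Unity
```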
I’ve attached images showing the current OVR setup and the recommended project-settings fixes.
I’m reaching out to the community for insights or advice on addressing this issue, hoping that someone with experience in Unity VR development for Meta Quest devices can shed light on this perplexing problem and suggest potential fixes.