I am looking for opinions on developing a game using Unity XR components versus the Oculus Core Dev Blocks for Unity. I would like to understand the tradeoffs.
Our title will first be developed on and for Oculus, but we need the future option of going cross-platform. This means either building on a cross-platform foundation from the start, or later replacing the camera rigs, input handling, etc. with Unity XR.
Anyone’s thoughts and experience would be greatly appreciated.
You should write an abstraction layer for input and a spawner for platform-specific VR camera rig prefabs.
In my project, instead of placing any specific VR camera rig prefab in the scene, I have a CameraRigSpawner prefab that loads a platform-specific camera rig from Resources, depending on the active platform (UnityXR, Oculus, SteamVR). When play mode starts, the Camera attached to the CameraRigSpawner is destroyed before the VR camera rig is instantiated at the spawner's position/rotation.
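A minimal sketch of that spawner pattern might look like the following. The `VRPlatform` enum, the `Rigs/...` Resources paths, and the prefab naming convention are all assumptions for illustration, not names from any actual SDK or project:

```csharp
using UnityEngine;

// Hypothetical platform enum for illustration.
public enum VRPlatform { UnityXR, Oculus, SteamVR }

public class CameraRigSpawner : MonoBehaviour
{
    [SerializeField] private VRPlatform activePlatform = VRPlatform.UnityXR;

    private void Awake()
    {
        // Destroy the placeholder camera used while editing the scene.
        Camera editorCamera = GetComponentInChildren<Camera>();
        if (editorCamera != null)
            Destroy(editorCamera.gameObject);

        // Load the platform-specific rig prefab from a Resources folder,
        // e.g. Resources/Rigs/OculusRig.prefab (path is an assumption).
        string prefabPath = "Rigs/" + activePlatform + "Rig";
        GameObject rigPrefab = Resources.Load<GameObject>(prefabPath);

        // Spawn the rig at the spawner's position/rotation.
        if (rigPrefab != null)
            Instantiate(rigPrefab, transform.position, transform.rotation);
        else
            Debug.LogError("No camera rig prefab found at Resources/" + prefabPath);
    }
}
```

One design note: because the rig is chosen at runtime from Resources, the scene itself never references any platform SDK, which keeps non-Oculus builds from pulling in Oculus components.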
Most platforms will work with the UnityXR camera rig, but some need platform-specific components attached, so it's convenient to keep a prefab variant per platform.
It's hard to explain the input abstraction layer succinctly, but since SteamVR isn't currently compatible with Unity's Input System, you're going to need to support at least two different input systems unless you go 100% OpenXR. So avoid scattering code throughout your project that depends on any one platform's input system.
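The idea can be sketched as a small interface that gameplay code depends on, with one implementation per backend. The interface and its member names (`IVRInput`, `GetTriggerPressed`, etc.) are illustrative assumptions; only the `UnityEngine.XR.InputDevices` calls below are real Unity API:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Gameplay code depends only on this interface; the concrete backend is
// chosen once, alongside the camera rig, at startup.
public interface IVRInput
{
    bool GetTriggerPressed(XRNode hand);
    Vector2 GetThumbstick(XRNode hand);
}

// UnityXR-backed implementation using UnityEngine.XR.InputDevices.
public class UnityXRInput : IVRInput
{
    public bool GetTriggerPressed(XRNode hand)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        return device.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed)
            && pressed;
    }

    public Vector2 GetThumbstick(XRNode hand)
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        return device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis)
            ? axis
            : Vector2.zero;
    }
}

// A hypothetical SteamVRInput or OculusInput class would implement the same
// interface on top of that platform's SDK; gameplay scripts never reference
// either SDK directly.
```

This way, switching platforms means swapping one implementation object rather than hunting down SDK calls spread across the project.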