I’m dealing with this issue right now, in fact (project open in the other window!). Here’s how I’m handling it:
I have two camera rigs in the scene, “DesktopCameraRig” and “OVRCameraRig”. And I have a little script called CrossVRManager that enables one of these, depending on the platform:
using UnityEngine;

public class CrossVRManager : MonoBehaviour {
    public GameObject desktopCameraRig;
    public GameObject ovrCameraRig;

    void Awake() {
        // These are compile-time checks. Note that UNITY_EDITOR is in the
        // first branch, so the Editor always gets the desktop rig, even
        // when the build target is Android.
#if UNITY_STANDALONE_OSX || UNITY_STANDALONE_WIN || UNITY_EDITOR
        desktopCameraRig.SetActive(true);
        ovrCameraRig.SetActive(false);
        Cursor.lockState = CursorLockMode.Locked;  // capture the mouse for camera control
#elif UNITY_ANDROID
        ovrCameraRig.SetActive(true);
        desktopCameraRig.SetActive(false);
#endif
    }
}
Each of them has an object called LaserPointer. For the desktop rig, this is under the CenterEyeAnchor, which also happens to be the main camera. For the OVR (Oculus VR) camera rig, there are actually two of them: one for each hand. (Only one of these will be activated by the Oculus SDK, depending on which hand the user has configured for the Go controller.)
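So the hierarchy looks something like this (the OVR side is the standard OVRCameraRig layout from the Oculus SDK; the exact names in your own project may vary):

DesktopCameraRig
    CenterEyeAnchor     (Camera, tagged MainCamera)
        LaserPointer
OVRCameraRig
    TrackingSpace
        LeftHandAnchor
            LaserPointer
        RightHandAnchor
            LaserPointer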
The LaserPointer objects each have a simple “PositionReticle” class that does just what its name says:
using UnityEngine;

public class PositionReticle : MonoBehaviour {
    public Reticle reticle;
    public float maxReach = 30;
    public LayerMask layerMask;

    // LateUpdate, so this runs after tracking has updated the
    // pointer's transform for the frame.
    void LateUpdate() {
        Ray ray = new Ray(transform.position, transform.forward);
        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, maxReach, layerMask)) {
            // We hit something: park the reticle on the hit point.
            reticle.SetPosition(transform, hit);
        } else {
            // Nothing in reach: float the reticle out along the ray.
            reticle.SetPosition(transform, ray.direction);
        }
    }
}
And note that in the desktop case, the LaserPointer object is not at the same position as the camera; I put it a little down and to the right, so that I can actually see the beam. Oh yes, and my Reticle class draws a beam (using a LineRenderer).
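In case it's useful, here's roughly what that Reticle class looks like. This is a simplified sketch rather than my exact code; the two SetPosition overloads just match how PositionReticle calls them above, and I'm assuming a two-point LineRenderer for the beam:

using UnityEngine;

public class Reticle : MonoBehaviour {
    public LineRenderer beam;           // the laser beam, drawn from pointer to reticle
    public float defaultDistance = 30;  // how far out to float when nothing is hit

    // Something was hit: park the reticle on the hit point,
    // facing out of the surface.
    public void SetPosition(Transform origin, RaycastHit hit) {
        transform.position = hit.point;
        transform.rotation = Quaternion.LookRotation(hit.normal);
        DrawBeam(origin.position, hit.point);
    }

    // Nothing was hit: float the reticle out along the given direction.
    public void SetPosition(Transform origin, Vector3 direction) {
        transform.position = origin.position + direction * defaultDistance;
        transform.rotation = Quaternion.LookRotation(direction);
        DrawBeam(origin.position, transform.position);
    }

    void DrawBeam(Vector3 from, Vector3 to) {
        beam.SetPosition(0, from);
        beam.SetPosition(1, to);
    }
}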
The desktop rig has a script that simply captures the mouse cursor and moves the camera with the mouse, to substitute for turning your head in VR (there's a sketch of that script at the end of this answer). My various components that need to know where the laser is pointing simply FindObjectOfType<PositionReticle>() and ask it. Oh yes, and to abstract away the trigger, I have this code:
bool CheckTrigger() {
#if UNITY_ANDROID && !UNITY_EDITOR
    // On the actual device, read the Go controller's trigger.
    return OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger);
#else
    // On desktop (and in the Editor), the left mouse button stands in for it.
    return Input.GetMouseButton(0);
#endif
}
So on desktop you click the mouse button; on Oculus you press the trigger. This gives me a great point-laser-and-click interface that works both on desktop (mainly for testing within Unity) and in VR.
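And for completeness, here's the sort of thing the desktop mouse-look script does. Again, just a minimal sketch (the class name and sensitivity value are made up for illustration); it sits on the camera in DesktopCameraRig, and relies on CrossVRManager having locked the cursor:

using UnityEngine;

public class DesktopMouseLook : MonoBehaviour {
    public float sensitivity = 2f;   // degrees per unit of mouse movement
    float pitch;
    float yaw;

    void Update() {
        // Accumulate mouse movement into yaw (look left/right)
        // and pitch (look up/down).
        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;
        pitch = Mathf.Clamp(pitch, -80, 80);   // don't flip over backwards
        transform.localRotation = Quaternion.Euler(pitch, yaw, 0);
    }
}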