I've implemented an image-plane selection technique that ignores depth: I cast a ray from a screen point, and whichever object is hit first gets selected.
// _pinchPos is the pinch position in world space; _layerMask is a LayerMask field covering the selectable layers.
var screenPoint = Camera.main.WorldToScreenPoint(_pinchPos);
var ray = Camera.main.ScreenPointToRay(screenPoint);
RaycastHit hit;
if (Physics.Raycast(ray, out hit, 10f, _layerMask))
{
    // hit.collider is the first object under the pinch point; select it here.
}
Everything works well without the Oculus connected; however, I don't have an Oculus to test whether it still works with one connected. My question is: do Camera.main.WorldToScreenPoint and Camera.main.ScreenPointToRay behave the same way when the Oculus is connected? I know that certain things, such as screen-space UI, don't work with the Oculus.
If they don't behave the same, how would I achieve the same thing?
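In case those screen-space calls do behave differently under an HMD, one way to sidestep the question is to build the selection ray directly in world space, from the camera (head) through the pinch position, so no screen coordinates are involved at all. A minimal sketch of that idea, reusing _pinchPos and the same max distance and layer mask as the snippet above (the rest is assumed):

// Sketch: image-plane selection without any screen-space conversion.
// Cast from the head (camera position) through the world-space pinch point,
// which behaves the same whether or not an HMD is driving the camera.
var cam = Camera.main;
var direction = (_pinchPos - cam.transform.position).normalized;
var ray = new Ray(cam.transform.position, direction);
RaycastHit hit;
if (Physics.Raycast(ray, out hit, 10f, _layerMask))
{
    // First object along the eye-to-pinch ray; select it here.
}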
I've run into the same issue: Camera.WorldToScreenPoint is returning some unexpected values while using the Oculus, making my direction indicator (which is on a Screen Space - Camera UI) act weird.
Everything works fine without the Oculus, meaning the indicator points where it should.
Ahh, found a solution… if anyone arrives here from Google (like me), here's what I did:
I was using Screen.width/height for the calculations. Sadly, while the HMD is active these still contain the dimensions of the game window, whereas Camera.WorldToScreenPoint returns coordinates with respect to the Oculus view's dimensions.
Changing these to the indicator's parent canvas's width/height (from canvas.rectTransform.sizeDelta) fixed the weirdness for me.
(It would be nice to have a reference somewhere for what is different in VR modes; maybe there is one and I just don't know about it?)
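For anyone who wants that fix spelled out in code, here is a rough sketch of the mapping, assuming the indicator is a RectTransform (indicatorRect) anchored to the centre of its parent canvas (canvasRect) and a target Transform; those names are mine, not from the post above.

// Sketch: position a UI indicator over a world-space target without touching Screen.width/height.
// WorldToViewportPoint returns normalized (0..1) coordinates for whatever the camera renders to
// (game window or HMD eye), so we scale by the canvas size instead of the screen size.
Vector3 vp = cam.WorldToViewportPoint(target.position);
Vector2 canvasSize = canvasRect.sizeDelta;          // the parent canvas's width/height
indicatorRect.anchoredPosition = new Vector2(
    (vp.x - 0.5f) * canvasSize.x,                   // (0.5, 0.5) maps to the canvas centre
    (vp.y - 0.5f) * canvasSize.y);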
I’ve got a very similar issue, and was hoping someone could help.
Trying to show an indicator on screen that lines up with a target object when the player has it somewhere in view, using a canvas UI. If not, I show an offscreen indicator with an arrow. This code works 100% in a plain 3D environment, but once I enter the XR realm and a world-space canvas, I get wacky values when I try to do a WorldToScreenPoint.
This is the onscreen code:
private void LateUpdate()
{
    Camera cam = gm.XR_Level.mainCam; // Points to the center XR eye
    Vector3 viewPos = cam.WorldToViewportPoint(GetCurrentMissionPt()); // CurrentMissionPt is the destination transform
    if (viewPos.x > 0 && viewPos.x < 1 && viewPos.y > 0 && viewPos.y < 1 && viewPos.z > 0)
    {
        // Target is in front of the camera and inside the viewport: show the onscreen waypoint.
        iWaypointOnscreen.gameObject.SetActive(true);
        iWaypointOffscreen.gameObject.SetActive(false);
        gOffscreen.gameObject.SetActive(false);
        Vector3 screenPos = cam.WorldToScreenPoint(GetCurrentMissionPt());
        iWaypointOnscreen.gameObject.transform.position = new Vector3(screenPos.x, screenPos.y, 0);
    }
    else
    {
        // Target is out of view: show the offscreen indicator and rotate its arrow toward the target.
        iWaypointOnscreen.gameObject.SetActive(false);
        iWaypointOffscreen.gameObject.SetActive(true);
        gOffscreen.gameObject.SetActive(true);
        var tmpVector = cam.gameObject.transform.InverseTransformPoint(GetCurrentMissionPt());
        var angleToTarget = Mathf.Atan2(tmpVector.y, tmpVector.x) * Mathf.Rad2Deg;
        gOffscreen.gameObject.transform.localEulerAngles = new Vector3(0, 0, -angleToTarget);
    }
}