Update: a related method (TryGetCurrentUIRaycastResult) on the XR Ray Interactor does work for UI elements!
It also seems like TryGetHitInfo is the more generic function to call. If I’m reading it correctly, it will return the position, normal, and more for 3D and UI raycasts.
using UnityEngine;
using UnityEngine.EventSystems;              // RaycastResult
using UnityEngine.XR.Interaction.Toolkit;    // XRRayInteractor

public class TestPointer : MonoBehaviour
{
    public XRRayInteractor rayInteractor;
    public GameObject cursor;

    private void Update()
    {
        // Move the cursor to wherever the ray is currently hitting the UI.
        RaycastResult res;
        if (rayInteractor.TryGetCurrentUIRaycastResult(out res))
        {
            cursor.transform.position = res.worldPosition;
        }
    }
}
The cursor GameObject is an object in the scene that I'm moving to the position of the UI raycast hit.
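For what it's worth, here's a rough sketch of the TryGetHitInfo route mentioned above. It's untested, the class name is just for illustration, and the exact parameter list varies between XR Interaction Toolkit versions (older previews used ref parameters instead of out), so treat the signature as an assumption and check it against your installed package:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;    // XRRayInteractor

public class TestPointerGeneric : MonoBehaviour
{
    public XRRayInteractor rayInteractor;
    public GameObject cursor;

    private void Update()
    {
        // TryGetHitInfo reports the current hit whether it came from 3D geometry or UI.
        if (rayInteractor.TryGetHitInfo(out Vector3 position, out Vector3 normal,
                out int positionInLine, out bool isValidTarget) && isValidTarget)
        {
            cursor.transform.position = position;
            // Optional: orient the cursor to the surface the ray is hitting.
            cursor.transform.rotation = Quaternion.LookRotation(normal);
        }
    }
}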
I’ll proceed with this, but I’m still wondering: does anyone know of a better way? For example, I feel like some part of the UI system is probably tracking this already but I just don’t know the right part to look for.
I am trying to do something similar to what you describe in your initial post. When I try your code, however, it does not seem to recognise "XRRayInteractor" in the following line:
public XRRayInteractor rayInteractor;
I am very new to Unity, so I'm guessing I'm missing something quite obvious, but I wondered if you could help out?
I'm trying to do something similar, but I'm stuck on the first part itself. How did you get event triggers for a UI Canvas using XRRayInteractor? When I try to use SelectEntered events, they only fire for GameObjects other than the Canvas. Please help me.