Unity's UI and XR input

Hi,

Simple question: is there any native Unity support for using a world-space canvas UI with controllers based on Unity’s XR namespace? If not, what are the steps to use a controller the way I would use a mouse with a normal canvas? I would like the controller(s) to send click and move/drag/drop events to the canvas based on their actual position.

The only definitive source on this I found was this Oculus article (Unity’s UI System in VR), but it is nearly 4 years old and I suspect/hope that some things have changed since then (and, also, I would like to remain SDK-independent and only use Unity).

Philip

1 Like

Hello!
I replied in another thread, but since it was a tangent there, it makes sense to reply here as well.

A generalized solution exists, and it is on its way through our testing and shipping processes. Sorry I can’t just drop it here, but stay tuned, and I’ll reply to this thread once it’s available.

2 Likes

Moving this over to your own thread:

Soooo, within 2019 is the plan at the moment, but I don’t make those plans, so they can change.
And it is similar to the Oculus solution you linked, in that it allows 3D tracked devices to trigger world-space UI as if they were mice. So you can use dropdowns, scrollbars, buttons, textboxes, etc… Each tracked device, mouse, and touch is treated independently, so you can still use Spectator mode to have both player and spectator UIs independently controllable, or you can have one hand using a scrollbar and the other pressing a button.

It is UI-based, so it uses the EventSystem, replaces things like StandaloneInputModule, and has a custom UI Graphic Raycaster for 3D devices.

1 Like

@StayTalm_Unity Thanks for your reply, that sounds pretty good. I think I will go with a minimal custom implementation for now and switch once it is available. Thanks, looking forward to the notification :wink:

I am curious about when this rolls out too.

I wrote a VR input module, but it seems to be in a constant battle with the StandaloneInputModule. They sort of work together, but the desktop UI interactions only work while I am constantly moving my mouse.

Watching the event system debugger, I can see my XR input module keep overriding the normal one unless I’m constantly moving my mouse. It gets even weirder when the VR and desktop users try to interact with their respective UIs at the same time.

I managed to get it working for both VR and Desktop UI interactions simultaneously.

I had to manually call Process on the StandaloneInputModule from my XRInputModule, because the EventSystem component just grabs the first BaseInputModule it finds and calls Process on it. It’s a bit of a hack, but it’s working great.
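For anyone hitting the same issue, a minimal sketch of that forwarding setup, assuming the StandaloneInputModule sits on the same GameObject as the custom module and is kept disabled so the EventSystem picks the custom one (the class and method names here are placeholders):

using UnityEngine.EventSystems;

// Sketch of the workaround described above: because the EventSystem only
// executes one BaseInputModule, this module forwards to a StandaloneInputModule
// sibling so mouse/keyboard UI keeps working alongside the XR processing.
// XRForwardingInputModule and ProcessXRPointers are placeholder names.
public class XRForwardingInputModule : BaseInputModule
{
    StandaloneInputModule m_Desktop;

    protected override void Awake()
    {
        base.Awake();
        // Assumes a (disabled) StandaloneInputModule sits on the same GameObject.
        m_Desktop = GetComponent<StandaloneInputModule>();
    }

    public override void Process()
    {
        // Let the standard module handle mouse/keyboard/touch first...
        if (m_Desktop != null)
            m_Desktop.Process();

        // ...then run the XR controller pointer logic (stubbed here).
        ProcessXRPointers();
    }

    void ProcessXRPointers()
    {
        // Raycast from the controllers and send enter/exit/press/release events.
    }
}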

I also tried the same thing but inherited from StandaloneInputModule instead; that doesn’t work because of private variables that I need access to but can’t reach.

So I rewrote an almost-exact copy of the StandaloneInputModule to work with Unity’s generic XR input as well as desktop input.

FYI,

I’m interested in an XR input module as well and asked pretty much the same question here.
Thanks to @StayTalm_Unity for taking the time to reply to both.

Any idea (best guess) when the module might be available? I’m just trying to decide if I should wait or attempt to write something on my own.

I’m wondering if there’s a better way to tie into the event system for the raycasting into the scene. I know that operating a uGUI control involves a (graphics) raycast from a camera position through a screen pixel point into the scene to do an intersection test. If I already have a world-space raycaster, I’d like to be able to skip the whole screen-point/camera operation.
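For reference, a minimal sketch of that screen-point path (the step I’d like to skip), assuming a standard EventSystem setup; ScreenPointRaycastExample and RaycastUI are placeholder names:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Build a PointerEventData at a pixel position and let the EventSystem ask
// every registered raycaster (GraphicRaycaster etc.) for hits. This mirrors
// what StandaloneInputModule does for the mouse pointer.
public static class ScreenPointRaycastExample
{
    static readonly List<RaycastResult> s_Results = new List<RaycastResult>();

    public static GameObject RaycastUI(EventSystem eventSystem, Vector2 screenPosition)
    {
        var pointerData = new PointerEventData(eventSystem) { position = screenPosition };

        s_Results.Clear();
        eventSystem.RaycastAll(pointerData, s_Results);

        // RaycastAll sorts the results, so the first entry is the top-most hit.
        return s_Results.Count > 0 ? s_Results[0].gameObject : null;
    }
}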

Lastly, any speculation on how the new input system or UIElements efforts will affect uGUI down the road?

Hello!
I’m going to do this in reverse order:
UIElements: That is going to be a different endeavor and something I still need to look into.

New Input System: It has its own uGUI input module, also written by me, so they share a lot of the same concepts. That one is public: InputSystem/Packages/com.unity.inputsystem/InputSystem/Plugins/UI at stable · Unity-Technologies/InputSystem · GitHub
The XRInput-specific one will look very similar; both are based on the UIInputModule and the device models in that folder. I tried to make it much more easily extensible than the StandaloneInputModule. The basic design is that you inherit from UIInputModule, create ____Model structs, update those on a per-frame basis, and pass them back down to the UIInputModule, which converts them into actual UI events. I wanted to separate the input sources from the internal uGUI concepts and event processing, to untangle hooking uGUI up to new sources, and I’m pretty happy with the result. If you want to start on your own, I’d suggest starting from that point.
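As a rough illustration of that split (not the actual package API — TrackedPointerModel, ModelDrivenInputModule, and the other names here are made up for the sketch):

using UnityEngine;
using UnityEngine.EventSystems;

// Purely illustrative: the input-source layer only fills in a plain state
// struct each frame, and the module converts that state into uGUI events.
public struct TrackedPointerModel
{
    public Vector3 position;       // world-space position of the tracked device
    public Quaternion orientation; // world-space orientation
    public bool select;            // "trigger pressed" this frame
}

public abstract class ModelDrivenInputModule : BaseInputModule
{
    // Derived classes update the model from their input source each frame.
    protected abstract void UpdateModel(ref TrackedPointerModel model);

    TrackedPointerModel m_Model;

    public override void Process()
    {
        UpdateModel(ref m_Model);
        ApplyModelToUI(ref m_Model);
    }

    void ApplyModelToUI(ref TrackedPointerModel model)
    {
        // Here the state would be turned into pointer enter/exit, press and
        // release events via a raycaster that understands world-space rays.
    }
}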

Raycasters: You are bang on with that one! If you look at the new Input System’s TrackedDeviceRaycaster, I extended the pointerEventData type and created a new raycaster that can handle 3D coordinates not tied to any specific camera. It should look quite similar to a few other solutions out in the field. The tricky part was writing the extended pointerEventData so it doesn’t get picked up by the 2D Graphic Raycasters. It is graphics-only (no physics raycaster), but it does handle physics and graphics occlusion. That raycaster works well for all uGUI types except the dropdown, due to how the dropdown does its drop-down effect, but that is being fixed and back-ported to 2019.1.
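A very rough sketch of that idea, with made-up names (TrackedPointerEventData, WorldRayGraphicRaycaster) and none of the sorting or occlusion handling the real TrackedDeviceRaycaster does:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Extended event data carrying a world-space ray instead of a screen point.
public class TrackedPointerEventData : PointerEventData
{
    public Ray worldRay;
    public TrackedPointerEventData(EventSystem eventSystem) : base(eventSystem) { }
}

// A raycaster that intersects that ray with the graphics of one world-space canvas.
[RequireComponent(typeof(Canvas))]
public class WorldRayGraphicRaycaster : BaseRaycaster
{
    Canvas m_Canvas;
    public override Camera eventCamera => null; // no camera involved

    protected override void Awake()
    {
        base.Awake();
        m_Canvas = GetComponent<Canvas>();
    }

    public override void Raycast(PointerEventData eventData, List<RaycastResult> resultAppendList)
    {
        // Only handle the extended event data; the 2D raycasters ignore it.
        if (!(eventData is TrackedPointerEventData tracked))
            return;

        foreach (Graphic graphic in GraphicRegistry.GetGraphicsForCanvas(m_Canvas))
        {
            if (!graphic.raycastTarget)
                continue;

            var rectTransform = graphic.rectTransform;
            var plane = new Plane(-rectTransform.forward, rectTransform.position);
            if (!plane.Raycast(tracked.worldRay, out float distance))
                continue;

            // Check whether the hit point lies inside the graphic's rect.
            Vector3 worldPoint = tracked.worldRay.GetPoint(distance);
            Vector2 localPoint = rectTransform.InverseTransformPoint(worldPoint);
            if (!rectTransform.rect.Contains(localPoint))
                continue;

            resultAppendList.Add(new RaycastResult
            {
                gameObject = graphic.gameObject,
                module = this,
                distance = distance,
                worldPosition = worldPoint,
                depth = graphic.depth,
            });
        }
    }
}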

ETA: I want to stress that I’m no authority on this one, but I do know that we are scheduling approximately one month out (it’s packaged with a few other thingamabobs too). That does not take into account business decisions, surprises, etc. that may come up, so this is a thought and not a promise.

Hope all this info helps!

3 Likes

@StayTalm_Unity , thanks for the information and the hard work. This sounds great.

1 Like

After being told to use Curved UI a LOT, it would be nice to just see world-space canvas UI working in VR, especially as a lot of Unity courseware suggests or alludes to native UI in world space with VR being a straightforward endeavour.

Keep us posted please!

2 Likes

Is there any update on this? I keep downloading new alpha versions of Unity in the hope of seeing this implemented.

Still on its way.
I wish I could say more, but I can’t. I’m going to stay true to my word and ping this thread once it’s available.
I can say it will be a separate package, so it won’t need a specific alpha version.

1 Like

@StayTalm_Unity hi, is there any update on this? In addition, will there be any built-in solutions for interacting with UI through a laser pointer (like in SteamVR)?

1 Like

Create a GameObject called Line which has the LineRenderer for the pointer and a camera that is disabled and has an FOV of 1. Then use this code in a custom input module. CasterRight is the camera on the right controller’s pointer; I do this for both left and right hands. Use this as a replacement for the standard input module. I also put a script on each canvas that adds itself to the Canvases list (a sketch of such a script follows the code below). Most of this is derived from VR Andrew’s tutorial video, which explains how to do it.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Note: the class name below is a placeholder; ControllerManager, ControllerState
// and Buttons come from my own controller wrapper, not from Unity.
public class VRInputModule : BaseInputModule
{
    public Camera CasterRight;
    public Camera CasterLeft;
    [HideInInspector]
    public List<Canvas> Canvases = new List<Canvas>();
    public GameObject RightCurrent;
    public GameObject LeftCurrent;
    GameObject RightPressed;
    GameObject LeftPressed;
    public PointerEventData RightData { get; private set; }
    public PointerEventData LeftData { get; private set; }

    protected override void Start()
    {
        base.Start();
        RightData = new PointerEventData(eventSystem);
        LeftData = new PointerEventData(eventSystem);
    }

    public override void Process()
    {
        ProcessRight();
        // ProcessLeft() mirrors ProcessRight() with the left-hand fields and is omitted here.
    }

    private void ProcessRight()
    {
        if (ControllerManager.Right.GetState() != ControllerState.Unavailable && CasterRight != null)
        {
            // set the camera for all canvases to the right-hand caster camera
            foreach (Canvas canvas in Canvases)
            {
                canvas.worldCamera = CasterRight;
            }
            // reset data and point at the centre of the caster camera
            RightData.Reset();
            RightData.position = new Vector2(CasterRight.pixelWidth / 2, CasterRight.pixelHeight / 2);
            // raycast
            eventSystem.RaycastAll(RightData, m_RaycastResultCache);
            RightData.pointerCurrentRaycast = FindFirstRaycast(m_RaycastResultCache);
            RightCurrent = RightData.pointerCurrentRaycast.gameObject;

            // clear raycast cache
            m_RaycastResultCache.Clear();
            // handle hover
            HandlePointerExitAndEnter(RightData, RightCurrent);
            // handle press
            if (ControllerManager.Right.IsPressed(Buttons.Trigger))
            {
                OnPressRight();
            }
            // handle release
            else
            {
                OnReleaseRight();
            }
        }
    }

    void OnPressRight()
    {
        // set press raycast
        RightData.pointerPressRaycast = RightData.pointerCurrentRaycast;
        // check for an object hit and send the down event
        GameObject newPointerPress = ExecuteEvents.ExecuteHierarchy(RightCurrent, RightData, ExecuteEvents.pointerDownHandler);
        // if there is no down handler, try to get a click handler instead
        if (newPointerPress == null)
        {
            newPointerPress = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
        }
        if (RightPressed != null && RightCurrent == null)
        {
            // if we exit the element while the button is still pressed, send the up event
            ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerUpHandler);
        }
        // set RightData
        RightData.pressPosition = RightData.position;
        RightData.pointerPress = newPointerPress;
        RightData.rawPointerPress = RightCurrent;
        RightPressed = newPointerPress; // save the pressed element for later use on release
    }

    void OnReleaseRight()
    {
        // send the up event to whatever received the press
        ExecuteEvents.Execute(RightData.pointerPress, RightData, ExecuteEvents.pointerUpHandler);

        // if we are still over the element we first pressed, send the click event
        GameObject pointerUpHandler = ExecuteEvents.GetEventHandler<IPointerClickHandler>(RightCurrent);
        if (RightPressed != null && pointerUpHandler == RightPressed)
        {
            ExecuteEvents.Execute(RightPressed, RightData, ExecuteEvents.pointerClickHandler);
            Debug.Log("Clicked - " + RightPressed);
        }
        // clear the selected gameobject
        eventSystem.SetSelectedGameObject(null);
        // reset RightData
        RightData.pressPosition = Vector2.zero;
        RightData.pointerPress = null;
        RightData.rawPointerPress = null;
        RightPressed = null;
    }
}
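
A minimal sketch of the per-canvas registration script mentioned above, assuming the module class is called VRInputModule (the placeholder name used in the code above) and lives on the EventSystem GameObject:

using UnityEngine;
using UnityEngine.EventSystems;

// Each world-space canvas adds itself to the input module's Canvases list
// so the module can point its caster cameras at them. The module type name
// (VRInputModule) is a placeholder matching the class above.
public class CanvasRegistration : MonoBehaviour
{
    void OnEnable()
    {
        var module = EventSystem.current != null
            ? EventSystem.current.GetComponent<VRInputModule>()
            : null;

        var canvas = GetComponent<Canvas>();
        if (module != null && canvas != null && !module.Canvases.Contains(canvas))
            module.Canvases.Add(canvas);
    }

    void OnDisable()
    {
        var module = EventSystem.current != null
            ? EventSystem.current.GetComponent<VRInputModule>()
            : null;

        var canvas = GetComponent<Canvas>();
        if (module != null && canvas != null)
            module.Canvases.Remove(canvas);
    }
}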

David Watt
Hollow World Games

Hi David,

Thanks for sharing this; it could be a good workaround to use in the meantime until the official Unity VR UI package is released. I will have to do some performance testing, as I’m wondering what the performance implications are of having two additional cameras rendering the UI at almost all times, especially on lower-end platforms like the Oculus Quest/Go, where all the draw calls cost you big time.

The cameras are disabled so they don’t render anything; they are only used for the raycast.

David Watt
Hollow World Games

1 Like

I followed the tutorial that you linked and got it working fine too, thanks for the tip. However it would obviously be preferable to use the official Unity solution once that is finally released (feels like it’s taking ages).

Having to set the world-space camera for events is not fun, and hopefully the official Unity solution avoids that limitation for left/right-hand UI pointers. @StayTalm_Unity, we wait patiently for your solution to come through the pipeline; please let us know when you can share more, as our own projects are being delayed by this.

I know it is taking a long time.
I’m waiting too.
But I will post info once I can.

2 Likes

For anyone wondering what the new XR Interaction package is like, I had a go with it at Unite Copenhagen and passed on some feedback about it, including the slow lerping of held object positions (it feels laggy) and some issues with UI on buttons etc. deactivating once the UI raycaster is no longer pointing at the UI.

It also seems some work is being done on the input side for it, as the current state of XR input leaves you with three options: the legacy Input Manager, XR Input in a C# script, or the new Input System coming in 2019.3/2020.

There is a sneak peek of it here

2 Likes

Thanks robyer1 :slight_smile: If you have any other feedback, drop me a line!

Yep, we’re going to launch with a simple input implementation to get an initial version out, and we’ll expand the input as soon as I can finish working on it after launch :slight_smile:

1 Like