Handling pointer events on render texture

Hi there,

I’m currently adapting my UI for VR, so I’m using a render texture on a quad inside the game world to render my UI elements.

I was wondering how I could detect pointer events on the render texture in order to interact with my UI with a VR controller.

Would it be a good approach to convert world coordinates of a RaycastHit on the quad to local coordinates of the texture and then dispatch an event? What about hovering?

Any input appreciated! 🙂

Hi! Can you confirm you are using UI Toolkit and not the UGUI/Canvas system?

Yep!

Are you using the UI Toolkit package?

I’m using the version of UI Toolkit that is bundled with Unity 2021.2.

Hi, I have converted a sample from the preview package so that it works with 2021.2.

It shows how to render the UI into a texture (through the PanelSettings configuration), how to apply that texture to 3D objects, and how to remap coordinates from those objects’ surfaces to UI space (the UITextureProjection script).

I can’t comment on the specifics of getting VR input to work. You would likely need an EventSystem in your scene with the latest version of the new Input System package (pre-release). As long as there is a configuration for Pointer Events in there, the input should work.
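For reference, the core of the remapping is roughly the sketch below. This is a simplified, untested version of the idea behind the UITextureProjection script, not the exact code from the package, and the field names (targetPanel, mainCamera, uiLayerMask) are placeholders. It raycasts from the camera through the pointer position, reads the UV of the hit on the quad (which needs a MeshCollider), and converts it to panel pixels, flipping y because panel space uses a top-left origin:

using UnityEngine;
using UnityEngine.UIElements;

public class RenderTextureCoordinateRemap : MonoBehaviour
{
    public PanelSettings targetPanel;  // the PanelSettings asset rendering into a RenderTexture
    public Camera mainCamera;
    public LayerMask uiLayerMask;      // layer of the quad that displays the UI texture

    void OnEnable()
    {
        targetPanel.SetScreenToPanelSpaceFunction(ScreenToPanelSpace);
    }

    Vector2 ScreenToPanelSpace(Vector2 screenPosition)
    {
        var invalidPosition = new Vector2(float.NaN, float.NaN);

        // Cast a ray from the camera through the pointer position.
        Ray ray = mainCamera.ScreenPointToRay(screenPosition);
        if (!Physics.Raycast(ray, out RaycastHit hit, 100f, uiLayerMask))
            return invalidPosition; // NaN tells UI Toolkit the pointer is not over the panel

        // The quad's MeshCollider provides UVs in [0,1]; scale them to texture pixels
        // and flip y, since UI Toolkit panels use a top-left origin.
        RenderTexture targetTexture = targetPanel.targetTexture;
        Vector2 uv = hit.textureCoord;
        return new Vector2(uv.x * targetTexture.width,
                           (1f - uv.y) * targetTexture.height);
    }
}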

Attachment: RenderTextureExample.unitypackage (8.43 KB)


Hello! Sorry for bumping this thread, and thanks for the example, it works perfectly! Just to make sure, is this the recommended way to implement in-game/world-space GUI?


If you need a 3D perspective and other scene rendering features, then yes, that is the only (and recommended) way to go with UI Toolkit at the moment.


I have used your package, which worked great with the Input System UI Input Module. But when I added an XROrigin, set it up, and switched to the XR UI Input Module, it stopped working.
I tried adding a world-space Canvas and that works in VR.
I hope there will be an official way to use UI Toolkit in VR.
For my VR game, the player can use an in-game phone, and building the phone’s UI with UI Toolkit would be a lot cooler, more realistic, and, I believe, a lot lighter to render.

Actually, I would prefer to do that using a custom collider at the tip of the index finger. When this collider hits the phone screen (which displays the UI Toolkit render texture), I want to send that event to UI Toolkit. I guess the only way to do it is to write my own event system or extend the XR UI Input Module, right?
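Something like the rough, untested sketch below is what I have in mind for at least finding which element the fingertip is touching (the field names are placeholders); actually dispatching the pointer events is the part I’m missing:

using UnityEngine;
using UnityEngine.UIElements;

public class FingertipPicker : MonoBehaviour
{
    public UIDocument uiDocument;       // document whose PanelSettings render into the phone texture
    public RenderTexture renderTexture; // the texture shown on the phone screen quad
    public float touchDistance = 0.02f; // how far past the fingertip to probe

    void OnTriggerStay(Collider phoneScreen)
    {
        // Probe from slightly behind the fingertip towards the screen, so the ray
        // does not start inside the screen's MeshCollider.
        var ray = new Ray(transform.position - transform.forward * 0.05f, transform.forward);
        if (!phoneScreen.Raycast(ray, out RaycastHit hit, 0.05f + touchDistance))
            return;

        // Convert the hit UV to panel pixels (top-left origin).
        Vector2 uv = hit.textureCoord;
        var panelPosition = new Vector2(uv.x * renderTexture.width,
                                        (1f - uv.y) * renderTexture.height);

        // Pick() returns the topmost element under that position, or null.
        VisualElement picked = uiDocument.rootVisualElement.panel.Pick(panelPosition);
        if (picked != null)
            Debug.Log($"Fingertip over: {picked.name}");
    }
}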

I think you will want to look into this.
Attachment: whack a mole 3d.gif

Attachment: renderTexture example.unitypackage (510 KB)


I was really excited when I saw the title that I had found an answer to my question, but this seems to be doing the exact opposite of what I’d like to do.

My whole app uses UI Toolkit as an overlay, and I was wondering if there’s a way to interact with the 3D world (or the old UI system, for example) through a render texture embedded in my UI Toolkit app?

Then you need to look at CameraTransformWorldToPanel in RuntimePanelUtils to go from world space to a position on the UI.

To know which 3D element is “under the mouse”, you would use a regular raycast.
To go from the UI to the 3D world, you usually take the mouse coordinate during the interaction and use Camera.ScreenToWorldPoint.
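Roughly, both directions could look like the untested sketch below (the UIDocument/Camera references and method names are just placeholders to illustrate the calls):

using UnityEngine;
using UnityEngine.UIElements;

public class OverlayWorldBridge : MonoBehaviour
{
    public UIDocument uiDocument;
    public Camera mainCamera;

    // World -> UI: place a UI element over a world-space object.
    public void PositionMarkerOver(Transform worldObject, VisualElement marker)
    {
        Vector2 panelPos = RuntimePanelUtils.CameraTransformWorldToPanel(
            uiDocument.rootVisualElement.panel, worldObject.position, mainCamera);

        marker.style.left = panelPos.x;
        marker.style.top = panelPos.y;
    }

    // UI -> world: find the 3D object under the pointer during an interaction.
    // (For a fixed depth instead of a raycast, Camera.ScreenToWorldPoint works too.)
    public GameObject PickWorldObject(Vector2 screenPosition)
    {
        Ray ray = mainCamera.ScreenPointToRay(screenPosition);
        return Physics.Raycast(ray, out RaycastHit hit) ? hit.collider.gameObject : null;
    }
}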

@SimonDufour @antoine-unity We have tested your example and it works perfectly with a mouse, but in XR with controllers it does not work well, because the Vector2 that arrives in the following function has very strange negative values:

https://docs.unity3d.com/ScriptReference/UIElements.PanelSettings.SetScreenToPanelSpaceFunction.html

I assume this is a bug, isn’t it? It seems that the part of the code that checks the eventSource only takes touch, pen, and mouse into account, and I guess that’s part of the problem.

I can only see this code in the 2023.x branches (I am on 2022 LTS), so I don’t know whether it is the same there or not.

https://github.com/Unity-Technologies/UnityCsReference/blob/1b4b79be1f4bedfe18965946323fd565702597ac/Modules/UIElements/Core/DefaultEventSystem.InputForUIProcessor.cs#L132C50-L132C62

Is there any way to work around this problem? We have tried ignoring that Vector2 and instead tracking, at every moment, which XR controller the Input System is processing, so that we can switch to that controller’s pivot transform.

The rest of the missing code is from your own example:

// Requires: using UnityEngine; using UnityEngine.InputSystem; using UnityEngine.InputSystem.XR;
// These fields and methods live in the same MonoBehaviour that configures the panel.
private XRController currentController = null;

void OnEnable()
{
    // Remember which XR controller produced the latest input event.
    // Note: any event from a non-XR device (mouse, keyboard, ...) resets this to null.
    InputSystem.onEvent += (eventPtr, device) =>
    {
        if (device is XRController controller)
            currentController = controller;
        else
            currentController = null;
    };

    // Replace the default screen-to-panel mapping with a controller raycast.
    panel.SetScreenToPanelSpaceFunction(ScreenCoordinatesToRenderTexture);
}

// Ignores the incoming screenPosition and instead raycasts from the pivot of
// whichever controller sent the last input event.
private Vector2 ScreenCoordinatesToRenderTexture(Vector2 screenPosition)
{
    if (currentController == null)
        return new Vector2(float.NaN, float.NaN); // NaN = pointer not over the panel

    Transform pointerPivot = XRController.leftHand == currentController
        ? myLeftControllerPivot
        : myRightControllerPivot;

    Ray ray = new(pointerPivot.position, pointerPivot.forward);

    if (Physics.Raycast(ray, out RaycastHit hitInfo))
    {
        // GetPositionFromHit comes from the RenderTextureExample package and
        // converts the hit's texture coordinates to panel space.
        return GetPositionFromHit(hitInfo);
    }

    return new Vector2(float.NaN, float.NaN);
}

What happens to us is that with only one controller active it works, but with both active at the same time it is as if they were competing within the same frame, and only one of them gets processed (in our case the right one).

Is there any solution to this problem? Thank you.

Hi bdovaz,

Support for XR in Unity 2022 and below is only available via the Input System package. If you add an EventSystem to your scene and add the InputSystemUIInputModule to it, then you should be able to reassign any of your standard input actions to the device that best suits your needs, including, I believe, XR.

Calling SetScreenToPanelSpaceFunction like you did (ignoring the screenPosition argument) might get you the right XR coordinates with a few fixes, but you would still only get events when the mouse, touch, or pen is also moving or clicking, unless you use the InputSystemUIInputModule actions mentioned above. If you do, then you should be able to use the screenPosition argument sent to you by the XR input directly.

In terms of the weird negative values, that’s an unfortunate consequence of the y coordinate being encoded from the top of the screen vs. the bottom in different parts of Unity. UI Toolkit being a replacement for IMGUI more than anything else, we adopted the top-left origin, whereas uGUI’s EventSystem, living in the GameObject world, uses bottom-left. If you read the ScreenCoordinatesToRenderTexture method included in the unityPackage Simon sent you, you should have a good model of where to flip the y axis to get the values you need.
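To illustrate (a minimal sketch, not the exact code from the package; the helper names are mine), the two conversions typically look like this:

using UnityEngine;

static class PanelSpaceConversions
{
    // Convert a RaycastHit UV on the quad (bottom-left origin, 0..1 range) to panel
    // pixels (top-left origin) for a panel that renders into 'renderTexture'.
    public static Vector2 UvToPanel(Vector2 uv, RenderTexture renderTexture)
    {
        return new Vector2(uv.x * renderTexture.width,
                           (1f - uv.y) * renderTexture.height);
    }

    // Convert a bottom-left screen coordinate (uGUI/EventSystem convention) to the
    // top-left convention used by UI Toolkit.
    public static Vector2 ScreenBottomLeftToTopLeft(Vector2 screenPosition)
    {
        return new Vector2(screenPosition.x, Screen.height - screenPosition.y);
    }
}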

I hope this helps!

Benoit

Have you tested this? I don’t see any thread or post about this anywhere.

@uBenoitA and what if we use XRI? We need to use:

https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.5/api/UnityEngine.XR.Interaction.Toolkit.UI.XRUIInputModule.html

Is it still compatible?

I must say, this goes beyond my expertise. I don’t have XR hardware available to try it. All we do in UI Toolkit is relay the input events we get from the Input System actions. If it’s possible to use XRI with uGUI by adding an EventSystem with the appropriate InputModule component for XR, then in principle that same setup should work with UI Toolkit too.

Unfortunately I can’t help you more than that. If that doesn’t help, maybe you can get some more answers by asking in the Input or the XR forums too?

@uBenoitA do you know how I should configure it so that XRI is compatible with UI Toolkit (through interactors, in my case ray / direct) using @SimonDufour’s approach (https://forum.unity.com/threads/handling-pointer-events-on-render-texture.1158272/#post-8708277)? Although you say to ask in the Input or XR forums, my problem is actually with UI Toolkit, and that’s why I ask here. Below I will mention members of the XRI team who I see are active on the forum, in case they can help.

The XRI input module I am using (the default one, only changing Active Input Mode to Input System Actions):

Attachment: upload_2024-3-9_10-32-4.png (screenshot of the component settings)

I’m not sure whether I have to map something in the UI actions, because by default (in the InputActionAsset that comes with the Starter example) those UI actions are only configured for mouse, pen, and touch. But since the component has that “input devices” section with XR input, the doubt naturally arises.

And the documentation for this component is not very extensive; it does not go into depth on each of its options.

https://docs.unity3d.com/Packages/com.unity.xr.interaction.toolkit@2.5/manual/ui-setup.html

cc @VRDave_Unity @ericprovencher @unity_andrewc

Please, no radio silence; an answer would be appreciated.

Unlocking this use case (XR interaction with UI Toolkit through a render texture) is very important for a lot of people while we wait for the real world-space support coming in 2025/2026…

cc @uBenoitA @VRDave_Unity @ericprovencher @unity_andrewc


The values I am seeing are very weird; something has to be going on with the input. That’s why I need help configuring it correctly, in case it’s not a Unity bug, because, as @uBenoitA says, he hasn’t really tested it, so we don’t know whether it works or not.

I’m happy you’re keen on integrating UITK support with XRI. Unfortunately, given that we still do not have official world-space UI support, it’s not something our team has gotten very far in exploring, and I cannot assist with the issue you’re observing.

I will say that I’ve started working with @uBenoitA on exploring how this integration will look when we do support world-space UI, and on helping shape the data and input models for 3D interaction with UITK, but it’s very early and we don’t have plans for official support via the render texture route you’ve taken.