Handling pointer events on render texture

@ericprovencher but can you at least test it? I can attach the project by creating an issue in the issue tracker.

I just want to know whether what I'm trying to do is possible somehow. I'm not asking for world-space support like the world-space canvas in uGUI, but for support through a render texture, which I can see is something supported at least outside of XR. After all, the 3D UI Toolkit use case with a render texture via keyboard and mouse has worked since 2021 thanks to @antoine-unity's example: https://discussions.unity.com/t/852531/6

Apart from that, I have tried this other example (to attempt something else), but it does not let me interact with two controllers at the same time:

https://gist.github.com/RoxDevvv/83215ae2fe45c5e7416521fe1697fb03

It is as if the SetScreenToPanelSpaceFunction callback is not invoked when it should be, and the left and right controllers conflict with each other when I use them simultaneously.

Thanks.

Consider giving this asset a try. I purchased it, but life got in the way and I never had a chance to try it out:

Thanks, I have already seen that asset, but I would like to solve this without a paid third-party asset that does more than I need (curvature, etc.).

Indeed, at the moment the SetScreenToPanelSpaceFunction method is not sufficient to correctly support VR. More logic is needed to correctly interpret input devices on these platforms and send the appropriate UI events.
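For context, a single-pointer override in the style of @antoine-unity's example typically looks like the sketch below (names such as `rayOrigin` and `panelSettings` are placeholders, not from this thread). Because PanelSettings exposes only one such function per panel, a second controller installing an equivalent callback simply replaces the first ray's mapping, which matches the left/right conflict described above:

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class SingleRayPanelInput : MonoBehaviour
{
    public PanelSettings panelSettings; // panel drawing to the render texture
    public Transform rayOrigin;         // one XR controller pose (placeholder)

    void OnEnable()
    {
        // Only one function per panel: registering this from a second
        // controller would overwrite the first one's mapping.
        panelSettings.SetScreenToPanelSpaceFunction(ScreenToPanel);
    }

    Vector2 ScreenToPanel(Vector2 _)
    {
        var invalid = new Vector2(float.NaN, float.NaN);
        if (!Physics.Raycast(rayOrigin.position, rayOrigin.forward,
                             out RaycastHit hit))
            return invalid;

        // Assumes the hit quad's material uses the panel's render texture.
        if (hit.collider.GetComponent<MeshRenderer>()?.sharedMaterial.mainTexture
            is not RenderTexture rt)
            return invalid;

        Vector2 uv = hit.textureCoord;
        uv.y = 1f - uv.y; // panel space runs top-down
        return new Vector2(uv.x * rt.width, uv.y * rt.height);
    }
}
```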

I imagine it might be possible to make it work by firing events into the UI yourself, but we haven't tried this ourselves, as we are already looking at supporting XR with the actual world-space implementation that is in progress (which doesn't require the SetScreenToPanelSpaceFunction override).
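The "firing events yourself" route is untested in this thread, but a rough sketch might synthesize mouse events and dispatch them straight into the panel's visual tree, bypassing the single screen-to-panel mapping entirely. Everything below is an assumption rather than a confirmed recipe; `panelPos` is a hypothetical hit point already converted to panel coordinates (e.g. from a per-controller raycast):

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public static class PanelEventInjector
{
    // Hypothetical helper: simulate a click at panelPos (panel coordinates)
    // on the panel owned by `document`. Untested sketch.
    public static void Click(UIDocument document, Vector2 panelPos)
    {
        var root = document.rootVisualElement;

        using (var down = MouseDownEvent.GetPooled(panelPos, 0, 1, Vector2.zero))
            root.SendEvent(down);

        using (var up = MouseUpEvent.GetPooled(panelPos, 0, 1, Vector2.zero))
            root.SendEvent(up);
    }
}
```

Since each synthesized event carries its own position, two controllers could in principle drive independent event streams without fighting over one mapping function, though hover, drag, and pointer-capture behavior would need extra care.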

@antoine-unity but that’s at least two years away… I expected some workaround to unlock this use case with 2022 LTS or 2023/6…

“I imagine it might be possible to make it work by firing events into the UI yourself”

Can you explain it more? Thanks.

@antoine-unity it’s been several months since my last post, which you never got around to replying to.

I’m really looking forward to moving as much UI as I can off uGUI and redoing it in UI Toolkit, but if you don’t reply to that post I can’t even try…

Hi there! I am the original poster (I no longer have access to the account, which is associated with my university). I initially wrote this for my master’s thesis. Since then things have moved forward, and the thesis is now a (pre-release) extension called XRUI. With the lost account I had completely forgotten about this thread, and I never got to thank @antoine-unity for their example, which helped me a lot back then. So now I can say thanks in due form :slight_smile:

@bdovaz Maybe you can scroll through the code to find what you need. It’s been a while since I dived into this, but I think you may be particularly interested in this:

You can also look at the XR Interactions section and acknowledgements section (see gist by katas94) of the README.

With this, you should be able to use UI Toolkit with XRI (or any other XR SDK; that said, I had a weird offset when using the then-current Oculus SDK with a Quest 2, unsure if that is still the case) and point at your elements with an AR/VR controller. In my projects I managed to use Quest 2 and 3 controllers, as well as HoloLens 2 + hand tracking.

I hope this helps!