UIToolkit WorldSpace UI with raycast (for VR)

Hi, I downloaded the latest UIToolkit packages and tested the RenderTexture (Runtime) example. It works as it should and looks great, but it only works with the mouse. I’d like to get it working with a VR controller. For that I need a raycast from the controller to the RenderTexture to get the position where the ray hits the UI, which I have working, but I’m completely lost on how to make the panel use this position instead of the mouse position.

Should I be using PointerEventBase? InputWrapper? IMouseEvent? Or somehow expose the position as a Vector2 for the new InputActions?

Does anyone have a working example of this? I’d be set with a raycast against a GameObject being detected by the UIToolkit RenderTexture.

Edit: got it working. It turned out to be as simple as replacing the TargetCamera on the example script.
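For anyone who finds this thread later: one way to wire a controller in is to override the panel’s screen-to-panel mapping with PanelSettings.SetScreenToPanelSpaceFunction and feed it the UV from your own physics raycast instead of the mouse. Here is a minimal sketch under that assumption; the class and field names (controller, panelSettings, the quad with a MeshCollider) are mine for illustration, not the sample’s code.

```csharp
using UnityEngine;
using UnityEngine.UIElements;

// Sketch only: map a VR controller ray onto a UI Toolkit panel that renders to a
// RenderTexture. Assumes the texture is shown on a quad with a MeshCollider
// (RaycastHit.textureCoord needs one) and that panelSettings is the PanelSettings
// asset driving that panel.
public class ControllerPanelRaycaster : MonoBehaviour
{
    public PanelSettings panelSettings;  // panel rendering into the RenderTexture
    public Transform controller;         // VR controller pose (assumed field)
    public float maxDistance = 10f;

    void OnEnable()
    {
        // UI Toolkit calls this to translate a screen position into panel space.
        // Here the incoming mouse position is ignored and the controller ray is used instead.
        panelSettings.SetScreenToPanelSpaceFunction(_ =>
        {
            var invalid = new Vector2(float.NaN, float.NaN); // NaN means "no hit" to the panel

            var ray = new Ray(controller.position, controller.forward);
            if (!Physics.Raycast(ray, out RaycastHit hit, maxDistance))
                return invalid;

            // textureCoord is the UV on the quad; flip Y and scale to the panel's
            // texture size, the same kind of mapping the sample does for the mouse.
            Vector2 uv = hit.textureCoord;
            RenderTexture rt = panelSettings.targetTexture;
            return new Vector2(uv.x * rt.width, (1f - uv.y) * rt.height);
        });
    }
}
```

Note this only covers the position half; controller button presses still need to be turned into click/pointer events separately.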


@mautere I’m looking at UI Toolkit in Unity 2021 and I see the option for a render texture, but no way to specify a TargetCamera. I know that world-space UIs aren’t supposed to be ready yet, but I’m curious: was it there and then removed?

To the best of my knowledge it’s not ready, as in there’s no “official support” for world-space UIs, but the render texture is a way to get it working for the time being. For that you do a physics raycast and get the UV coordinates, etc. There are downloadable samples available (although they’re not VR) in the Package Manager under the UI Toolkit package.


For the benefit of the internets: com.unity.ui/Samples~/Runtime/Rendering/UITextureProjection.cs at 9837807be0ff6bdf12df2e2bed0cbcef7936e3d9 · needle-mirror/com.unity.ui · GitHub

Please stop necroing old threads to advertise your asset. It’s against the forum rules.

I thought this was a win-win, but I did miss how old this thread was. Thanks for pointing it out; I’ll pay better attention to that.

Edit: After reviewing the Code of Conduct again, I went ahead and deleted my post on this thread from a few hours ago.
