Hi, I downloaded the latest UIToolkit packages and tested the RenderTexture (Runtime) example. It works as it should. It looks great, but it only works with the mouse. I'd like to get it to work with a VR controller. For that I need a raycast from a VR controller to the RenderTexture to get the position of the ray on the UI, which I got working, but I'm completely lost on how to make the panel use this position instead of the mouse position.
Should I be using PointerEventBase? InputWrapper? IMouseEvent? Or somehow expose the position as a Vector2 for the new InputActions?
Anyone got any working examples of this? I’d be set with a GameObject raycast being detected by UIToolkit RenderTexture.
edit: got it working. Turns out it was as simple as replacing the TargetCamera on the example script.
@mautere I’m looking at UI Toolkit in Unity 2021 and I see the option for a render texture, but no way to specify TargetCamera. I know that world-space UIs are not supposed to be ready, but I’m curious: was it there and then removed?
To the best of my knowledge, it’s not ready, as in… there’s no “official support” for world-space UIs, but the render texture is a way to get it working for the time being. For that you’ll do a physics raycast and get the UV coords etc. There are downloadable samples available (although they’re not VR) in the package manager on the toolkit asset.
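For anyone landing here later, a minimal sketch of the raycast-to-UV approach described above. This assumes the panel renders into a RenderTexture that is mapped onto a quad with a MeshCollider (textureCoord only works with MeshColliders), and a `rayOrigin` transform standing in for the VR controller pose; the class and field names are mine, not from the official sample:

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class VRPanelRaycaster : MonoBehaviour
{
    public PanelSettings panelSettings; // the panel settings asset that targets the RenderTexture
    public Transform rayOrigin;         // VR controller pose (hypothetical name)

    void OnEnable()
    {
        // Route the panel's picking through our raycast instead of the mouse position.
        panelSettings.SetScreenToPanelSpaceFunction(ScreenToPanel);
    }

    Vector2 ScreenToPanel(Vector2 _)
    {
        // NaN tells the panel the pointer is off-panel.
        var invalid = new Vector2(float.NaN, float.NaN);

        if (!Physics.Raycast(rayOrigin.position, rayOrigin.forward, out RaycastHit hit, 100f))
            return invalid;
        if (hit.collider is not MeshCollider)
            return invalid; // textureCoord is only valid for MeshColliders

        // Convert the hit UV to panel pixels; V is flipped relative to panel space.
        Vector2 uv = hit.textureCoord;
        return new Vector2(uv.x * panelSettings.targetTexture.width,
                           (1f - uv.y) * panelSettings.targetTexture.height);
    }
}
```

The key piece is `PanelSettings.SetScreenToPanelSpaceFunction`, which the RenderTexture sample also uses: every frame the panel asks this function where the pointer is, so returning the raycast-derived position makes hover/click events work without touching the event system.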