I’m trying to assign the position of a button dynamically during runtime.
To get a screen point, one tutorial was using:
CameraTransformWorldToPanel()
This is a little odd, since one of its parameters is a panel, while I want to position the element in relation to the screen, not a panel. It appears I’d need to create a wrapper VisualElement that is essentially just the screen rect (rootVisualElement doesn’t work). But even then, the resulting position is incorrect when I use it to set the position of the button transform within that panel.
CameraTransformWorldToPanel() documentation is of no help at all.
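For reference, here is roughly what I was attempting. The call lives on RuntimePanelUtils; the component, field, and element names here are my own (untested sketch):

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class WorldToPanelExample : MonoBehaviour
{
    public UIDocument uiDocument;  // assigned in the Inspector
    public Transform target;       // the world-space object I want the button over

    void LateUpdate()
    {
        VisualElement root = uiDocument.rootVisualElement;

        // Signature:
        //   Vector2 CameraTransformWorldToPanel(IPanel panel, Vector3 worldPosition, Camera camera)
        // Converts a world-space position into the panel's coordinate space.
        Vector2 panelPos = RuntimePanelUtils.CameraTransformWorldToPanel(
            root.panel, target.position, Camera.main);

        Debug.Log(panelPos);
    }
}
```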
After some tinkering, it appears Unity applies the position on top of any existing absolute position, rather than simply replacing it, so the existing values all have to be 0 or auto. This certainly complicates development if every dynamic element has to sit in the upper right corner in edit mode.
Alternatively, perhaps I’m not supposed to be setting transform.position at all and should instead use some other position-related property?
I hope I understand your problem correctly: you are trying to position UI Toolkit elements at runtime with a script?
I don’t think transform.position is the way to go, since UI Toolkit has its own (sort of) detached positioning logic through its layout system.
Have you tried VisualElement.style.top and VisualElement.style.left?
Just be careful: you will get different results depending on whether the element is positioned absolutely or relatively. In your case it sounds like you need absolute positioning, since you just want to be able to place your buttons arbitrarily.
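A rough sketch of what I mean, combining the world-to-panel conversion with style.left/style.top. I’m assuming a UIDocument containing a Button named "my-button"; all names here are made up, so adjust to your setup:

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class PositionButton : MonoBehaviour
{
    public UIDocument uiDocument;  // assigned in the Inspector
    public Transform target;       // world object to place the button over
    Button button;

    void OnEnable()
    {
        button = uiDocument.rootVisualElement.Q<Button>("my-button");
        // Take the button out of the flex layout so left/top are honored.
        button.style.position = Position.Absolute;
    }

    void LateUpdate()
    {
        // Convert the target's world position into the panel's coordinate space.
        Vector2 panelPos = RuntimePanelUtils.CameraTransformWorldToPanel(
            button.panel, target.position, Camera.main);
        button.style.left = panelPos.x;
        button.style.top = panelPos.y;
    }
}
```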
But I still need to center the element at that position, and VisualElement.style.width doesn’t return a value. (Neither does VisualElement.style.width.value.)
It appears I need to use resolvedStyle to actually read values, and I don’t see where this is documented, except on the resolvedStyle entry itself, which is no good unless I already knew resolvedStyle existed.
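For anyone who lands here, this is the centering I ended up with. It’s a fragment meant to live inside whatever update method positions the element (button and panelPos are my own names from that script):

```csharp
// resolvedStyle only holds valid values after a layout pass has run,
// which is why style.width (and style.width.value) returned nothing useful.
// Offset by half the element's size so panelPos becomes its center,
// not its top-left corner.
button.style.left = panelPos.x - button.resolvedStyle.width * 0.5f;
button.style.top = panelPos.y - button.resolvedStyle.height * 0.5f;
```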
Is there a different spot that explains this that I’m missing? This seems like pretty basic stuff, so the fact that it was so hard to find is troubling. I’m coming from canvases, so maybe there’s a new way to go about this, but putting a button over an object in a scene at runtime can’t be a niche use case.
Yeah, it’s really different from how you worked with canvases.
I guess one problem is that UI Toolkit is trying to solve a couple of problems at once.
(no-code approach for designers, editor UI, runtime UI, separation of concerns for layout, style, and functionality)
If you accept that the primary approach to UI Toolkit layout is Flexbox, then it might be
easier to understand where they are coming from on special use cases like absolute positioning.
Although I totally agree with you that throwing a button on the screen should not be a ‘special’ use case.
If you want to use Flexbox, then stop thinking about your elements in isolation and instead think about what layout
you have on your screen. For a single button this might be as simple as having a container that covers the whole
screen (or just part of it) and then positioning the button inside that container.
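For example, building that kind of layout in C# (all names here are made up; the same structure can of course be declared in UXML/USS instead):

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class CenteredButton : MonoBehaviour
{
    public UIDocument uiDocument;  // assigned in the Inspector

    void OnEnable()
    {
        VisualElement root = uiDocument.rootVisualElement;

        // A container that stretches to cover the whole panel...
        var container = new VisualElement();
        container.style.flexGrow = 1;
        // ...and centers its children via Flexbox.
        container.style.justifyContent = Justify.Center;
        container.style.alignItems = Align.Center;

        var button = new Button(() => Debug.Log("clicked")) { text = "Click me" };
        container.Add(button);
        root.Add(container);
    }
}
```

The button itself never needs explicit coordinates; the container’s Flexbox settings place it.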
For reading the style values you should also be able to look at VisualElement.layout
If you really need arbitrary positioning, where elements should not influence each other, then just use absolute positioning.
I am covering some of those things on my youtube channel in case you need a deeper look: