Hi there, I have found some behaviour of UI Toolkit that seems weird to me. In my setup, I have a UI Toolkit Button and I can add a click callback in two ways:
The strange thing is that on a resolution change (which triggers the UI layout to be rebuilt), only option B fires again, without me clicking the button. This only happens right after I have clicked the button (so it seems focus-related, but it still happens when I set button.focusable to false).
So to be clear:

1. Register a callback on a button using option B
2. Click on the button; I get the debug log
3. Don’t click on anything else
4. Change resolution (by rotating my phone)
5. Observe that the event is fired again, giving me another debug log, without clicking on the button
This is unexpected behaviour to me, but maybe I don’t understand how these click events are handled.
Can anyone explain this behaviour to me?
EDIT: it doesn’t happen in editor, ONLY in the Android build that I’m testing.
If somebody tries to tell you that a “ClickEvent” should fire due to a resolution change, then that somebody has to be sent back to the factory for repairs. That’s 100% a bug.
The difference between the two is that clicked does a bunch of extra stuff. Or rather, it forwards to button.clickable, which does the extra stuff: it can be set to accept only the left or right mouse button, it can require modifier keys, it sets pseudo states on the button while the mouse is down on it, and so on.
RegisterCallback just sends you an event and expects you to do all the other stuff yourself.
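To make that difference concrete, here is a minimal sketch of the two wiring styles side by side. The button name `"my-button"` and the `UIDocument` field are placeholders for illustration; the `activators` configuration goes through `Clickable`'s base class `PointerManipulator`.

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class ButtonWiringExample : MonoBehaviour
{
    public UIDocument document; // assumed to be assigned in the Inspector

    void Start()
    {
        var button = document.rootVisualElement.Q<Button>("my-button");

        // Option A: routed through button.clickable, which filters by mouse
        // button / modifier keys and manages pseudo states for you.
        button.clicked += () => Debug.Log("clicked (via Clickable)");

        // The Clickable can be configured, e.g. to also accept right-click:
        button.clickable.activators.Add(
            new ManipulatorActivationFilter { button = MouseButton.RightMouse });

        // Option B: raw event registration; no filtering, no pseudo states,
        // you handle everything yourself.
        button.RegisterCallback<ClickEvent>(evt => Debug.Log("raw ClickEvent"));
    }
}
```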
@CodeSmile, thanks for your quick reaction! I do not unregister events, so I don’t think that what you’ve mentioned should have anything to do with it.
With this piece of code, the lambda function gets executed when I click the button (expected) and directly after a resolution change because the ClickEvent is fired again (unexpected):
protected void Start()
{
    m_devContainer = developmentUI.rootVisualElement;
    var clearButton = m_devContainer.Q<Button>("cleardevlog");
    clearButton.RegisterCallback<ClickEvent>(evt =>
    {
        // clear the dev log
        Debug.Log("dev log cleared");
    });
}
With this piece of code, I only get the log when actually clicking the button (expected)
protected void Start()
{
    m_devContainer = developmentUI.rootVisualElement;
    var clearButton = m_devContainer.Q<Button>("cleardevlog");
    clearButton.clicked += () =>
    {
        // clear the dev log
        Debug.Log("dev log cleared");
    };
}
It feels like a bug to me, but after researching and diving into the code a bit, I ended up more confused than convinced that it actually was one.
I’m experimenting with using ChatGPT to answer my questions. It usually points me in the right direction, but sometimes it just adds to the confusion. This is one of the reasons ChatGPT suggested for this issue:
UI Toolkit Buttons Activate on Focus & Enter
A Button in UI Toolkit is designed to trigger when:
It receives focus and the user presses Enter or Space.
It was already focused before the resolution changed, and the layout rebuild re-triggers the event.
Especially that second one seems questionable to me.
ChatGPT will hallucinate a good reason for a bug being by design. It will pull “answers” from pages explaining why something works the way it does, so it will give you a reason for a behaviour regardless of whether that’s true or not. It’s creating sentences by figuring out, statistically, which word is most likely to show up next. It doesn’t know anything.
As I said, there exists no sane world where a button should send you a click event as a side-effect of rotating the phone. That’s bogus behaviour that’s clearly not the intention of anyone rational. Report it as a bug!
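Until a fix lands, a defensive guard might help. This is an untested sketch built on the assumption that the spurious ClickEvent on rotation arrives without a real pointer release in the same frame; the button name `"cleardevlog"` and the `UIDocument` field are taken from the posts above, the frame-tracking idea is mine, and `Clickable` may swallow the PointerUpEvent in some Unity versions, hence the TrickleDown registration:

```csharp
using UnityEngine;
using UnityEngine.UIElements;

public class GuardedClick : MonoBehaviour
{
    public UIDocument document; // assumed to be assigned in the Inspector
    int m_lastPointerUpFrame = -1;

    void Start()
    {
        var button = document.rootVisualElement.Q<Button>("cleardevlog");

        // Remember the frame of the last real pointer release on the button.
        button.RegisterCallback<PointerUpEvent>(
            _ => m_lastPointerUpFrame = Time.frameCount,
            TrickleDown.TrickleDown);

        button.RegisterCallback<ClickEvent>(evt =>
        {
            // No pointer release this frame: likely a replayed/spurious click.
            if (Time.frameCount != m_lastPointerUpFrame)
                return;
            Debug.Log("dev log cleared");
        });
    }
}
```

Alternatively, just wiring the handler through `clicked` (option A), which the thread confirms is not affected, sidesteps the problem entirely.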