Hey there!
Well, I need some native Android UI elements for my AR app. It's mostly AR, so I won't integrate Unity as a library into a native app. Therefore I need some way to replicate basic native UI elements from Android or iOS.
One of them is the "Bottom Sheet": an expandable UI element that can be dragged from the bottom toward the top of the screen. Basically a panel that slides up from the bottom.
(And here a Video)
The part I struggle with is the touch input, i.e. the events. I hook into several pointer events to change the height of the sheet according to the touch input. The only problem is that once the pointer leaves the element, the drag stops or gets buggy. It looks like the events are bound to the element itself, which is kind of annoying.
var sheet = rootElement.Q<VisualElement>("BottomSheet");

sheet.RegisterCallback<PointerDownEvent>(evt =>
{
    Debug.Log("Down");
    _drag = true;
});

sheet.RegisterCallback<PointerMoveEvent>(evt =>
{
    if (!_drag) return;
    Debug.Log("Move");

    var style = sheet.style;
    // style.height starts out unset (0), so seed it from the resolved layout once.
    if (style.height.value.value == 0)
    {
        style.height = new Length(sheet.layout.height, LengthUnit.Pixel);
    }
    // Dragging up (negative deltaPosition.y) should grow the sheet.
    style.height = new Length(style.height.value.value - evt.deltaPosition.y, LengthUnit.Pixel);
});

// Only fires while the pointer is over the bottom sheet?! This ruins everything...
// registering the callback on the root doesn't help either.
sheet.RegisterCallback<PointerUpEvent>(evt =>
{
    Debug.Log("Up");
    _drag = false;
});
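From skimming the UIElements API, pointer capture looks like it might be relevant here. Is something like this the intended approach? A rough, untested sketch of what I mean, assuming the `CapturePointer`/`ReleasePointer`/`HasPointerCapture` extension methods do what their names suggest:

```csharp
// Sketch: while dragging, route all pointer events to the sheet,
// even after the finger leaves its bounds.
sheet.RegisterCallback<PointerDownEvent>(evt =>
{
    _drag = true;
    // Capture the pointer so subsequent Move/Up events keep targeting the sheet.
    sheet.CapturePointer(evt.pointerId);
});

sheet.RegisterCallback<PointerUpEvent>(evt =>
{
    _drag = false;
    // Release the capture so other elements receive pointer events again.
    if (sheet.HasPointerCapture(evt.pointerId))
        sheet.ReleasePointer(evt.pointerId);
});
```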
How would you implement something like that in the UI Toolkit? Is that even possible?