I am struggling to understand how to create an XR Poke Button from scratch, and what the best practices are.
I’ve tried copying the existing poke button from the HandsDemoScene included in the MR template and replacing the visuals with my own mesh, but it seems that as soon as I change anything, it breaks and stops working as expected: poking in the wrong direction, the finger getting sucked into the button (how?), the button pressing in too deeply, etc.
The most frustrating part is that I have managed to create a button that behaves the way I want… but then, if I rotate it so it’s oriented in a different direction, it no longer works as expected.
So I want to take a step back and go through the basics: what are the steps for creating a poke button from scratch? Copying from an existing example doesn’t help if I don’t understand what I’m doing or how it’s supposed to be set up.
Questions:
- What is the minimum set of required components?
- Does the hierarchy need to be set up in a specific way? Does the collider need to be on its own game object?
- I assume the mesh needs to be on its own game object (right?). Does it need to be a grandchild of the root, so that the intermediate child stays at scale (1,1,1) and the mesh itself can be scaled separately? (See the setup sketch after this list for what I’m currently trying.)
- How do you control how far down the button presses?
- Poke Direction in the poke filter controls which way the button has to be poked, right? Is that direction local to the mesh, local to the prefab root, or in world space?
- Do the interactable, mesh (or mesh parent), and collider all need to be in the same space? And/or same scale?
- Why am I able to poke from any direction even though “Enable poke angle threshold” is enabled?
- Does the collider need to be at (0,0,0) local position, and offset using the collider center? How do I rotate it then?
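To make the hierarchy questions concrete, here is my current understanding of the minimum setup, written out as a script so the structure is explicit. The component and namespace names are from XRI 2.x as far as I know; everything else (object names, sizes, the collider offset) is just my guess, and this guess is exactly what I’m asking about:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;            // XRSimpleInteractable (XRI 2.x namespace)
using UnityEngine.XR.Interaction.Toolkit.Filtering;  // XRPokeFilter

// My current guess at the minimum poke button setup, expressed as code so the
// hierarchy is explicit. This is the part I'm asking about, not a known-good recipe.
public static class PokeButtonScaffold
{
    public static GameObject Build()
    {
        // Root stays at scale (1,1,1).
        var root = new GameObject("PokeButton");

        // Collider on its own child at local (0,0,0); the offset lives in the
        // collider's Center so the GameObject itself never has to move.
        var colliderGO = new GameObject("Collider");
        colliderGO.transform.SetParent(root.transform, false);
        var box = colliderGO.AddComponent<BoxCollider>();
        box.center = new Vector3(0f, -0.005f, 0f);       // placeholder values
        box.size   = new Vector3(0.03f, 0.01f, 0.03f);   // placeholder values

        // Visuals: an intermediate child kept at (1,1,1), with the actual
        // triangle mesh parented under it so only the mesh gets a custom scale.
        var visuals = new GameObject("Visuals");
        visuals.transform.SetParent(root.transform, false);

        // Interactable + poke filter on the root, added after the collider so
        // (as far as I understand) the interactable picks up the child collider
        // automatically when it wakes up.
        root.AddComponent<XRSimpleInteractable>();
        root.AddComponent<XRPokeFilter>();   // poke direction/depth set in the Inspector

        return root;
    }
}
```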
I probably have other questions, but at this point I’m super confused. I’ve filed a bug report with a sample project, but I expect whoever in QA reads it will be just as confused as I am.
Bottom line: I want a button that uses a triangle mesh and, when poked, presses in by 0.01 units, stops there, and registers a select event.
I’d also like to be able to put it in a prefab, where the prefab itself could potentially be rescaled slightly smaller (for a wrist UI) or slightly larger (for a console panel). Is that an unreasonable expectation?
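In case it helps explain the intent, this is roughly the behaviour I’d write by hand if I couldn’t get it out of the poke filter / follow affordance directly. The script name and fields below are mine, not from XRI, and the event wiring assumes XRI 2.x namespaces:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;   // XRSimpleInteractable, SelectEnterEventArgs (XRI 2.x)

// Hypothetical helper (not from XRI): lets the visual travel along the button's
// local -Y, but never deeper than a fixed depth, and logs the select event.
public class PressDepthClamp : MonoBehaviour
{
    [SerializeField] XRSimpleInteractable m_Interactable;  // the poke target on the root
    [SerializeField] Transform m_Visual;                   // the mesh (or mesh parent) that moves
    [SerializeField] float m_MaxPressDepth = 0.01f;        // in the button's local units

    Vector3 m_RestLocalPos;

    void Awake()
    {
        m_RestLocalPos = m_Visual.localPosition;
        m_Interactable.selectEntered.AddListener(OnSelectEntered);
    }

    void OnDestroy()
    {
        m_Interactable.selectEntered.RemoveListener(OnSelectEntered);
    }

    void LateUpdate()
    {
        // Whatever moved the visual this frame (poke follow, animation, ...),
        // clamp its travel so it never sinks past m_MaxPressDepth along local -Y.
        Vector3 offset = m_Visual.localPosition - m_RestLocalPos;
        float depth = Mathf.Clamp(-offset.y, 0f, m_MaxPressDepth);
        m_Visual.localPosition = m_RestLocalPos + Vector3.down * depth;
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        Debug.Log($"{name} pressed", this);
    }
}
```

Because the clamp is expressed in the button’s local units, rescaling the whole prefab would scale the press depth along with it, which is what I’d want for the wrist and console variants.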
Here’s a screenshot of the setup:
Video of how it actually behaves in practice:
Here’s the most maddening part: the down button behaves exactly the way I want it to. I don’t even know how I managed to set it up so that it only presses in that far and no further.
But if the button is cloned and rotated for the up button, it doesn’t behave the same.
Yet the button in this project is set up exactly the same way as the one in the other project. Why is the behaviour so different?
EDIT:
Just to clarify: yes, I’m aware the Y axis in the videos above points in a different direction from the buttons. Here’s what the button’s own Y axis looks like. That means the poke direction should be negative Y, right?
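For what it’s worth, a throwaway gizmo script like this (mine, not from XRI) is how I’d double-check which way the button’s local -Y points in world space:

```csharp
using UnityEngine;

// Throwaway debug helper: draws the button's local -Y axis in the Scene view
// so I can compare it against the direction I expect the poke to come from.
public class PokeAxisGizmo : MonoBehaviour
{
    [SerializeField] float m_Length = 0.05f;

    void OnDrawGizmos()
    {
        // transform.up is local +Y expressed in world space, so -transform.up
        // is the local -Y the poke should travel along (if my assumption holds).
        Gizmos.color = Color.cyan;
        Gizmos.DrawRay(transform.position, -transform.up * m_Length);
    }
}
```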