Are there any samples or tutorials about using the Oculus Touch controllers with world-space UIs in Unity?
Not the gaze-based UI, but using the Touch controllers to physically touch the UI and press a button or move a slider.
Put a box collider set as a trigger over the UI element you want to touch, and add another box collider to the touch controller's "fingertip". Tag the fingertip object "Finger", and whenever you want something to happen on touch, put it in an OnTriggerEnter script that reacts to objects with the tag "Finger". That's the easiest way I can come up with.
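A minimal sketch of that trigger script might look like this (the class name and the UnityEvent field are illustrative; it assumes the fingertip object has a Collider plus a kinematic Rigidbody so trigger events actually fire, and is tagged "Finger"):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Attach to the world-space UI element's trigger collider.
[RequireComponent(typeof(BoxCollider))]
public class FingerTouchButton : MonoBehaviour
{
    // Hook up the button's action in the Inspector.
    public UnityEvent onTouched;

    void OnTriggerEnter(Collider other)
    {
        // React only to the controller's tagged fingertip collider.
        if (other.CompareTag("Finger"))
            onTouched.Invoke();
    }
}
```

You may also want a short cooldown or an OnTriggerExit check so a single touch doesn't fire the event repeatedly as the fingertip jitters at the collider boundary.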
Here is one tutorial on this:
https://developer.oculus.com/blog/unitys-ui-system-in-vr/ [Edit: Sorry, as you mention, this is gaze-based!]
Another approach is to actually create fully 3D buttons. This can look nicer and be more usable, depending on your needs. For our project, we did that from scratch by building a dialog class that can easily create haptic, touchable, and depressable 3D buttons on an in-world dialog coordinate system.
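Not the poster's actual implementation, but one way a depressable 3D button like that could be sketched: slide the button cap along its local axis by the fingertip's intrusion depth, and fire once it passes a threshold. All names and the depth heuristic here are made up for illustration:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch only: a button cap that sinks along local Y while a
// tagged fingertip is inside the trigger, firing once at a press depth.
public class Depressable3DButton : MonoBehaviour
{
    public Transform cap;               // the moving button face
    public float travel = 0.01f;        // how far (metres) the cap can sink
    public float pressFraction = 0.8f;  // fraction of travel that counts as a press
    public UnityEvent onPressed;

    Vector3 restPos;
    bool pressed;

    void Start() { restPos = cap.localPosition; }

    void OnTriggerStay(Collider other)
    {
        if (!other.CompareTag("Finger")) return;
        // Rough intrusion depth: fingertip position in the button's local space.
        Vector3 local = transform.InverseTransformPoint(other.transform.position);
        float depth = Mathf.Clamp(restPos.y - local.y, 0f, travel);
        cap.localPosition = restPos - Vector3.up * depth;
        if (!pressed && depth >= travel * pressFraction)
        {
            pressed = true;
            onPressed.Invoke();   // a good spot to trigger haptics too
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("Finger")) return;
        cap.localPosition = restPos;
        pressed = false;
    }
}
```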
Thanks for your answer @JPhilipp
The link refers to an article that is gaze-based. I'm really looking for touching.
BTW, I like your approach with the 3D buttons!