I’ve been trying to find out the current status of the XR Interaction Toolkit.
In the most up-to-date package, there are lots of deprecated APIs (the entire affordance system), with no clear indication of what the replacement will look like.
It looks like the XRIT GitHub repository was last touched three months ago and has bugs like push buttons not working (or maybe I’m overlooking something dumb). Non-functional push buttons break important features such as the settings panel right at the start.
I’m hoping to avoid reinventing some wheels (specifically around working with gaze and hands), so some insight into XRIT’s future would be greatly appreciated.
Hey @BlackPete, we are currently getting ready to release the 3.0.6 patch for XRI in line with the Unity 6 release. These fixes will be rolled into the GitHub project (XRI Examples) as well as all of the XR templates accessible from Unity Hub.
The push buttons in the XRI Examples project should work properly, but they only respond to physical pushing; clicking from a distance was not originally designed to work. We are working on changing this for accessibility reasons as well as for better UX overall.
In terms of the Affordance System APIs, we are still working on the replacement “Feedback System”, which we are planning to release in preview form in XRI 3.1 within the next couple of months. It’s a significant amount of work, so it has taken longer than anticipated.
Thanks for responding to my post!
Good to know that push buttons not working from a distance is by design and not something I overlooked. Yes, the ability to interact from a distance would be greatly appreciated, as it’s a bit awkward trying to lean over my desk and avoid hitting my monitors while trying to press a button.
I’m actually still a little confused about why an Affordance (or Feedback, I guess) system even exists. Aren’t these just events? As long as all the events are exposed and invoked, we should be able to add callbacks to play sounds, trigger haptics, and so on.
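For reference, here is a minimal sketch of what I mean, wiring feedback directly to the interactable’s existing UnityEvents instead of going through the Affordance System. It assumes XRI’s hoverEntered/selectEntered events on XRBaseInteractable (the exact namespace differs between XRI 2.x and 3.x); the class and field names are just illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Illustrative component: plays audio feedback by subscribing to the
// interactable's hoverEntered/selectEntered events, no Affordance System needed.
[RequireComponent(typeof(XRBaseInteractable))]
public class SimpleInteractionFeedback : MonoBehaviour
{
    [SerializeField] AudioSource audioSource;   // plays the feedback sounds
    [SerializeField] AudioClip hoverClip;       // sound on hover enter
    [SerializeField] AudioClip selectClip;      // sound on select enter

    XRBaseInteractable interactable;

    void OnEnable()
    {
        interactable = GetComponent<XRBaseInteractable>();
        interactable.hoverEntered.AddListener(OnHoverEntered);
        interactable.selectEntered.AddListener(OnSelectEntered);
    }

    void OnDisable()
    {
        interactable.hoverEntered.RemoveListener(OnHoverEntered);
        interactable.selectEntered.RemoveListener(OnSelectEntered);
    }

    void OnHoverEntered(HoverEnterEventArgs args)
    {
        if (audioSource != null && hoverClip != null)
            audioSource.PlayOneShot(hoverClip);
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        if (audioSource != null && selectClip != null)
            audioSource.PlayOneShot(selectClip);

        // Haptics could be triggered here as well by resolving the interactor's
        // controller/device from args.interactorObject; the exact API for that
        // varies by XRI version, so it is left out of this sketch.
    }
}
```

Something along these lines covers most of what I was using the affordance receivers for, which is why I’m curious what the upcoming Feedback System will add on top of plain event callbacks.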