Hello everyone,
Background: I am using the XR Interaction Toolkit to implement a small project. I implemented grabbing and teleportation via the XR Interaction Toolkit, which works very well. The XR Interaction Toolkit uses the new Input System, with input actions mapped to buttons on the XR controllers.
Problem: I have another system that I want to include: a hand-gesture-based system where gestures like grab, pinch, etc. can be used. Since Unity's OpenXR implementation (XR Interaction Toolkit) is quite nice, I want to use the same system. The good thing is that I "just" need to expose my gestures as bindings for the input actions. However, the Input System only offers bindings for gamepad, keyboard, mouse, etc. Isn't there a way to have my own bindings show up in the Input System's binding picker? (see attached image)
Possible Solution (which I couldn't get to work): After some research I found out that I could probably implement my own InputDevice, whose controls would then show up as bindings in the input action editor. However, the code provided by the documentation does not work for me, and I did not find a working way to implement my own InputDevice.
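For reference, here is a sketch of what I tried, following the Input System's "custom devices" documentation. The class name HandGestureDevice and the gesture controls (grab, pinch) are just my own placeholders; the general pattern (a state struct implementing IInputStateTypeInfo plus an InputDevice subclass registered via RegisterLayout) is what the docs describe:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.InputSystem.Utilities;

// Memory layout of the device's state: one bit per gesture.
public struct HandGestureDeviceState : IInputStateTypeInfo
{
    // Arbitrary four-character code identifying this state format.
    public FourCC format => new FourCC('H', 'G', 'S', 'T');

    [InputControl(name = "grab", layout = "Button", bit = 0)]
    [InputControl(name = "pinch", layout = "Button", bit = 1)]
    public int buttons;
}

#if UNITY_EDITOR
[UnityEditor.InitializeOnLoad]
#endif
[InputControlLayout(stateType = typeof(HandGestureDeviceState))]
public class HandGestureDevice : InputDevice
{
    public ButtonControl grab { get; private set; }
    public ButtonControl pinch { get; private set; }

    // Registers the layout so it appears in the binding picker
    // (under "Other" by default).
    static HandGestureDevice()
    {
        InputSystem.RegisterLayout<HandGestureDevice>();
    }

    // Ensures the static constructor also runs in player builds.
    [RuntimeInitializeOnLoadMethod]
    static void InitializeInPlayer() { }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        grab = GetChildControl<ButtonControl>("grab");
        pinch = GetChildControl<ButtonControl>("pinch");
    }
}
```

Once the layout is registered, bindings like `<HandGestureDevice>/grab` should be selectable in the input action editor.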
Long story short: I have a "select" input action that is mapped to the "grip button" on the controller. What I want is for the "select" input action to also be triggered by something (my gesture recognizer) that is not available among the built-in input action bindings.
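To illustrate what I mean by "triggered by something else": assuming a custom device class (hypothetically named HandGestureDevice, with a matching HandGestureDeviceState state struct) had been registered, my gesture recognizer could drive it by queuing state events, and the "select" action would simply get an extra binding to `<HandGestureDevice>/grab`:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class GestureDriver : MonoBehaviour
{
    HandGestureDevice device;

    void OnEnable()
    {
        // Create an instance of the custom device at runtime.
        device = InputSystem.AddDevice<HandGestureDevice>();
    }

    void OnDisable()
    {
        if (device != null)
            InputSystem.RemoveDevice(device);
    }

    // Called by the gesture recognizer whenever the grab gesture
    // starts or ends; bit 0 corresponds to the "grab" control.
    public void SetGrab(bool pressed)
    {
        InputSystem.QueueStateEvent(device,
            new HandGestureDeviceState { buttons = pressed ? 1 : 0 });
    }
}
```

With that in place, the existing "select" action would fire from either the grip button or the gesture, since both are just bindings on the same action.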
Does someone have a solution to this? Thank you very much.