New Input System Gesture support

@Rene-Damm,

I have a few questions regarding the development of touch gestures for the new Input System.

How will the gestures work exactly?

Is the swipe gesture going to work exactly like an Android swipe, so that I can attach a UI burger menu to it?

Will the gestures replace the manual swipe calculation that's currently needed with the old input system?

Too early to say exactly.

Unfortunately, it's too early for me to offer anything but some vague ideas. The notion is that gestures are basically interactions as found in the action system, but that the recognition of these interactions is separate from the interactions themselves, and that input can be sourced more flexibly than it can be at the moment, such that you can have gestures/interactions fed from platform-specific recognizers as well as from custom-built software recognizers.
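To make "gestures as interactions" a bit more concrete, here is a minimal sketch using the action system's existing IInputInteraction extension point. The SwipeInteraction class and its recognition logic are hypothetical illustrations (it only checks for a quick actuate-and-release within a time window, not direction or distance), but the interface, context calls, and registration method are the real API:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical swipe recognizer written as a custom interaction.
// IInputInteraction and InputInteractionContext are the action
// system's existing extension points; the swipe logic is made up.
public class SwipeInteraction : IInputInteraction
{
    // Maximum time (in seconds) the gesture may take. Shows how
    // interactions expose tunable parameters; the value is arbitrary.
    public float maxDuration = 0.5f;

    public void Process(ref InputInteractionContext context)
    {
        // Process is also invoked when a timeout we set expires.
        if (context.timerHasExpired)
        {
            context.Canceled();
            return;
        }

        if (!context.isStarted && context.ControlIsActuated())
        {
            // Finger down: start the interaction and arm a timeout.
            context.Started();
            context.SetTimeout(maxDuration);
        }
        else if (context.isStarted && !context.ControlIsActuated())
        {
            // Released within the time window: count it as performed.
            // A real recognizer would also check distance/direction.
            context.Performed();
        }
    }

    public void Reset() { }

    // Interactions must be registered before actions referencing
    // them are created; subsystem registration covers that.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.SubsystemRegistration)]
    private static void Register() => InputSystem.RegisterInteraction<SwipeInteraction>();
}
```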

How that will be made contextual is still to be decided. In my mind, there are two separate steps: the low-level part of surfacing gestural data, and the high-level part of contextualizing it, ideally in a similar way to what you can do, for example, with LeanTouch.

But this is all super vague and probably not very useful. There are still a couple of things to work on before gestures (e.g. general action-system improvements like a better polling API, stacking of actions, support for setting parameters dynamically, stuff like that).
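For reference, the polling path as it exists today looks roughly like the sketch below; the PollingExample class and moveAction field are illustrative names, not anything shipped:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class PollingExample : MonoBehaviour
{
    // Action assigned in the Inspector from an .inputactions asset;
    // "moveAction" is an illustrative name, not a shipped sample.
    public InputActionReference moveAction;

    void OnEnable() => moveAction.action.Enable();
    void OnDisable() => moveAction.action.Disable();

    void Update()
    {
        // Polling today: read the action's current value each frame.
        Vector2 move = moveAction.action.ReadValue<Vector2>();
        if (move.sqrMagnitude > 0)
            Debug.Log($"Move: {move}");
    }
}
```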

Yes.

@Rene-Damm Do you have any indication of when the swipe support in the new input system will be available? Just wondering whether to code a manual one myself or hold out for a release that is coming soon (or might have dropped already and I didn't spot it). Thanks.

It's not coming anytime soon, I assume from a quick look at the GitHub project. They're busy building other stuff.
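For anyone who decides to roll their own in the meantime, a manual swipe detector is only a few lines against the new Input System's EnhancedTouch API. The class name and the 100-pixel threshold below are arbitrary example choices:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Aliases avoid clashes with the old UnityEngine.Touch/TouchPhase types.
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class ManualSwipeDetector : MonoBehaviour
{
    // Minimum distance in pixels for a drag to count as a swipe;
    // an arbitrary example threshold, tune for your UI.
    public float minSwipeDistance = 100f;

    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            if (touch.phase != TouchPhase.Ended)
                continue;

            // Compare where the finger lifted to where it went down.
            Vector2 delta = touch.screenPosition - touch.startScreenPosition;
            if (delta.magnitude < minSwipeDistance)
                continue;

            // Classify the swipe by its dominant axis.
            if (Mathf.Abs(delta.x) > Mathf.Abs(delta.y))
                Debug.Log(delta.x > 0 ? "Swipe right" : "Swipe left");
            else
                Debug.Log(delta.y > 0 ? "Swipe up" : "Swipe down");
        }
    }
}
```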