Getting gestures to work in Unity editor

I’ve created an empty GameObject in my scene, added a Canvas and a Panel to it, and then attached a GazeGestureManager, similar to the origami demo. Does gesture input work from the emulator in the Unity Editor? I really want to see this working for demo purposes.

The more I read, the more discouraged I’m getting about this. I do have speech recognition working in the Unity Editor’s play mode, but not keyboard or gesture input (e.g. the air tap). It seems an Xbox controller might work, but that doesn’t fit my needs. At a minimum, I need to handle keyboard input in the Unity Editor.

You can hotwire it: poll Input.GetButtonDown (or a key press) in the editor and raise the tap event yourself so your delegate gets called. It’s a little hacky, but it could work.
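Something along these lines, as a minimal sketch. The class name, key binding, and UnityEvent hookup are all assumptions here, not part of the origami sample; wire the event in the Inspector to whatever method your GazeGestureManager’s tap delegate normally invokes.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Editor-only stand-in for the HoloLens air tap.
// Attach to any scene object and hook onSimulatedTap (in the Inspector)
// to the same handler your gesture manager's tap delegate normally calls.
public class EditorTapSimulator : MonoBehaviour
{
    // Raised whenever the simulated tap fires; hypothetical field name.
    public UnityEvent onSimulatedTap;

    void Update()
    {
#if UNITY_EDITOR
        // Space bar (or the default "Fire1" button) stands in for the air tap
        // while running in the Unity Editor's play mode.
        if (Input.GetKeyDown(KeyCode.Space) || Input.GetButtonDown("Fire1"))
        {
            if (onSimulatedTap != null)
            {
                onSimulatedTap.Invoke();
            }
        }
#endif
    }
}
```

The `#if UNITY_EDITOR` guard keeps the keyboard shortcut out of the device build, so the real gesture path is untouched when you deploy to the HoloLens.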