I’ve been having a number of issues getting XR interaction to “feel good” and respond reliably to poke events when using hand tracking.
To help illustrate, I’ve recorded a video showing some of them: XRIIssues
It appears a lot of these issues boil down to accidental or unwanted pinch events from multiple interactors, specifically the Near-Far Interactor alongside the Poke Interactor.
#1) Observe that I’m moving the checkerboard objects around without actually touching them. Sometimes I also try to poke them and nothing happens. Am I moving too fast?
#2) Observe that the poke button simply isn’t being pressed. Again, I think the Near-Far Interactor might be interfering: it seems to register a pinch event because of the way I hold a pointing-finger pose for poking. If I spread my fingers apart so that the highlight changes from blue to yellow-ish, then poking works more easily.
#3) I can press the UI buttons fairly easily. That’s good. However, I can slide my finger off a UI button and it stays pressed. Not so good.
#4) This is likely the same as #3, but taken to an absurd level by sliding from one button to the next. It’s actually pretty neat, and I could see it working as a sort of gameplay puzzle… but I really doubt this is the intended design.
#4a) For some reason, different items in the scroll view were being randomly highlighted while I was interacting with the keypad button. Perhaps there’s some other interactor involved (gaze?), but I can’t be sure.
@BlackPete Regarding #1 and #2, are you seeing this behavior with the default scene and interactor settings or have you modified the settings? Also is this video and behavior recorded on device or in the editor?
This is a fresh project with the MR template. I haven’t changed any settings.
I was struggling with poke buttons in my actual project, so that’s why I decided to see if I had the same issues in a clean project.
The video was recorded on device, but the app was played via the editor.
For #1 and #2, I’m 75% sure this boils down to the pinch gesture being way too sensitive. That causes the Near-Far Interactor to become active, which overrides the Poke Interactor.
I’ve also had the pinch gesture trigger by a hand resting on my desk when I’m trying to use the app with the other hand.
When I’m poking, I generally use a pointing finger gesture, which could be mistaken for pinch, maybe? (Notice the hand is blue in the video, which I think indicates a pinch state?)
So issue #1 isn’t a huge concern, but #2 is the real show-stopper for me. A poke button should be pokeable. Full stop. Pinch shouldn’t be involved.
Alright, good to know. And what version of the editor and what device are you using? I’ll see if I can reproduce it.
This is using Unity 6000.0.0b1 and a Quest 3.
Other info that might be relevant:
XR Hands 1.5.0
Unity OpenXR Meta 2.0.1
Using OpenXR with the Meta Quest feature group
Given that this is while playing in editor, my understanding is it uses the Standalone feature set… in which case the only OpenXR features enabled are “Hand Tracking Subsystem” and “Meta Hand Tracking Aim”
Again, these are all default settings so if you created a new project with the MR template, it SHOULD match.
Hey @BlackPete. I tried to reproduce these issues in a few different editor versions with unmodified samples, and could not reproduce the behavior you showed out of the box. I was able to recreate roughly similar behavior for #1 by greatly reducing the Press Threshold on the ReleaseThresholdButtonReader to near 0, which makes pinch select much more sensitive. This setting can be found on the Select Input GameObject nested under the Near-Far Interactor for each hand. The image below shows the default Press Threshold value (0.8), so please check whether you’re still seeing this behavior with Press Threshold set to the default.
I can confirm the Near-Far Interactor Select Input values are at the defaults of 0.8 and 0.25 for the press and release thresholds, respectively.
Additionally, it seems that once I get into a pinch state, the hand turns blue. As long as the hand stays blue, things behave oddly (I can’t poke the poke button, checkerboard objects stick to the hand, etc.).
I think part of the problem is that any finger touching the thumb counts as a pinch, and if my hand is pointing away from my face, that can register as a false pinch event. Even now, this hand is still stuck in the pinch (or select) state for me:
@BlackPete Ah, yeah I totally see what you mean. I was able to reproduce that and dug a little bit into it to find a potential solution for you.
If you look at the XRI Default Input Actions asset (found in the XRI Starter Assets sample) and navigate to either XRI Right Interaction or XRI Left Interaction, you will see the Select and Select Value actions. Under Select, you will see a graspFirm [RightHand Hand Interaction (OpenXR)] binding, and a similar graspValue binding under Select Value. This grasp binding triggers select when you pinch with any finger. One option for your use case would be to remove grasp; then you should only get a pinch on the index finger. I’ve highlighted this in the image below.
If grasp is necessary, I would recommend creating a custom action for it so that grasp no longer feeds directly into the pinch select.
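For anyone who wants to keep the shared asset untouched, the same effect can be approximated at runtime with Input System binding overrides. This is only a sketch: the component name and the simple `"grasp"` path check are my own illustration, and it assumes the default map and action names from the XRI Starter Assets.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical helper (not an official XRI component): blanks out the
// graspFirm / graspValue bindings on Select and Select Value at runtime,
// so only the index-finger pinch drives select. Assign the XRI Default
// Input Actions asset in the Inspector.
public class DisableGraspSelect : MonoBehaviour
{
    [SerializeField] private InputActionAsset xriInputActions;

    void OnEnable()
    {
        foreach (var mapName in new[] { "XRI Right Interaction", "XRI Left Interaction" })
        {
            var map = xriInputActions.FindActionMap(mapName);
            if (map == null) continue;

            DisableGraspBindings(map.FindAction("Select"));
            DisableGraspBindings(map.FindAction("Select Value"));
        }
    }

    static void DisableGraspBindings(InputAction action)
    {
        if (action == null) return;
        for (int i = 0; i < action.bindings.Count; i++)
        {
            // graspFirm / graspValue come from the Hand Interaction (OpenXR) profile.
            if (action.bindings[i].path.Contains("grasp"))
                action.ApplyBindingOverride(i, string.Empty);
        }
    }
}
```

Overrides applied this way last only for the session and don’t modify the asset on disk, so editing the asset directly (as described above) is still the simpler permanent fix.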
Hi,
Thanks for looking into this, and yes, I can confirm that removing grasp helps a lot with issues #1 and #2!
I suppose grasp would still be needed if we had, say, a wheel that had to be grabbed and turned to open a bulkhead door, but that’s its own problem to solve.
Anyway, just knowing what’s causing the issue in the first place helped to explain so much.
#3 and #4 still exist, but that’s a whole different topic.