Input doesn't work in mixed reality

I am currently working with mixed reality and am interested in using meshing instead of planes. However, I’ve encountered an issue where the input isn’t functioning correctly. I’m experiencing some unusual behavior with the input counter; it sometimes registers as 1 or 2 even when I’m not performing any pinching gestures. Additionally, the VisionHover feature doesn’t seem to work as expected. However, when I disable colliders for all meshing and place some objects, I’m able to select them, although it’s quite difficult. This is because the pinch data is behaving strangely; the interaction position is usually closer to my hand than to the object I’m trying to interact with.

SpatialPointerState primaryTouchData = EnhancedSpatialPointerSupport.GetPointerState(activeTouches[0]);

Unity 2022.3.20f1
PolySpatial 1.1.4 and 1.1.6
Device: visionOS 1.0.3

Hi, any updates on this?

Let us divide the issue into three sub-issues:

  • Input counter sometimes registers as 1 or 2 even when we are not performing any pinching gestures
    Despite not performing any pinching gestures, the input counter sometimes incorrectly registers 1 or 2. Disabling colliders for all meshing offers a partial workaround, allowing object selection, although with difficulty.
    Inside Update, Touch.activeTouches has a non-zero count when we expect it to be 0:
private void Update()
{
    // Requires Enhanced Touch (EnhancedTouchSupport.Enable()) and
    // using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
    // Count is occasionally 1 or 2 even with no pinch gesture.
    var activeTouches = Touch.activeTouches;
}
  • VisionHover feature doesn't seem to work as expected
    In bounded mode we get the hover effect on objects we are looking at; in mixed reality, the objects do not get the same hover effect.

  • The interaction position is usually closer to our hand than to the object we are trying to interact with

SpatialPointerState primaryTouchData = EnhancedSpatialPointerSupport.GetPointerState(activeTouches[0]);

This issue seems to stem from an irregularity with the pinch data, where the interaction position is inaccurately mapped closer to the hand than the intended object.
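To make the offset measurable, here is a minimal diagnostic sketch of the kind of script involved. It assumes the usual PolySpatial input setup (Enhanced Touch enabled, the `Unity.PolySpatial.InputDevices` namespace for `EnhancedSpatialPointerSupport`); the class name is a placeholder:

using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using UnityEngine.InputSystem.LowLevel;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class PinchDiagnostics : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        var activeTouches = Touch.activeTouches;
        if (activeTouches.Count == 0)
            return;

        SpatialPointerState state =
            EnhancedSpatialPointerSupport.GetPointerState(activeTouches[0]);

        // Compare where the pinch started with where it is now; with
        // meshing enabled, both end up near the hand instead of the mesh.
        Debug.Log($"start: {state.startInteractionPosition}, " +
                  $"current: {state.interactionPosition}");
    }
}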

Hello, any updates on this matter?

I’ve prepared a test project to reproduce the issue. Please investigate it. I built it using the latest available PolySpatial version, 1.2.3, and its samples.

Please focus on the one script I modified. In this scene you’ll find two spheres; ideally, their positions should align with where I’m looking. However, this isn’t working as expected.

I changed only UnboundedInputManager, where I use the following data to set the positions:
primaryTouchData.startInteractionPosition;
primaryTouchData.interactionPosition;
All meshes have mesh colliders in the default layer, and the polyspatial collider object’s layer mask is set to include the default layer.
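For reference, the relevant change amounts to something like the following excerpt (a sketch; `m_StartSphere` and `m_CurrentSphere` are placeholder Transform fields for the two spheres in the test scene):

// Hypothetical excerpt from the modified UnboundedInputManager:
// move the two test spheres to the pinch data positions each frame.
if (activeTouches.Count > 0)
{
    SpatialPointerState primaryTouchData =
        EnhancedSpatialPointerSupport.GetPointerState(activeTouches[0]);

    // Expected: both positions land on the gazed-at mesh collider.
    // Observed: with meshing enabled, they land near the hand instead.
    m_StartSphere.position = primaryTouchData.startInteractionPosition;
    m_CurrentSphere.position = primaryTouchData.interactionPosition;
}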

Project link

Short video
VisionPro_inputs

Please check this. Thank you for your attention to this matter.

Hi everyone,

I’ve updated the project to the new PolySpatial 2.0 Pre-11 and revisited the issue we previously discussed. It seems that the bug is still present. I’ve included two videos to demonstrate the problem.

In the first video, you can see that plane detection and inputs are working correctly, with the yellow sphere appearing by pinch where I’m looking.


The second video demonstrates the same process, but with meshing instead of plane detection. As you can see, the yellow sphere appears near my hand instead of on the mesh.

You can easily reproduce this bug by trying this project link.

My configuration:
Unity 6000.0.16f1
PolySpatial 2.0 Pre-11

The same behavior persists in PolySpatial 1.2.3.

Thank you in advance.

I see what you mean, but it’s not entirely clear to me what we can do differently to prevent this, or whether it’s all on Apple’s end. It’s interesting to add the Vision OS Hover Effect to the Assets/ExampleAssets/Prefabs/AR Mesh.prefab (with a highly visible color/intensity), which will show you what Apple thinks you’re looking at. It only occasionally highlights the meshes, which suggests to me that maybe there’s something in the way of the raycast it’s performing. On the other hand, if you physically touch the AR meshes, you’ll see the highlight and get the appropriate touch position. It might also have something to do with the way that Apple handles non-convex collision meshes (for which we use generateStaticMesh).
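For anyone trying this diagnostic, attaching the hover effect to the generated meshes at runtime would look roughly like this (a sketch; it assumes the `VisionOSHoverEffect` component from the com.unity.polyspatial package and some callback in your code where AR mesh objects are created or updated):

using Unity.PolySpatial;
using UnityEngine;

// Sketch: give a generated AR mesh object a hover effect so you can see
// which mesh visionOS thinks you are looking at. Pair this with a highly
// visible hover color/intensity on the mesh material to make it obvious.
public static class HoverDebugUtil
{
    public static void AddHoverEffect(GameObject meshObject)
    {
        if (meshObject.GetComponent<VisionOSHoverEffect>() == null)
            meshObject.AddComponent<VisionOSHoverEffect>();
    }
}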

Anyway, as with the other thread, please submit these projects through the bug report mechanism and let us know the incident number (IN-#####), rather than posting download links here. That way, we can track the issue in our system.

Hi @kapolka,

Thank you for your response. I’ve submitted the bug report under “CASE IN-83398.” Please take a look when you have a chance. I understand that the issue might be on Apple’s side, but I appreciate your assistance regardless.

Thank you again.
