Using Touch Input in an Apple Vision Build without the Unity PolySpatial Plugin

Hello everyone,

I’m currently exploring the development possibilities for the Apple Vision platform in Unity, specifically for windowed apps. I’m curious whether it’s possible to use touch input in the game without relying on the Unity PolySpatial plugin. If it is possible, what hand movements or gestures can the system detect? Specifically, can it recognize just a finger tap, or can it pick up more detailed gestures?

Any insights or experiences shared would be greatly appreciated.

Thank you in advance!

Hey Paulray, in general, the expectation for windowed apps is that you’d typically be relying on point-and-click interactions. The simplest way to think about this mode is like moving an iOS app into a “Desktop” environment.

Specifically, you can use the touch support provided by Unity’s Input System package (com.unity.inputsystem).
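
For example, here’s a minimal sketch of reading taps through the Input System’s EnhancedTouch API. The `TapLogger` component name is just for illustration, and the assumption here is that a look-and-pinch in a windowed app surfaces as an ordinary touch; exact behavior may vary with your Unity and Input System versions:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

// Hypothetical example component: logs touches via the Input System's EnhancedTouch API.
public class TapLogger : MonoBehaviour
{
    void OnEnable()
    {
        // EnhancedTouch is not active by default; enable it before reading Touch.activeTouches.
        EnhancedTouchSupport.Enable();
    }

    void OnDisable()
    {
        EnhancedTouchSupport.Disable();
    }

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            if (touch.phase == TouchPhase.Began)
            {
                // screenPosition is the touch location in window coordinates.
                Debug.Log($"Touch began at {touch.screenPosition}");
            }
        }
    }
}
```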

Hope this helps to clarify. If you feel that more robust hand support is required for windowed apps, please share your feedback here.