Sample projects not working in Unity Editor after updating to 1.0.3

I am running the sample scenes in the Unity Editor (2022.3.17f1) to interact with Balloon Gallery, SpatialUI, etc., but clicks are not working. Is there a settings change required for the newer version?

Make sure you’re on the visionOS platform. This happened to me as well after the upgrade: the project switched to the desktop platform and input stopped working.
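If you’d rather double-check or switch from a script than via File > Build Settings, here’s a minimal editor sketch. It assumes a Unity install with the visionOS build support module, so `BuildTarget.VisionOS` exists; the menu item name is just an example.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class VisionOSBuildTargetCheck
{
    // Example menu item: logs the active build target and switches to visionOS if needed.
    // Assumes the visionOS build support module is installed (BuildTarget.VisionOS available).
    [MenuItem("Tools/Switch To visionOS Build Target")]
    static void SwitchToVisionOS()
    {
        Debug.Log($"Active build target: {EditorUserBuildSettings.activeBuildTarget}");

        if (EditorUserBuildSettings.activeBuildTarget != BuildTarget.VisionOS)
            EditorUserBuildSettings.SwitchActiveBuildTarget(BuildTargetGroup.VisionOS, BuildTarget.VisionOS);
    }
}
#endif
```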


Hi, I’m having the same issue. I switched to the visionOS platform, but still nothing. @luispedrofonseca do they work for you?

Input started working correctly for me after switching to the visionOS platform.
I’ve had issues before that were only solved by deleting the Library folder and reopening the project. Maybe you can try that too?

Just tried that and no dice. I also tried to debug the issue a bit and posted about that here: 📌 Official Support for visionOS Now Available in Early Access - #32 by Rstenson

It seems the PolySpatial input system object isn’t getting created for some reason.
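For anyone else debugging this, a quick way to check whether spatial pointer input is reaching the app at all is to log the Input System’s enhanced touches, which is the same path the PolySpatial samples read their pinch/poke input from (they then query the PolySpatial-specific pointer state from those touches via Unity.PolySpatial.InputDevices). A minimal sketch, assuming only that the Input System package is installed:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

// Attach to any GameObject in the sample scene to verify that spatial
// pointer events are arriving as enhanced touches at all.
public class SpatialInputDebugLogger : MonoBehaviour
{
    void OnEnable()
    {
        // Enhanced touch must be enabled explicitly; the samples do this too.
        EnhancedTouchSupport.Enable();
    }

    void OnDisable()
    {
        EnhancedTouchSupport.Disable();
    }

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            Debug.Log($"Touch {touch.touchId}: phase={touch.phase}, position={touch.screenPosition}");
        }
    }
}
```

If nothing is logged when you pinch, input isn’t making it into the scene at all (platform/app mode problem); if touches are logged but the balloons still don’t react, the issue is further along in the sample’s own input handling.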

I got it working: under App Mode, if you choose Mixed Reality it works. Previously it was set to Virtual Reality, so inputs were not working.

Where exactly do you change that? It appears it’s already set to Mixed with a bounded volume for me.

Figured it out from your point above. I had missed changing that flag. Input works now. Thanks!

Where can I find the demo projects?

You have to import the Samples under the com.unity.polyspatial package.

Thank you for the answer. How do I do that exactly?

After adding com.unity.polyspatial by name in the Package Manager, you will find a Samples tab in the package details; you can import them from there.
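If you prefer to script the package install (for example, to set up the project for teammates), the Package Manager scripting API can also add the package by name. A minimal editor sketch; the menu item name is just an example:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

public static class AddPolySpatialPackage
{
    static AddRequest s_Request;

    // Example menu item that adds com.unity.polyspatial by name.
    [MenuItem("Tools/Add PolySpatial Package")]
    static void Add()
    {
        s_Request = Client.Add("com.unity.polyspatial");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (!s_Request.IsCompleted)
            return;

        if (s_Request.Status == StatusCode.Success)
            Debug.Log($"Added {s_Request.Result.packageId}");
        else
            Debug.LogError(s_Request.Error.message);

        EditorApplication.update -= Progress;
    }
}
#endif
```

The samples themselves still have to be imported from the Samples tab in the Package Manager window afterwards.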

I’m having issues with this scene (and others) recognizing my input.

On this balloon scene, it’s not recognizing my input.

Separately, I’ve loaded the Mixed Reality scene and it starts with the Play to Device player (on my Vision Pro) but no matter what I’ve tried, it doesn’t recognize my hands. I’m sure I’m missing something simple and would really appreciate any help!

I’m getting “No active UnityEngine.XR.ARSubsystems.XRSessionSubsystem is available. This feature is either not supported on the current platform, or you may need to enable a provider in Project Settings > XR Plug-in Management”

I have Apple visionOS enabled in XR Plug-in Management and have Initialize Hand Tracking on Startup checked (Mixed Reality, with a volume or immersive space).
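That error means no XR session subsystem was created and started. If it helps to confirm what the runtime actually sees, here’s a small sketch that lists any XRSessionSubsystem instances and whether they’re running; it only assumes the ARSubsystems (AR Foundation) package is in the project.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

// Attach to any GameObject; logs whether an XRSessionSubsystem exists and is running.
public class SessionSubsystemCheck : MonoBehaviour
{
    void Start()
    {
        var sessions = new List<XRSessionSubsystem>();
        SubsystemManager.GetSubsystems(sessions);

        Debug.Log($"XRSessionSubsystem instances: {sessions.Count}");
        foreach (var session in sessions)
            Debug.Log($"  {session.subsystemDescriptor.id} running={session.running}");
    }
}
```

If the list is empty, the Apple visionOS loader didn’t start a session for the current platform/app mode, which usually points back at the XR Plug-in Management settings rather than at the scene itself.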

Restarted the project and everything seems to be working now. However, interactions in unbounded scenes don’t work via Play to Device; you have to build those scenes and deploy them to the device.