I’m trying to ramp up on PolySpatial and have hit a few problems.
Short version:
I can’t get taps in the ProjectLauncher carousel to be recognized in the simulator. In play mode I can only tap once before all further taps are ignored, and trying to load the first sample results in a camera error.
Installed Xcode 15 beta 2 (very important! Newer versions of Xcode won’t work right now; it would be nice if the System Requirements list stated explicitly that newer versions are known not to work.)
Installed Unity 2022.3.6f1, including the visionOS build support beta
Created a new Unity project from the 3D (URP) template
Added com.unity.polyspatial package and restarted the Editor when prompted
Added com.unity.polyspatial.visionos
Added com.unity.polyspatial.xr
Switched the Build Target to visionOS.
In Project Settings → PolySpatial, ensure that Enable PolySpatial Runtime is checked.
In Project Settings → XR Plug-in Management, under the unlabeled tab for visionOS, ensure that Apple visionOS is selected as a Plug-in Provider.
In Project Settings → XR Plug-in Management → Apple visionOS, ensure the Device Target is Simulator and the Volume Mode is Unbounded.
Load the SampleScene that ships with the 3D (URP) template.
Right-click in the Hierarchy view and select XR → Setup → Volume Camera.
Select the created VolumeCamera and ensure that Mode is set to Unbounded.
Right-click in the Hierarchy view and select 3D Object → Cube to insert a cube into the scene.
Edit the cube’s properties to place it at position (0, 1, 1) and scale it to 0.5 so it’s reasonably visible from the default pose (a scripted version of these scene-setup steps is sketched below).
Save the Project
Open Build Settings
Ensure the Sample Scene is the only scene in the list
Build
Open Xcode 15 beta 2 and load the created Unity-iPhone.xcodeproj
Ensure the simulator is the selected target in Xcode
Run
This results in roughly what I expected. A cube floating in front of the camera. So far so good.
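As an aside, here’s the scripted version of the scene-setup steps mentioned above, for anyone who prefers doing this from an editor script. The VolumeCamera type comes from the PolySpatial package; the menu path is made up, and I still set the camera’s Mode in the Inspector since the exact property name may differ between package versions.

```csharp
using UnityEngine;
using UnityEditor;
using Unity.PolySpatial; // VolumeCamera lives here in the PolySpatial package

public static class PolySpatialTestSceneSetup
{
    [MenuItem("Tools/PolySpatial/Create Volume Camera And Cube")] // made-up menu path
    static void CreateVolumeCameraAndCube()
    {
        // Equivalent of XR → Setup → Volume Camera; set its Mode to
        // Unbounded in the Inspector afterwards.
        var volumeGO = new GameObject("Volume Camera");
        volumeGO.AddComponent<VolumeCamera>();

        // A 0.5-scale cube at (0, 1, 1) so it’s visible from the default pose.
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 1f, 1f);
        cube.transform.localScale = Vector3.one * 0.5f;

        Undo.RegisterCreatedObjectUndo(volumeGO, "Create Volume Camera");
        Undo.RegisterCreatedObjectUndo(cube, "Create Cube");
    }
}
```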
Next I try to load the PolySpatial samples into the same Unity Project:
Close Xcode
Open the Package Manager in Unity
Select the PolySpatial package.
Select the Samples tab in the package details pane
Click the Import button
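(Side note for the scripting-inclined: the same import can also be driven from the UnityEditor.PackageManager.UI.Sample API. A hedged sketch with a made-up menu path:)

```csharp
using UnityEngine;
using UnityEditor;
using UnityEditor.PackageManager.UI;

public static class PolySpatialSampleImporter
{
    [MenuItem("Tools/PolySpatial/Import Samples")] // made-up menu path
    static void ImportSamples()
    {
        // Passing null for the version should resolve to the installed version.
        foreach (var sample in Sample.FindByPackage("com.unity.polyspatial", null))
        {
            // Import() copies the sample content under Assets/Samples/...
            bool ok = sample.Import();
            Debug.Log($"Imported '{sample.displayName}': {ok}");
        }
    }
}
```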
The import appears to work fine. Next I want to build and run the samples:
Delete the folder holding the Xcode project so that I’m certain I get a clean build
Open Build Settings
Ensure all the new scenes are added to the Scene List and that the ProjectLauncher scene is first in the list (a scripted version of this step is sketched below).
Build
Open Xcode 15 beta 2 and load the created Unity-iPhone.xcodeproj
Ensure the simulator is the selected target in Xcode
Run
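As promised above, here’s the scene-list step as an editor script. The scene paths are assumptions based on where Import placed the samples in my project; adjust them to match yours.

```csharp
using System.Linq;
using UnityEditor;

public static class PolySpatialBuildScenes
{
    [MenuItem("Tools/PolySpatial/Set Sample Scene List")] // made-up menu path
    static void SetSampleSceneList()
    {
        // ProjectLauncher must be first; the order of the rest doesn’t matter.
        // These paths are guesses — check where the import actually put them.
        var scenePaths = new[]
        {
            "Assets/Samples/PolySpatial/Scenes/ProjectLauncher.unity",
            "Assets/Samples/PolySpatial/Scenes/BalloonGallery.unity",
            // ...the remaining sample scenes...
        };

        EditorBuildSettings.scenes = scenePaths
            .Select(path => new EditorBuildSettingsScene(path, true))
            .ToArray();
    }
}
```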
Running this build, I see the scene spatially, but I’m unable to interact with any of the buttons (neither the carousel buttons nor the Play button).
A note for others: I also ran into a problem where the Unity scene was rendered in a 2D window instead of spatially. While hitting that problem I saw messages in the logs about the visionOS XR plug-ins not loading correctly. I don’t know how it was resolved; I didn’t run into it again when redoing everything to type up this forum post. I suspect closing the simulator and Xcode cleaned things up somehow.
I also note that in Unity’s Play mode each button responds only once, after which all input stops. So I can advance the carousel by one, reverse it by one, or hit Play, but nothing after that. When I hit Play I get a warning about No Cameras Rendering.
Is there a way to get these samples working in the VisionPro simulator?
Is there a way to get these samples working in Unity Play Mode?
Sorry you’ve run into some issues with the samples; we’re hoping to smooth out the first-import process and make the samples a bit more robust.
A current issue with the samples is that importing them into an existing visionOS/PolySpatial project conflicts with the project’s existing XR and PolySpatial settings files (making some UI incorrect and unresponsive). In future versions we’ve moved these files and no longer include settings files with the samples; in the meantime you can work around this by deleting the XR settings and PolySpatial settings and then reconfiguring them. Rendering in a 2D window is expected behavior when building for visionOS without PolySpatial enabled.
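If it helps, a minimal sketch of the delete-and-reconfigure workaround. Both paths are assumptions; verify where your project actually stores these files before deleting, and reconfigure via Project Settings afterwards.

```csharp
using UnityEngine;
using UnityEditor;

public static class ResetXRSettings
{
    [MenuItem("Tools/PolySpatial/Delete XR And PolySpatial Settings")] // made-up menu path
    static void DeleteSettings()
    {
        // Both paths are assumptions — they may differ in your project.
        foreach (var path in new[] { "Assets/XR", "Assets/Resources/PolySpatialSettings.asset" })
        {
            if (AssetDatabase.DeleteAsset(path))
                Debug.Log($"Deleted {path}");
            else
                Debug.LogWarning($"Nothing deleted at {path} (path may differ in your project)");
        }
        AssetDatabase.Refresh();
    }
}
```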
Make sure the Mixed Reality Volume Mode (in the visionOS XR Plug-in Management settings) is set to Bounded; there are issues interacting with virtual content in the simulator when it clips into the virtual environment.
This is a known issue with Editor Play Mode input and should be resolved in future versions.
There are a few issues we are still working through to have the samples work better in the editor.
All the bounded scenes (ProjectLauncher, DebugUI, CharacterRunner, Manipulation, BalloonGallery, UI) should work in the simulator when built in Bounded mode
Thank you for the detailed steps and report! We’re hoping to address most of these issues in future package / sample releases.
Confirmed that switching Project Settings → XR Plug-in Management → Apple visionOS’s Volume Mode to Bounded causes input to start working again. Thanks for the pointer!
It would be nice if, upon scene load, the volume mode were automatically switched to whatever the scene’s volume camera specifies, rather than having both a per-scene setting and a global setting that can conflict.
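To illustrate, something like this — purely a sketch with made-up names (GlobalVolumeSettings and the Mode property don’t exist in the current package):

```csharp
using UnityEngine;
using Unity.PolySpatial;

// Sketch only: on scene load, push the scene's VolumeCamera mode into the
// global setting so the two can never disagree. The commented line uses
// hypothetical names to show the intent.
public class SyncVolumeModeOnSceneLoad : MonoBehaviour
{
    void Awake()
    {
        var volume = FindObjectOfType<VolumeCamera>();
        if (volume == null)
            return;

        // Hypothetical API — no such accessor exists today:
        // GlobalVolumeSettings.Mode = volume.Mode;
        Debug.Log($"Would sync the global volume mode from '{volume.name}'.");
    }
}
```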
I have also tested flipping each scene’s VolumeCamera to Unbounded mode and setting the global value to Unbounded as well. This brings back the original problem of no input being registered. Is it a known issue that input is not available in the simulator in Unbounded mode?
We’re working on a solution for this that should be in future releases. The current implementation is a temporary solution for how we have to construct things on the SwiftUI side.
Currently, the global setting in visionOS XR Plug-in Management is the only camera mode checked when building a project.
Hi,
I’m also having issues with the project samples. I’ve followed the steps in the release notes, basically the same steps that ccrabb listed above, and the build succeeds and deploys to the simulator. However, when running, I get a number of warnings, and the simulator returns to the Home Screen shortly after trying to load the app. A few moments later, I see a faded view of the sample behind the Home Screen.
Following the steps mentioned above to delete Assets/XR and reconfigure has had a small effect: the app no longer appears faded (background vs. foreground?). However, the Home Screen still appears, and if I bring the sample app into focus I can’t get any inputs to work.
The home screen appearing is a known issue. You should be able to click the home icon in the upper right of the simulator to dismiss the icons and then interact with the content.
For input, double-check that the settings are in Bounded mode and that the content is in an open space in the simulator (not clipping through the floor or a table).