📌 General Release Notes

Package Installation

  1. Launch the Unity Hub, or restart the Unity Hub if it is already open.
  2. Create a new Unity project using the latest supported version of Unity 2022.3, and make sure visionOS Build Support (experimental) and iOS Build Support are installed.
    • Please note that visionOS Build Support (experimental) is only supported on Mac devices.
  3. Open the Package Manager (Window > Package Manager) and make sure Packages: Unity Registry is selected.
  4. Open the “+” dropdown and select “Add package by name…”
  5. Add the following packages:
    • com.unity.polyspatial.visionos
    • com.unity.polyspatial.xr
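Alternatively, the two packages can be declared directly in your project's Packages/manifest.json. This is a sketch; the version number shown is taken from the support table below — match it to your Unity version, or use the Package Manager flow above to resolve the latest compatible version automatically:

```json
{
  "dependencies": {
    "com.unity.polyspatial.visionos": "0.6.2",
    "com.unity.polyspatial.xr": "0.6.2"
  }
}
```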

Experimental packages

As a reminder, experimental releases are not supported for production, but they provide early access for those who want to start testing our solutions for the visionOS platform. Your feedback on these releases also helps us make progress on development.

PolySpatial Version Support Summary

| PolySpatial package version | Unity Version | Xcode Version | Device seed version |
|---|---|---|---|
| 0.1.0 | 2022.3.5f1 | 15 beta 2 | 1 |
| 0.1.2 | 2022.3.5f1 | 15 beta 2 | 1 |
| 0.2.2 | 2022.3.5f1 | 15 beta 5 | 2 (21N5207g) |
| 0.3.2 | 2022.3.9f1 | 15 beta 8 | 3 (21N5233f) |
| 0.3.3 | 2022.3.9f1 | 15 beta 8 | 3 (21N5233f) |
| 0.4.1 | 2022.3.9f1 and above | 15.1 beta | 4 (21N5259k) |
| 0.4.3 | 2022.3.9f1 and above | 15.1 beta | 4 (21N5259k) |
| 0.5.0 | 2022.3.11f1 and 2022.3.12f1 | 15.1 beta | 4 (21N5259k) |
| 0.6.2 | 2022.3.11f1 and higher | 15.1 beta | 4 (21N5259k) |
Template and Samples

Template Installation

  1. Download the visionOS Project Template here.
  2. Unzip the file to your desired project location.
  3. Open the project using Unity Hub and the compatible version of Unity 2022 LTS.
  4. Further information is available via the In-Editor Tutorials in the template and in the template documentation.

Unity PolySpatial Samples

The PolySpatial Samples are built with the Universal Render Pipeline. If you are not starting with the template project referenced above, please create a new project using the Universal Render Pipeline, or manually add and configure URP to your project. Without this, materials from the samples will appear pink in the editor.

  1. Once you have installed the Unity packages above, open the Package Manager (Window > Package Manager) and select the Unity PolySpatial package.
  2. Under the Samples tab, select Import next to the Unity PolySpatial Samples.
  3. Once the samples are installed, open the Scenes folder and add all the scenes to your build settings (File > Build Settings…). Make sure the ProjectLauncher scene is the first one.
  4. Make sure PolySpatial is enabled under Player Settings > PolySpatial, and that Apple visionOS is checked in XR Plug-in Management.
  5. Build your project for the visionOS (experimental) platform.
  6. Open the Xcode project and target the Apple Vision Pro simulator to launch the project launcher scene. From here, you can use the arrows to select and click Play to load a scene.
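If you prefer to script step 3, the scene list can be populated through the Editor API. This is a minimal sketch — the scene paths are hypothetical placeholders and must be adjusted to wherever the samples were imported in your project; note the ProjectLauncher scene is placed first, as the steps above require:

```csharp
using System.Linq;
using UnityEditor;

public static class SampleSceneSetup
{
    // Placeholder paths -- replace with the actual import location of the samples.
    static readonly string[] ScenePaths =
    {
        "Assets/Samples/Scenes/ProjectLauncher.unity", // must be first in Build Settings
        "Assets/Samples/Scenes/SpatialUI.unity",
        "Assets/Samples/Scenes/ObjectManipulation.unity",
    };

    [MenuItem("Tools/Add PolySpatial Sample Scenes")]
    static void AddScenes()
    {
        // Overwrites the current Build Settings scene list with the samples, all enabled.
        EditorBuildSettings.scenes = ScenePaths
            .Select(path => new EditorBuildSettingsScene(path, true))
            .ToArray();
    }
}
```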
General Notes
  • The Tutorial that was part of the previous template is not compatible with 0.4.x. Please remove the Tutorial folder under Assets if it comes from a template earlier than 0.4.1.
  • When moving to the 2022.3.11f1 editor, you will need to specify whether you are targeting the simulator or device SDK, found in Project Settings > Player > Other Settings. With Unity versions earlier than 2022.3.11f1, the Target SDK must remain “Device”. In a future release, a single project will work with both, removing the need to choose.
  • App builds occasionally appear very dark on subsequent builds (both Replace and Append).
  • A new Spatial Pointer Device has been added to the Input System package for the spatial tap gesture. Previous input APIs are being deprecated. The samples have been updated to use the new Spatial Pointer Device.
  • There is a known issue with content not appearing when the Splash screen is enabled. Disable the Splash screen in Player Settings > Splash Image.
  • Currently an app must be set to either bounded or unbounded. The settings in the scene volume camera must match the XR Plug-in management Apple VisionOS settings. For building the unbounded scenes in the Samples (Mixed Reality, Image tracking) make sure to not include a scene with a bounded volume camera (Project Launcher).
  • The SampleScene is the only supported scene in the template project. The Expand View button in SampleScene is not functional.
  • When using the visionOS Simulator, you may experience visual issues with any object that has upward facing normals.
  • Interacting with the Slider component in the SpatialUI scene of the samples will crash the app.
  • There is a known issue in the Samples Object Manipulation scene when dropping an object below the platform. You can exit and replay the scene to fix it.
  • Please note that when running apps in visionOS Simulator, the result may be different than the Vision Pro headset. Check out Apple’s guide on running your app in the simulator to learn more.
  • Only Apple Silicon machines are supported.
  • When running in the Simulator, you may need to disable Metal API Validation via Xcode’s scheme menu (Edit Scheme > Run > Diagnostics > uncheck Metal API Validation). (Fix in progress.)
  • Some visionOS/iOS player settings are ignored or improperly applied. (Fix in progress.)
  • The PolySpatial project validations are under the Standalone, Android and iOS tabs in the Project Validation window (Edit > Project Settings > XR Plug-in Management). (Fix in progress.)
Windowed Apps / Virtual Reality
  • Crashes may be seen when some pixel formats are used. (Fix in progress.)
  • To receive controller input, you’ll need to use a prerelease version of the Input System package. In your project’s Packages/manifest.json, include:

"com.unity.inputsystem": " https://github.com/Unity-Technologies/InputSystem.git?path=/Packages/com.unity.inputsystem"

visionOS Shared Space / PolySpatial
  • Objects may flicker in and out of visibility, especially when using a bounded volume. (Platform issue.)
  • An application cannot have more than one volume at this time. (Platform issue.)
  • Volume camera dimensions cannot be changed dynamically. (Platform issue.)
  • Unity Particle System effects may not function or look correct. (Fix in progress.)
  • Input sometimes does not work, due to input colliders not being created on the RealityKit side. (Fix in progress – enable collision shape visualization in Xcode to see if this is your issue.)
  • When using an Unbounded Volume Scene, an empty 2D window appears at app launch. (Fix in progress.)
Frequently asked questions

Q: My input code is not working and I’m not getting the correct touch phases.

A: Make sure you are using the Input System’s TouchPhase rather than the legacy UnityEngine.TouchPhase:

using TouchPhase = UnityEngine.InputSystem.TouchPhase;
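With that alias in place, touch phases can be read through the Input System’s EnhancedTouch API. A minimal sketch (the component name is hypothetical); it assumes the Input System package is installed and active:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Alias so TouchPhase resolves to the Input System enum, not the legacy one.
using TouchPhase = UnityEngine.InputSystem.TouchPhase;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class TapLogger : MonoBehaviour
{
    // EnhancedTouch must be enabled explicitly before Touch.activeTouches is populated.
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Tap began at {touch.screenPosition}");
        }
    }
}
```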

Q: Every time I launch my app, the home menu icons appear in front of it.

A: This is a known issue. (Platform issue.)

Q: The Hover component shows up differently depending on the object geometry.

A: This is a known issue. (Platform issue.)

Q: My app is not responsive or updating slowly.

A: Check the performance tab in Xcode to see whether the usage percentages are near or over 100%. If your app is missing its performance targets, the simulator and passthrough video will not slow down or stutter, but the objects in your app might.

Tips for getting started

  • Make sure you have the leftmost icon selected in the simulator for interacting with content. Other icons are for navigating the simulator and do not register input.


Is it a typo that 0.5.0 needs visionOS beta 4 and not 5? The package versions have followed the beta versions up until now. If this is not a typo, it might create some confusion (and blocked devices since you can’t downgrade the OS).

No; beta 5 isn’t under wide release yet, AFAIK. We’re still testing with beta 4.

I wanted to give a heads up that Apple is requiring all DevKits to be upgraded to the latest seed 5 visionOS beta by Tuesday at 3pm. In their words, “Apple is working towards 100% update by EOD Tuesday Pacific time.”… We also have to provide confirmation it’s been done. The specific build they want on device is “21N5260b”. I appealed the request, but it’s likely to be denied since this update is related to security concerns.


Thanks Matt. While we’re still running the latest seed 5 through our full suite of tests, we can report that there were no issues when building our samples and testing on device. There may still be trouble areas we haven’t identified yet; we’ll share those if they come up.

  • Currently an app must be set to either bounded or unbounded. The settings in the scene volume camera must match the XR Plug-in management Apple VisionOS settings. For building the unbounded scenes in the Samples (Mixed Reality, Image tracking) make sure to not include a scene with a bounded volume camera (Project Launcher).

Is this still the case? At Unite Amsterdam I think Matt from Unity showed an example where you could swap between bound and unbound mode with a radio style button.

I am trying to run the Mixed Reality scene in the simulator, but the app states that ARKit is not supported in the simulator. Is this really the case? This would be a bummer for me, since our app relies heavily on ARKit and we don’t have an actual device for testing purposes.