visionOS Beta Release Notes (0.1.2)

We have published a new release (0.1.2) that includes minor bug fixes and improvements to our samples and documentation. You can find the full list of changes here. Thank you for all the feedback shared thus far; please keep it coming, as it has already been helpful in our development process.

Updating Packages
Because the hierarchy of the bundled samples has changed, when upgrading your packages from 0.1.0, please remove any previously imported samples and re-import them to avoid running into “asset not found” errors.

To learn more about Unity’s visionOS beta program, please refer to this post. The following release notes apply to the suite of packages released as part of Unity’s visionOS beta program:

  • com.unity.polyspatial (0.1.2)
  • com.unity.xr.visionos (0.1.3)
  • com.unity.polyspatial.visionos (0.1.2)
  • com.unity.polyspatial.xr (0.1.2)


Requirements

  • Apple Silicon Mac for development
  • Unity 2022 LTS (2022.3.5f1 or newer)
  • Xcode 15 beta 2

Package Installation

  1. Refer to your email invitation to join the Apple visionOS Beta organization from Unity Technologies.
  2. Log in to your Unity ID account via the provided link in the email.
  3. Under My Account > My Seats, check that you have been assigned an Apple visionOS Beta subscription. This subscription is used to manage access to our packages during the beta program. No payment is required to participate in the program.
  4. Launch the Unity Hub, or restart the Unity Hub if it is already open.
  5. Create a new Unity project using Unity 2022.3.5f1, and make sure visionOS Build Support (experimental) and iOS Build Support are installed.
    • Please note that visionOS Build Support (experimental) is only supported on Mac devices.
  6. Open the Package Manager (Window > Package Manager) and make sure Packages: Unity Registry is selected.
  7. Open the “+” dropdown and select “Add package by name…”
  8. Add the following packages:
    • com.unity.polyspatial.visionos
    • com.unity.polyspatial.xr
  9. If you’re looking to build windowed or fully immersive apps, you’ll need to replace the VisionOSPlayer folder in your Unity install with the folder in this zip, accessible from the following directory, to avoid ASTC texture compression issues:
    • You’ll also need to remove the quarantine flag on the following native dylib:
      xattr -d com.apple.quarantine VisionOSPlayer/arm64/UnityEditor.VisionOS.Native.dylib
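
As an alternative to steps 7–8, the same packages can be added by editing your project’s Packages/manifest.json directly; a minimal sketch, assuming the 0.1.2 package versions listed above (other dependencies omitted):

```json
{
  "dependencies": {
    "com.unity.polyspatial.visionos": "0.1.2",
    "com.unity.polyspatial.xr": "0.1.2"
  }
}
```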

Experimental packages

As a reminder, experimental releases are not supported for production, but they provide early access for those who want to start testing our solutions for the visionOS platform. This also helps us make progress on development through your feedback.

Template and Samples

Template Installation

  1. Download the visionOS Project Template here.
  2. Unzip the file to your desired project location.
  3. Open the project with Unity 2022.3.5f1+ using the Unity Hub > Open Project.
  4. Further information is provided with In-Editor Tutorials in the Template, and the template documentation.

Unity PolySpatial Samples

The PolySpatial Samples are built with the Universal Render Pipeline. If you are not starting with the template project referenced above, please create a new project using the Universal Render Pipeline, or manually add and configure URP to your project. Without this, materials from the samples will appear pink in the editor.

  1. Once you have installed our Unity packages, go to the Package Manager (Window > Package Manager) and select the Unity PolySpatial package.
  2. Under the Samples tab, select Import on the Unity PolySpatial Samples.
  3. Once the samples are installed, open the Scenes folder and add all the scenes to your build settings (File > Build Settings…). Make sure the ProjectLauncher scene is the first one.
  4. Open Project Settings (Edit > Project Settings…), select XR Plug-in Management, and make sure Apple visionOS is selected under the visionOS tab (the tab furthest to the right).
  5. Select the Apple visionOS setting under XR Plug-in Management and set the Volume Mode to Bounded.
  6. Build your project for the visionOS (experimental) platform.
  7. Open the Xcode project and target the Apple Vision Pro simulator to launch the project launcher scene. From here, you can use the arrows to select and click Play to load a scene.

Please note that the samples included in version 0.1.0 of the PolySpatial package will overwrite your XR project settings, and may introduce issues with builds and XR Management project settings.
To fix this, close the Editor, delete the Assets/XR folder (or revert using version control), and restart the Editor. Everything in this folder should get regenerated with default values. You may need to re-enable loaders and restore settings you had previously.
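
The recovery steps above can be sketched as shell commands, run from the project root with the Editor closed (the revert command assumes your project is under git):

```shell
# Delete the generated XR settings folder so the Editor recreates it with default values.
rm -rf Assets/XR

# If the project is under version control, reverting is safer:
# git checkout -- Assets/XR
```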

General Notes
  • Please note that when running apps in visionOS Simulator, the result may be different than the Vision Pro headset. Check out Apple’s guide on running your app in the simulator to learn more.
  • On build, the resulting project is still called “Unity-iPhone.xcodeproj”. (Fix in progress)
  • Only Apple Silicon machines are supported. Support for the x86-64 simulator is coming in a future release.
  • When running in the Simulator, you may need to disable Metal API Validation via Xcode’s scheme menu (Edit Scheme > Run > Diagnostics > uncheck Metal API Validation). (Fix in progress)
  • Some visionOS/iOS player settings are ignored or improperly applied. (Fix in progress)
  • The PolySpatial project validations are under the Standalone, Android and iOS tabs in the Project Validation window (Edit > Project Settings > XR Plug-in Management) (Fix in progress)
Windowed Apps / Virtual Reality
  • Crashes may be seen when some pixel formats are used. (Fix in progress)
  • To receive controller input, you’ll need to use a prerelease version of the Input System package. In your project’s Packages/manifest.json, include:

"com.unity.inputsystem": "…"

visionOS Shared Space / PolySpatial
  • Objects may flicker in and out of visibility, especially when using a bounded volume. (Platform issue)
  • An application cannot have more than one volume at this time. (Platform issue)
  • Volume camera dimensions cannot be changed dynamically. (Platform issue)
  • Canvas / UGUI can consume lots of CPU and be very slow. (Fix in progress)
  • Unity Particle System effects may not function or look correct. (Fix in progress)
  • The Began touch phase will always be reported while input (a mouse click) is held down in the simulator. (Fix in progress)
  • Text elements might not correctly sort with 3D geometry. (Fix in progress)
  • Text and other elements with alpha transparency may show borders and/or incorrect alpha. (Platform issue)
  • Input sometimes does not work, due to input colliders not being created on the RealityKit side. (Fix in progress – enable collision shape visualization in Xcode to see if this is your issue)
  • When using an Unbounded Volume Scene, an empty 2D window appears at app launch (Fix in progress)
Frequently asked questions

Q: My input code is not working and I’m not getting the correct touch phases.

A: Make sure you are using the correct TouchPhase type (the Input System one, not the legacy UnityEngine.TouchPhase):

using TouchPhase = UnityEngine.InputSystem.TouchPhase;
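
A minimal sketch of polling touches with the Input System’s EnhancedTouch API, using the alias above (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Alias the Input System types so the legacy UnityEngine.TouchPhase is not picked up by mistake.
using TouchPhase = UnityEngine.InputSystem.TouchPhase;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class TouchPhaseLogger : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // touch.phase is UnityEngine.InputSystem.TouchPhase here.
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Touch began at {touch.screenPosition}");
        }
    }
}
```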

Q: Every time I launch my app, the home menu icons appear in front of it.

A: This is a known issue. (Platform issue)

Q: Hover component shows up differently depending on the object geometry.

A: This is a known issue. (Platform issue)

Q: My app is not responsive or is updating slowly.

A: Check the performance tab in Xcode to see whether the usage percentages are near or over 100%. If your app is not hitting its performance targets, the simulator and passthrough video will not slow down or stutter, but the objects in your app might.

Tips for getting started
  • Use Xcode visualizations (such as collision shape visualization) to see information about objects in your app

  • Make sure you have the leftmost icon selected in the simulator for interacting with content. Other icons are for navigating the simulator and do not register input.


We have just published a new package update, see here for the latest release notes.