We have published a new release (0.3.3) that addresses three issues: a fix for a crash on shader graph loading, an improvement to texture transfers, and a fix for API usage affecting TestFlight submissions. If you are testing on devices at Apple’s developer labs or via a developer kit, use only the following configuration (also outlined in the “Requirements” section).
- Apple Silicon Mac for development
- Unity 2022 LTS (2022.3.9f1 only)
  - Support for Xcode 15 beta 8 only works with this version.
- Xcode 15 beta 8 (linked to download)
  - The Xcode 15 Release Candidate will not work.
- visionOS beta 3 (21N5233f) SDK
To learn more about Unity’s visionOS beta program, please refer to this post. The following release notes apply to the suite of packages released as part of Unity’s visionOS beta program:
- com.unity.polyspatial (0.3.3)
- com.unity.xr.visionos (0.3.3)
- com.unity.polyspatial.visionos (0.3.3)
- com.unity.polyspatial.xr (0.3.3)
Please note that there is a known issue with content not appearing when the Splash screen is enabled. Disable the Splash screen in Player Settings > Splash Image.
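If you prefer to automate this, here is a minimal editor sketch (the class name and menu path are illustrative) that unchecks the splash screen through the PlayerSettings API:

using UnityEditor;

// Illustrative helper; place it in an Editor folder.
// Equivalent to unchecking Player Settings > Splash Image > Show Splash Screen.
public static class DisableSplashScreen
{
    [MenuItem("Tools/Disable Splash Screen")]
    static void Disable()
    {
        PlayerSettings.SplashScreen.show = false;
    }
}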
Requirements
- Apple Silicon Mac for development
- Unity 2022 LTS (2022.3.9f1 only)
  - Support for Xcode 15 beta 8 only works with this version.
- Xcode 15 beta 8 (linked to download)
  - The Xcode 15 Release Candidate will not work.
- visionOS beta 3 (21N5233f) SDK
Package Installation
- Redeem your coupon code to unlock the Apple VisionOS Beta subscription. This is a prepaid subscription used to manage access to our packages during the beta program. No payment is required to participate in the program.
- Launch the Unity Hub, or restart the Unity Hub if it is already open.
- Create a new Unity project using Unity 2022.3.9f1, and make sure visionOS Build Support (experimental) and iOS Build Support are installed.
- Please note that visionOS Build Support (experimental) is only supported on Mac devices.
- Open the Package Manager (Window > Package Manager) and make sure Packages: Unity Registry is selected.
- Open the “+” dropdown and select “Add package by name…”
- Add the following packages:
- com.unity.polyspatial.visionos
- com.unity.polyspatial.xr
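Alternatively, the same packages can be added by listing them under “dependencies” in your project’s Packages/manifest.json; a minimal sketch (the 0.3.3 versions shown are the ones from this release):

"com.unity.polyspatial.visionos": "0.3.3",
"com.unity.polyspatial.xr": "0.3.3"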
Experimental packages
As a reminder, experimental releases are not supported for production use, but they provide early access for those who want to start testing our solutions for the visionOS platform. Your feedback also helps us make progress on development.
Template and Samples
Template Installation
- Download the visionOS Project Template here.
- Unzip the file to your desired project location.
- Open the project with Unity 2022.3.9f1 via Unity Hub > Open Project.
- Further information is provided by the In-Editor Tutorials in the template and in the template documentation.
Unity PolySpatial Samples
The PolySpatial Samples are built with the Universal Render Pipeline. If you are not starting with the template project referenced above, please create a new project using the Universal Render Pipeline, or manually add and configure URP to your project. Without this, materials from the samples will appear pink in the editor.
- Once you have installed our Unity packages, go to the Package Manager (Window > Package Manager) and select the Unity PolySpatial package
- Under the Samples tab, select Import on the Unity PolySpatial Samples
- Once the samples are installed, open the Scenes folder and add all the scenes to your build settings (File > Build Settings…). Make sure the ProjectLauncher scene is the first one (see the editor sketch after this list).
- Make sure PolySpatial is enabled under Player Settings > PolySpatial, and that Apple visionOS is checked in XR Plug-in Management.
- Build your project for the visionOS (experimental) platform.
- Open the Xcode project and target the Apple Vision Pro simulator to launch the ProjectLauncher scene. From here, you can use the arrows to select a scene and click Play to load it.
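If you prefer to script the scene list rather than adding scenes by hand in the Build Settings window, here is a minimal editor sketch (the Assets/Samples search folder, class name, and menu path are assumptions; adjust them to wherever the samples were imported in your project):

using System.Linq;
using UnityEditor;

// Illustrative editor helper; place it in an Editor folder.
public static class AddSampleScenesToBuild
{
    [MenuItem("Tools/Add PolySpatial Sample Scenes To Build")]
    static void AddScenes()
    {
        // Collect every scene under the assumed samples folder.
        var scenePaths = AssetDatabase.FindAssets("t:Scene", new[] { "Assets/Samples" })
            .Select(AssetDatabase.GUIDToAssetPath)
            // The ProjectLauncher scene must come first in the build list.
            .OrderByDescending(path => path.EndsWith("ProjectLauncher.unity"))
            .ToArray();

        EditorBuildSettings.scenes = scenePaths
            .Select(path => new EditorBuildSettingsScene(path, true))
            .ToArray();
    }
}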
General Notes
- A new Spatial Pointer Device has been added to the Input System package for the spatial tap gesture. Previous input APIs are being deprecated.
- The samples have been updated to use the new Spatial Pointer Device.
- There is a known issue with content not appearing when the Splash screen is enabled. Disable the Splash screen in Player Settings > Splash Image.
- Currently, an app must be set to either bounded or unbounded. The volume camera settings in the scene must match the Apple visionOS settings in XR Plug-in Management. When building the unbounded scenes in the samples (Mixed Reality, Image Tracking), make sure not to include a scene with a bounded volume camera (Project Launcher).
- The SampleScene is the only supported scene in the template project. The Expand View button in SampleScene is not functional.
- When using the visionOS Simulator, you may experience visual issues with any object that has upward facing normals.
- Interacting with the Slider component in the SpatialUI scene of the samples will crash the app.
- There is a known issue in the Samples Object Manipulation scene when dropping an object below the platform. You can exit and replay the scene to fix it.
- Please note that when running apps in the visionOS Simulator, the results may differ from what you will see on the Vision Pro headset. Check out Apple’s guide on running your app in the simulator to learn more.
- On build, the resulting project is still called “Unity-iPhone.xcodeproj”. (Fix in progress.)
- Only Apple Silicon machines are supported. Support for the x86-64 simulator is coming in a future release.
- When running in the Simulator, you may need to disable Metal API Validation via Xcode’s scheme menu (Edit Scheme > Run > Diagnostics > uncheck Metal API Validation). (Fix in progress.)
- Some visionOS/iOS player settings are ignored or improperly applied. (Fix in progress.)
- The PolySpatial project validations are under the Standalone, Android and iOS tabs in the Project Validation window (Edit > Project Settings > XR Plug-in Management). (Fix in progress.)
Windowed Apps / Virtual Reality
- Crashes may be seen when some pixel formats are used. (Fix in progress.)
- To receive controller input, you’ll need to use a prerelease version of the Input System package. In your project’s Packages/manifest.json, include:
"com.unity.inputsystem": " https://github.com/Unity-Technologies/InputSystem.git?path=/Packages/com.unity.inputsystem"
visionOS Shared Space / PolySpatial
- Objects may flicker in and out of visibility, especially when using a bounded volume. (Platform issue.)
- An application cannot have more than one volume at this time. (Platform issue.)
- Volume camera dimensions cannot be changed dynamically. (Platform issue.)
- Canvas / UGUI can consume lots of CPU and be very slow. (Fix in progress.)
- Unity Particle System effects may not function or look correct. (Fix in progress.)
- Input sometimes does not work, due to input colliders not being created on the RealityKit side. (Fix in progress – enable collision shape visualization in Xcode to see if this is your issue.)
- When using an Unbounded Volume Scene, an empty 2D window appears at app launch. (Fix in progress.)
Frequently asked questions
Q: My input code is not working and I’m not getting the correct touch phases.
A: Make sure you are using the Input System’s TouchPhase rather than the legacy UnityEngine.TouchPhase:
using TouchPhase = UnityEngine.InputSystem.TouchPhase;
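For example, a minimal component that reads taps through the Enhanced Touch API with the aliased TouchPhase (the class name and log message are illustrative):

using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class TapLogger : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // touch.phase is the Input System TouchPhase, not the legacy UnityEngine.TouchPhase.
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Tap began at {touch.screenPosition}");
        }
    }
}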
Q: Every time I launch my app, the home menu icons appear in front of it.
A: This is a known issue. (Platform issue.)
Q: The hover component shows up differently depending on the object’s geometry.
A: This is a known issue. (Platform issue.)
Q: My app is not responsive or is updating slowly.
A: Check the performance tab in Xcode to see whether the usage percentages are near or over 100%. If your app is not hitting its performance targets, the simulator and passthrough video will not slow down or stutter, but the objects in your app might.