Package Installation
- Launch the Unity Hub, or restart the Unity Hub if it is already open.
- Create a new Unity project using the latest supported version of Unity 2022.3, and make sure visionOS Build Support and iOS Build Support are installed.
- Please note that visionOS Build Support is only supported on Apple Silicon machines.
- Install the XR Plugin Management package.
- Once installed, you can enable visionOS in the XR Plugin Management UI, which lets you choose whether to build for Virtual Reality or Mixed Reality.
- Selecting Mixed Reality automatically installs additional required packages. Once installation completes, you should see the following packages installed:
- com.unity.polyspatial
- com.unity.xr.visionos
- com.unity.polyspatial.visionos
- com.unity.polyspatial.xr
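After a successful Mixed Reality setup, the `dependencies` block of your project's `Packages/manifest.json` should contain entries along these lines (the version numbers below are illustrative placeholders; use whatever the Package Manager resolves for your Unity release):

```json
{
  "dependencies": {
    "com.unity.polyspatial": "1.2.3",
    "com.unity.xr.visionos": "1.2.3",
    "com.unity.polyspatial.visionos": "1.2.3",
    "com.unity.polyspatial.xr": "1.2.3",
    "com.unity.xr.management": "4.4.0"
  }
}
```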
PolySpatial Version Support Summary
| PolySpatial package versions | Unity Version | Xcode Version |
| --- | --- | --- |
| 1.2.x | 2022.3.19f1 and higher | 15.3 |
| 2.0.0-pre.x | 6000.0.0f1 and higher | 15.3 |
Template and Samples
Template Installation
- Download the visionOS Project Template here.
- Unzip the file to your desired project location.
- Open the project using Unity Hub and the compatible version of Unity 2022 LTS.
- Further information is provided by the In-Editor Tutorials in the template and by the template documentation.
Unity PolySpatial Samples
The PolySpatial Samples are built with the Universal Render Pipeline (URP). If you are not starting with the template project referenced above, create a new project using URP, or manually add and configure URP in your project. Otherwise, materials from the samples will appear pink in the Editor.
- Once you have installed our Unity packages, go to the Package Manager (Window > Package Manager) and select the Unity PolySpatial package.
- Under the Samples tab, select Import on the Unity PolySpatial Samples.
- Once the samples are installed, open the Scenes folder and add all the scenes to your build settings (File > Build Settings…). Make sure the ProjectLauncher scene is the first one.
- Make sure PolySpatial is enabled under Player Settings > PolySpatial, and that Apple visionOS is checked in XR Plug-in Management.
- Build your project for the visionOS (experimental) platform.
- Open the Xcode project and target the Apple Vision Pro simulator to launch the project launcher scene. From here, you can use the arrows to select and click Play to load a scene.
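The scene-list step above can also be scripted. This is a minimal editor sketch, assuming the imported sample scenes live under `Assets/Samples` and that the `ProjectLauncher` scene must come first; adjust the search path to match your project layout:

```csharp
// Editor-only utility: collects the sample scenes and writes them into
// Build Settings, putting ProjectLauncher first. Place under an Editor folder.
using System.Linq;
using UnityEditor;

public static class SampleSceneSetup
{
    [MenuItem("Tools/Add PolySpatial Sample Scenes To Build")]
    public static void AddSampleScenes()
    {
        // Find all scene assets under Assets/Samples (path is an assumption).
        var scenePaths = AssetDatabase.FindAssets("t:Scene", new[] { "Assets/Samples" })
            .Select(AssetDatabase.GUIDToAssetPath)
            // ProjectLauncher must be the first scene in the build.
            .OrderBy(path => path.Contains("ProjectLauncher") ? 0 : 1)
            .ToArray();

        EditorBuildSettings.scenes = scenePaths
            .Select(path => new EditorBuildSettingsScene(path, true))
            .ToArray();
    }
}
```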
General Notes
- There is a known device issue where app builds occasionally appear very dark on subsequent builds. To work around this in the visionOS simulator, select Device > Erase All Content and Settings.
- There is a known issue with content not appearing when the Splash screen is enabled. Disable the Splash screen in Player Settings > Splash Image.
- The template project includes two supported scenes, SampleScene and SampleSceneUnbounded.
- Please note that when running apps in the visionOS Simulator, the result may differ from what you see on the Vision Pro headset. Check out Apple’s guide on running your app in the simulator to learn more.
- While using the Play to Device app and connecting to scenes, the host app may freeze even after exiting Play mode in the Editor. You may need to force quit the simulator if it becomes unresponsive.
- The Play to Device app crashes when closing and reopening the host app during an active connection.
- For bounded content in Play to Device, you may see two copies of the scene in the Game view while connected.
- In the Play to Device and the Simulator, you may experience dark lighting, which should go away if you restart or change environments.
- Currently, the Play to Device app only gets a 1 m³ volume camera for both bounded and unbounded scenes.
- Particle System particles scale based on dimensions of a Bounded Volume Camera.
- MaterialX conversion warnings appear when adding PolySpatial packages.
- PolySpatial Samples do not change “Import” to “Reimport” when imported.
- Bake to mesh particle materials with an assigned base map texture render as diagonal pink squares in the Play to Device host app. Reassigning the texture at runtime is a workaround.
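The splash-screen workaround noted above can also be applied from an editor script. A minimal sketch using the standard `PlayerSettings.SplashScreen` editor API (note that Unity Personal licenses cannot disable the Unity logo itself):

```csharp
// Editor-only: disables the splash screen, mirroring the checkbox at
// Player Settings > Splash Image > Show Splash Screen.
using UnityEditor;

public static class SplashSettings
{
    [MenuItem("Tools/Disable Splash Screen")]
    public static void DisableSplash()
    {
        PlayerSettings.SplashScreen.show = false;
    }
}
```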
Windowed Apps / Virtual Reality
- Crashes may be seen when some pixel formats are used. (Fix in progress.)
- To receive controller input, you’ll need to use a prerelease version of the Input System package. In your project’s Packages/manifest.json, include (note there must be no leading space inside the URL string):

```json
"com.unity.inputsystem": "https://github.com/Unity-Technologies/InputSystem.git?path=/Packages/com.unity.inputsystem"
```
visionOS Shared Space / PolySpatial
- Objects may flicker in and out of visibility, especially when using a bounded volume. (Platform issue.)
- An application cannot have more than one volume at this time. (Platform issue.)
- Volume camera dimensions cannot be changed dynamically. (Platform issue.)
- When using an Unbounded Volume Scene, an empty 2D window appears at app launch. (Fix in progress.)
Frequently asked questions
Q: My input code is not working and I’m not getting the correct touch phases.
A: Make sure you are using the Input System’s TouchPhase rather than the legacy UnityEngine.TouchPhase:

```csharp
using TouchPhase = UnityEngine.InputSystem.TouchPhase;
```
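As an illustration, here is a minimal component sketch that reads touch phases through the Input System’s EnhancedTouch API (the `EnhancedTouchSupport` and `Touch` types are standard Input System API; the component name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Aliases resolve the clash with the legacy UnityEngine.TouchPhase/Touch types.
using TouchPhase = UnityEngine.InputSystem.TouchPhase;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class TouchPhaseLogger : MonoBehaviour
{
    // EnhancedTouch must be explicitly enabled before Touch.activeTouches works.
    void OnEnable() => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // With the alias above, this compares against the Input System enum.
            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Touch began at {touch.screenPosition}");
        }
    }
}
```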
Q: Every time I launch my app, the home menu icons appear in front of it.
A: This is a known issue. (Platform issue.)
Q: Hover component shows up differently depending on the object geometry.
A: This is a known issue. (Platform issue.)
Q: My app is unresponsive or updates slowly.
A: Check the performance tab in Xcode and see whether the usage percentages are near or over 100%. If your app is missing its performance targets, the simulator and passthrough video will not slow down or stutter, but the objects in your app might.
Tips for getting started
- Instructions on Installing and managing Simulator runtimes
- Use Xcode visualizations to see information about objects in your app.
- Make sure you have the leftmost icon selected in the simulator for interacting with content. Other icons are for navigating the simulator and do not register input.