📌 Welcome to Unity's visionOS Community Space

Welcome to Unity’s visionOS community on Unity Discussions. We’re thrilled to invite you to learn more about Unity’s tools for visionOS. You can use the visionOS category to share feedback, ask for help, and have discussions about Unity’s support for Apple’s visionOS platform.

We plan to communicate almost entirely through this Discussions category, so please set your notification preferences so you don't miss updates.

With Unity’s support for visionOS, we’re excited to bring you familiar workflows and powerful authoring tools to create immersive games and apps for Apple Vision Pro, Apple’s new spatial computing platform. There are three main approaches for creating immersive spatial experiences on the visionOS platform with Unity.

  1. Port an existing virtual reality game or create a new fully immersive experience, replacing the player’s surroundings with your own environments.
  2. Mix content with passthrough to create immersive experiences that blend digital content with the real world.
  3. Run multiple immersive applications side-by-side within passthrough while in the Shared Space.

Developers can also build windowed applications: content that runs in a window which a user can resize and reposition in the Shared Space. This is the easiest way to bring existing mobile and desktop applications to visionOS, and it is the default mode content runs in when targeting the visionOS platform.

Unity’s visionOS Support Availability
Unity’s visionOS support is available for all subscribers on Unity Pro, Unity Enterprise and Unity Industry. Subscribers can download the visionOS support packages directly from the package manager and start building experiences for the Apple Vision Pro device.

If you’re on Unity Personal or Unity Plus and wish to access Unity’s support for visionOS, get started with a free 30-day Unity Pro trial. Please first ensure you are Tier Eligible under the Unity Editor Software Terms.

The short video above will help you understand and leverage the features of visionOS with the following packages:

  • AR Foundation and ARKit XR Plug-in - Enables features such as device tracking, plane detection, image tracking, and hand tracking.
  • XR Hands - Provides a full array of tracked joints, with position and rotation, if your app requires information about a user’s hands or custom gestures.
  • XR Interaction Toolkit - Provides high-level, component-based interaction systems to implement in your project.
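As a quick illustration of the XR Hands package, a minimal sketch of reading one tracked joint might look like the following. This is not official sample code; the subsystem lookup and joint IDs are taken from the XR Hands API, so verify them against the package version you install:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class IndexTipLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            // Find the running hand subsystem, if any.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        // Each tracked joint exposes its pose (position + rotation).
        XRHandJoint joint = m_Subsystem.rightHand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```

The same pattern applies to any other joint ID, such as `XRHandJointID.ThumbTip`, or to `leftHand`.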
Getting started

To get started, please refer to the pinned release notes for details on device requirements, samples, and template and package installation steps. Our documentation provides an introduction to Unity’s support for visionOS and will guide you through critical steps such as setting up your environment, prerequisites, and building your first app for visionOS.

Reporting bugs
Before submitting a bug, check whether it has already been reported here; if a similar one exists, your bug may be closed as a duplicate. Please report bugs via the Unity Bug Reporter in the Unity Editor, and provide as much context as possible so we can rapidly triage the issue. When submitting a bug, please:

  • Include “visionOS” or “PolySpatial” in the title for easy discoverability.
  • Link to a hosted image or video that clearly shows or explains the issue, including expected vs. actual results.
  • Attach a (stripped) project so we can easily recreate your issue.
  • Attach Profiler .data and Profile Analyzer .pdata files where possible.
  • After filing your bug, open a Discussions topic if you believe the community would benefit. In the topic name, it helps to use your bug title and issue number to make it easier to identify.

To facilitate the collection of feedback, please refer to our roadmap, where you can learn about what’s planned moving forward, tell us more about your experience, or share your ideas with us. Your feedback will be carefully reviewed by the team and has a significant impact in driving the direction of the product. We look forward to hearing from you.

Posting topics

Please keep your new topics specific and try to ensure you are not duplicating an existing topic. Here are some topic themes to guide you:

  1. Features and workflows: Are there any workflows that are unclear or missing? Are there any features that you expected to see, but didn’t? Can you share a clever workaround for an issue that you think may help others? To submit a direct feature request to Unity instead of posting a topic, you can submit your request here.
  2. Ease of use: Is it easy for you to use our solutions even if you may not be an expert in XR?
  3. Performance: Are there performance issues you’re running into, and in what context?
  4. Use cases: What are your use cases? Can you share a video demo of what you’re working on? We understand models, games, and other assets may be proprietary, so please do not share any sensitive intellectual property (IP) publicly.
  5. Bugs: Is there a bug that you reported through the bug reporter that’s worth discussing with the community? (See below for specifics about submitting bugs.)
  6. Documentation: Is any of the documentation unclear? Would you like more documentation for specific items?

Here are all the official resources for Unity’s visionOS support.

Frequently asked questions

What is Unity Discussions? Is the Unity Forum still live?
Unity Discussions is a new community platform, recently launched in public beta, that replaces Unity Answers and is also intended to replace the Unity Forum in the future. The Unity Forum will remain live until we are ready to transition it to Unity Discussions. Read more about the transition.


Please comply with the basic policies we have set up to ensure a smooth beta program for you and for Unity.

  1. Prototyping & private distribution only: The beta versions of this package are for prototyping and private (internal and non-commercial) distribution only, and your access is subject to our Terms of Service (TOS). This helps protect your work from breaking changes that can occur during the beta cycle. In particular, please review section “13. Evaluation Versions” in our TOS and ensure you understand and can comply with these terms before continuing in the program.

  2. Code of Conduct: The general Code of Conduct applies to this category on Unity Discussions. Please familiarize yourself with it to ensure productive and inclusive conversations.


Do we need to do anything else in order to be able to post a topic here? I can see and reply to posts in visionos beta, but for some reason it won’t let me post a new topic :frowning:

When posting a new topic, you must select one of the three categories:

  • Mixed Reality (Immersive) Apps
  • Virtual Reality (Fully Immersive) Apps
  • Windowed Apps

Thank you! :pray:


How do you simulate hand input, taps, remote interaction, and so on in the Editor, and what specific shortcut keys are used?

Unity Pro member here. Do you still need to apply in order to build and publish visionOS apps? All the documentation online says “Apply here” and links to this thread.

I am in the final stages of porting a VR app to visionOS, but am unsure of the release process because all the documentation states: “Experimental, do not use for production”. Tangentially, I am publishing a SwiftUI iOS app to visionOS and have submitted it to the visionOS App Store.

Curious to know what the release process is. Can we simply submit our built Xcode project? Or are there other steps we’re supposed to take?

Thanks for all the amazing tools!


Is microphone input supported? If not, will it be, and if so, when? If never, will there be docs on how to work around it with native APIs?

Yes, same question here:
Even after the permission pop-up appears and the user grants microphone access, further calls to microphone initialization give this error on the device:
AURemoteIO.cpp:1702 AUIOClient_StartIO failed (561145187)
which then throws a generic error related to mic init:
"An error occured trying to initialize the recording device. " (70)

Hello Jono! Yesterday we released our latest supported packages, which you can use to submit content to the Apple App Store. Please refer to our release notes for more information. With a Pro subscription, you will be able to access the packages via the Package Manager or via the recommended installation instructions; no additional sign-ups are required.

Hi @DaveA_VR and @cliv3dev

(Updated) We do in fact support the microphone in MR, VR and Windowed mode apps both on device and in simulator. In these modes, the microphone should work pretty much the same as it does on iOS.

(The prior version of this post erroneously stated we did not support the microphone).

Hope that helps!
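Since the microphone is supposed to behave much as it does on iOS, the standard `UnityEngine.Microphone` pattern should apply. As a hedged sketch (the frequency fallback of 44100 Hz is my own choice, not a platform requirement):

```csharp
using UnityEngine;

public class MicCapture : MonoBehaviour
{
    AudioClip m_Clip;

    void Start()
    {
        if (Microphone.devices.Length == 0)
        {
            Debug.LogWarning("No microphone device found");
            return;
        }

        string device = Microphone.devices[0];

        // A min/max of 0/0 means the device supports any sample rate.
        Microphone.GetDeviceCaps(device, out int minFreq, out int maxFreq);
        int frequency = (maxFreq == 0) ? 44100 : maxFreq;

        // Record a 10-second looping clip from the default device.
        m_Clip = Microphone.Start(device, loop: true, lengthSec: 10, frequency: frequency);
        if (m_Clip == null)
            Debug.LogError("Microphone.Start failed");
    }
}
```

Note that, as mentioned later in the thread, the “Microphone Usage Description” field in Project Settings must be filled in so the OS permission prompt can appear.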

ARKit hand tracking is not supported in the simulator. You can simulate tracked hands in the Editor using the XRI Device Simulator and start building with that, since it leverages the Hands package.

The spatial tap gesture is supported; if you get the PolySpatial package samples, you can see how we handle that. Those should work in the Editor during Play Mode.
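As a rough sketch, polling spatial taps in a script follows the pattern used in the PolySpatial input samples. The type and method names below (`EnhancedSpatialPointerSupport`, `SpatialPointerState`) are recalled from those samples and may differ between package versions, so check the installed sample code for the exact API:

```csharp
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class TapHandler : MonoBehaviour
{
    void OnEnable()
    {
        // PolySpatial surfaces spatial taps through the Enhanced Touch API.
        EnhancedTouchSupport.Enable();
    }

    void Update()
    {
        foreach (Touch touch in Touch.activeTouches)
        {
            // Extra per-touch spatial data (target object, 3D position, kind).
            SpatialPointerState state =
                EnhancedSpatialPointerSupport.GetPointerState(touch);

            if (touch.phase == TouchPhase.Began)
                Debug.Log($"Tapped {state.targetObject?.name} at {state.interactionPosition}");
        }
    }
}
```

This works in Play Mode in the Editor as well, which makes it a convenient way to iterate before deploying to the device or simulator.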

Thank you for the reply. I’ve been following the thread since you first posted it. Updating to the latest package now. Thank you again!

Were you able to solve this mic problem? I get the exact same error thrown.

Hi Tim,

unfortunately, using Unity 2022.3.18f1 + visionOS plugin v1.0.3 on the simulator still gives the same issues when trying to use the microphone on Vision Pro:
here is the list of microphone devices available through Unity’s Microphone.GetDeviceCaps():
1 microphone, name “iPhone audio input”, Min. frequency: 0, Max. frequency: 0

When calling Microphone.Start(), I get:
‘Starting microphone failed: “An error occured trying to initialize the recording device” (70)’
UnityEngine.Microphone:Start(String, Boolean, Int32, Int32)
and it returns a null AudioClip.

The log also shows errors from visionOS source code:

In project settings, “Microphone Usage Description” field is filled.

So I’m not sure the microphone works pretty much the same as it does on iOS or other platforms. FYI, the code I use to initialize the microphone works on iOS, Android, and PSVR2…



When running the PolySpatial examples (Character Walker/Mixed Reality), the button text turns into magenta blocks once played in Unity, and this issue persists on the device.


The first thing I would try is to reimport the shader graphs that we use to render UI content (Packages/PolySpatial/Resources/Shaders, particularly TextSDFSmoothstep.shadergraph). If that doesn’t work, feel free to submit a bug report with a repro case and let us know the incident number (IN-#####).

Thanks for the help! I checked that folder and see this:

How do I reimport the right shader graph?

In Xcode I also get this:

[Diagnostics] Warning: Non shader graph shader 'Shader Graphs/TextSDFSmoothstep' not supported or MaterialX encoding missing

That’s the Assets/…/PolySpatial/Resources folder. You need to locate the Packages/PolySpatial/Resources/Shaders folder.

To reimport you can select TextSDFSmoothstep and right click → Reimport.
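If you prefer to script the reimport (for example, to rerun it after package updates), an Editor utility along these lines should work. The asset path below is an assumption based on the PolySpatial package name; verify the actual path under Packages in your project before relying on it:

```csharp
using UnityEditor;

// Editor-only utility: force-reimports the UI text shader graph.
public static class ReimportTextShader
{
    [MenuItem("Tools/Reimport TextSDFSmoothstep")]
    static void Reimport()
    {
        // NOTE: hypothetical path - confirm it in your Packages folder.
        const string path =
            "Packages/com.unity.polyspatial/Resources/Shaders/TextSDFSmoothstep.shadergraph";
        AssetDatabase.ImportAsset(path, ImportAssetOptions.ForceUpdate);
    }
}
```

This is equivalent to selecting the asset and choosing Reimport from the right-click menu.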


That was it! Thanks a ton.

Although for some reason after I re-imported everything, the logs stopped printing in Xcode.

EDIT: never mind, that was just an Xcode window view issue.