visionOS 2.0 support, new features, and general integration plan


After the WWDC 2024 announcements yesterday, and with probably more coming today/tomorrow, I was wondering about your general plan/timeline for visionOS 2.0 support, as well as integration of some newly announced features:

  • ARKit: Object Tracking, Room Tracking, barcode scanning, etc.
  • RealityKit: discrete lights (finally), blend shapes
  • New Compositor API supporting MR mode: swapping that in for the RealityKit wrapper?

Will there be a chance to test this earlier with a beta package, or will we need to wait for the official visionOS release to get an official Unity package? (In September, I suppose.)
After the official release, what can we expect in terms of the minimum supported OS version for this platform?

Thanks in advance for any clarification.



Would love to know this too, as I’d like to upgrade to the beta, but first I need to know whether it’s possible to develop with it using the latest release (1.2.3).

Given this talk, Render Metal with passthrough in visionOS - WWDC24 - Videos - Apple Developer, I’m excited to see whether it’s now possible to make MR experiences without PolySpatial.


Me too.
The demo is open, you can try it now!



Thanks for the link! The video is now live. Having looked at it and the sample code, I think we’re good to go. Your move, Unity!


Hey all! We’re as excited as you are, and our teams are hard at work on the next release. As usual, we look forward to providing compatibility with the latest version of visionOS so stay tuned for more information soon.

Based on our testing, yes, there should be no problems with this; the beta is entirely backwards compatible as far as we know. Of course, PolySpatial 1.2.3 doesn’t support any of the new features; we will have to issue another release in order to support them.

Is anyone else getting build errors like this?
Some context: I jumped in with both feet. I got Unity 6, visionOS 2.0, the Xcode 16 beta, and macOS Sequoia. Besides this issue, everything seems smooth. I had this issue before, but I can’t remember how it got fixed. Any advice will be greatly appreciated.

You may need to run xcode-select --install.


Thanks. I got it working. I already had it installed, but it was in the wrong location or something. I had to run this in the terminal, which fixed it:
sudo xcode-select --switch /Applications/


I watched the Render Metal with passthrough in visionOS session, and while it looks promising, there’s still a significant gap compared with the PolySpatial path: eye tracking. Where the user is looking isn’t exposed, so there’s no way to show a hover affordance outside of SwiftUI and RealityKit/PolySpatial.

This is confirmed by a Vision Pro engineer on the Apple developer forum here:

For the CompositorLayer you can use the onSpatialEvent callback which provides the pinch and gaze when the pinch began

The Render Metal with passthrough in visionOS session is more applicable to passthrough without PolySpatial. Gaze is limited to the moment a pinch begins, as in fully immersive (VR) mode.
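A minimal sketch of what the engineer describes above, using the CompositorServices `LayerRenderer.onSpatialEvent` callback on a `CompositorLayer`. This is my reading of the `SpatialEventCollection` API, not a confirmed pattern from the session; in particular, the idea that `selectionRay` carries the gaze direction captured when the pinch began is based on the quote above.

```swift
import SwiftUI
import Spatial
import CompositorServices

// Sketch only: receiving pinch events (and the gaze-derived selection ray
// captured when the pinch began) inside a Metal immersive space.
@main
struct MetalImmersiveApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "Immersive") {
            CompositorLayer { layerRenderer in
                layerRenderer.onSpatialEvent = { events in
                    for event in events {
                        switch event.phase {
                        case .active:
                            // selectionRay is the ray from the user's gaze,
                            // captured at the moment the pinch started.
                            if let ray = event.selectionRay {
                                print("Pinch ray origin: \(ray.origin)")
                            }
                        case .ended, .cancelled:
                            print("Pinch ended")
                        @unknown default:
                            break
                        }
                    }
                }
                // Start the Metal render loop for the layer here (omitted).
            }
        }
    }
}
```

Note that this only fires once a pinch begins; there is still no continuous hover/gaze signal on this path, which is the gap discussed above.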

As for gaze in PolySpatial, a.k.a. Mixed mode (which is built on top of RealityKit): in visionOS 2.0, there are a couple of new options via a node in the MaterialX ShaderGraph that will enable a custom hover highlight.


In case you haven’t already seen this post: 📌 Upcoming Plans to Support visionOS 2. We’re super excited for the next couple of weeks!



Nice! Are you able to get visionOS 2 features working on Unity 6?

I’m not sure what new 2.0 features there are, but the build still works pretty much the same.
