Feature Highlights for PolySpatial 2.0 Pre-Release

Hi, I tried everything with the 2.0.0-pre.11 template and also tried a fresh 6000.0.15f1 URP project.
I followed all the URP settings, but I still can't get it to work at all with Metal Rendering with Compositor Services.

I see nothing when I start with Metal, but RealityKit works fine. When I switch to Hybrid Mode, I get similar errors to @Futurristic's, and more, attached below.
SwitchMetal-Logs.rtf.zip (11.7 KB)

Unity 6000.0.15f1
PolySpatial 2.0.0-pre.11
visionOS 2 Beta 6

Please let me know how I can see pure Metal content with Unity.

Update: The issue was with 6000.0.15f1. It was fixed when I opened the project with 6000.0.19f1.

Not at the moment. The issue there is the manner in which we would make some subset of the scene graph available to the PortalComponent to represent the content to render in the portal. This has come up before; I'd suggest submitting a request for the feature to our roadmap so that we can track interest in it.

You mentioned DOTS support here:

Which is super exciting! But I can't get it to work fully, and the Volume Camera seems to be causing some bugs. Steps to reproduce:

  1. Download the template
  2. Download this repo: EntityComponentSystemSamples-master
  3. Install Entities Graphics from Package Manager (should install Entities dependency)
  4. Copy EntitiesSamples from EntityComponentSystemSamples-master into visionOSTemplate-2.0.0-pre.11
  5. Recompile the csproj scripts
  6. Switch the build target to visionOS
  7. Open scene '.../EntitiesSamples/Assets/HelloCube/1. MainThread/HelloCube_MainThread.unity'

Notice that at this point it works as expected: you should see a spinning cube entity and can monitor components/systems normally. But now add a Volume Camera. The entity will disappear UNLESS you click the edit-subscene button, suggesting that baked entities are simply not visible(?) to the Volume Camera. Scenes with physics also produce unexpected results: the colliders seem to move and be affected by physics, but the render of the entity is locked in place. I tried both Play to Device and a full build; both show what's in the attached image.
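
For reference, the content being baked is nothing exotic; it's roughly the standard authoring/baker/system pattern, sketched below with illustrative names (not the sample's exact code):

```csharp
using Unity.Entities;
using Unity.Mathematics;
using Unity.Transforms;
using UnityEngine;

// Illustrative authoring component; the real HelloCube sample uses its own names.
public class SpinAuthoring : MonoBehaviour
{
    public float DegreesPerSecond = 90f;

    // Baking runs inside the subscene and converts this GameObject into an entity.
    class SpinBaker : Baker<SpinAuthoring>
    {
        public override void Bake(SpinAuthoring authoring)
        {
            var entity = GetEntity(TransformUsageFlags.Dynamic);
            AddComponent(entity, new SpinSpeed
            {
                RadiansPerSecond = math.radians(authoring.DegreesPerSecond)
            });
        }
    }
}

public struct SpinSpeed : IComponentData
{
    public float RadiansPerSecond;
}

// Main-thread system that spins every baked entity carrying a SpinSpeed.
public partial struct SpinSystem : ISystem
{
    public void OnUpdate(ref SystemState state)
    {
        float dt = SystemAPI.Time.DeltaTime;
        foreach (var (transform, speed) in
                 SystemAPI.Query<RefRW<LocalTransform>, RefRO<SpinSpeed>>())
        {
            transform.ValueRW = transform.ValueRW.RotateY(speed.ValueRO.RadiansPerSecond * dt);
        }
    }
}
```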

Am I on the right track with it being a Volume Camera support issue, or is it possibly a config/setting I’m missing?

Edit: better photo showcasing the effect in Play to Device:


Wow, after lots of debugging and trying to make this easily reproducible, it turns out I was just missing this page in the documentation, haha:

https://docs.unity3d.com/Packages/com.unity.polyspatial.visionos@2.0/manual/Extensions.html

Leaving this up in case someone else also missed it: you need to install the package with ID com.unity.polyspatial.extensions.
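
If you'd rather add it from a script than through the Package Manager window, an editor-only helper along these lines should also work (just a sketch; the menu path is made up):

```csharp
// Editor-only sketch: place under an Editor/ folder. Adds the PolySpatial
// Extensions package by ID, same as Package Manager > "+" > "Add package by name...".
using UnityEditor;
using UnityEditor.PackageManager;

public static class AddPolySpatialExtensions
{
    [MenuItem("Tools/Add PolySpatial Extensions")]
    static void Add()
    {
        // Client.Add starts an asynchronous request; Unity resolves and
        // imports the package once the request completes.
        Client.Add("com.unity.polyspatial.extensions");
    }
}
```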


Unable to get Metal to show anything on device:

Currently I am running the latest Unity beta and all of the latest preview PolySpatial and visionOS plugins. I am also on the Xcode 16.1 beta and the latest visionOS 2 beta from Sep 4th. For the life of me I cannot get anything to appear when setting my mode to Metal with Compositor Services. The app builds fine, appears to launch and deploy to the headset fine, but when the app is open I see nothing.

I have loaded the visionOS template pre.11 from the guide, loaded the Metal sample scene, and built it in this vanilla state. I have also tried a multitude of settings changes to get it to work, with no luck. It launches fine in every mode except Metal.

Also, I just wanted to point out that there is some conflicting information in the documentation regarding setting up a Metal with Compositor Services project. The sample scenes have a main camera object with a Tracked Pose Driver component as the camera rig, yet there are still options to set up a Volume Camera with a configuration asset that says "Metal". Can we get clear documentation that says you need A, B, and C to get a Metal scene running? Is the Volume Camera required for a Metal with Compositor Services project?

Any advice here? I’m on a tight deadline and could use a lifesaver here :frowning:

Edit: I solved my issue. After 24 hours of painful testing, I finally decided to upgrade from 6000.0.14 to 6000.0.19, and my problem magically went away. Now I can get back to work.

Also, I'm just going to leave this here: I only subscribed to a Pro license because of the Vision Pro features that have been intentionally paywalled, and I feel it's really sad to see companies paywalling things that are completely beta and only half working. I bet development for visionOS would move a lot more rapidly if you didn't need to pay $2,000 just to see how broken the software is.
I've used both the PolySpatial main release and the pre.11 release, and both feel extremely rushed and hacky. I definitely see where the product is going, and I do give praise to the devs working on it, as it's currently the best thing out there for developing for Apple Vision Pro, and I can clearly see the technical hurdles of interfacing with the complicated Apple ecosystem, but the price tag does not justify what is a basic, half-broken SDK.


I did manage to get passthrough with Metal working, but I was unable to get any post-processing at the same time. It seems that the passthrough texture turns black when post-processing is turned on. How would you propose setting up a glowing ball with bloom that sits on a passthrough backdrop? I can't even get bloom to work properly with passthrough off, although other post-processing effects seem to work fine.

Thanks.

I agree we can do more to clarify what's going on here. The "Metal" mode on VolumeCamera exists to support the new (somewhat experimental) Hybrid mode. This allows you to build a single application that can switch back and forth between Metal and RealityKit rendering (or combine the two with a Metal CompositorLayer and a RealityKit volume). You do not need a VolumeCamera for apps that only use Metal rendering (the Metal Rendering with Compositor Services app mode). Even in hybrid apps, you still need a regular camera and TrackedPoseDriver when Metal rendering is active. The VolumeCamera in Metal mode simply indicates to the system that we need a CompositorLayer to be active.
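
For what it's worth, here is a rough sketch of the camera rig that Metal rendering expects. You would normally set this up directly in the scene (as the sample does) rather than from code, and the binding paths shown are just the usual XR HMD ones, so treat this as illustrative rather than canonical:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.XR;

// Sketch: builds a head-tracked camera at runtime for a Metal-only app.
// No VolumeCamera is involved here.
public class MetalCameraRig : MonoBehaviour
{
    void Awake()
    {
        var head = new GameObject("Main Camera");
        head.transform.SetParent(transform, false);
        head.tag = "MainCamera";
        head.AddComponent<Camera>();

        // TrackedPoseDriver (Input System) drives the camera transform from the
        // device's head pose each frame.
        var driver = head.AddComponent<TrackedPoseDriver>();
        driver.positionInput = new InputActionProperty(
            new InputAction(binding: "<XRHMD>/centerEyePosition"));
        driver.rotationInput = new InputActionProperty(
            new InputAction(binding: "<XRHMD>/centerEyeRotation"));

        // Enable the actions explicitly in case the driver does not do so for
        // actions assigned after the component is created.
        driver.positionInput.action.Enable();
        driver.rotationInput.action.Enable();
    }
}
```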

It sounds like you’re running into issues with alpha processing. Is Alpha Processing enabled on your URP settings asset? Note that you can safely ignore that warning about back-buffer format. We still need to update that one to be aware of visionOS.

We also implement a number of Project Validation rules for XR support on visionOS, which can be found under Project Settings > XR Plug-in Management > Project Validation. For example, in this project where I have Metal App Mode selected in 6000.0.0f1, I see a validation error telling me to disable HDR because it is not supported in this Unity version. Do you see anything here? Bear in mind you need to select the visionOS tab on the right if your editor is not already set to the visionOS build target.
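
If it helps, here's a small editor-script sketch that flips HDR off on the active URP asset (this assumes URP is your active pipeline, and the menu path is arbitrary). Alpha Processing itself lives on the same asset and is toggled in the Inspector:

```csharp
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Editor-only sketch: place under an Editor/ folder. Disables HDR on the
// active URP asset, matching the Project Validation advice for Metal app mode.
public static class DisableHdrForVisionOS
{
    [MenuItem("Tools/visionOS/Disable HDR on Active URP Asset")]
    static void DisableHdr()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null)
        {
            Debug.LogWarning("The active render pipeline is not a URP asset.");
            return;
        }

        urpAsset.supportsHDR = false;
        EditorUtility.SetDirty(urpAsset);
        AssetDatabase.SaveAssets();
        Debug.Log($"Disabled HDR on '{urpAsset.name}'.");
    }
}
```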

If you already have Alpha Processing enabled, or enabling it does not fix your issue, you may be seeing a bug that we haven't encountered yet. On earlier versions, having HDR enabled (I think it's enabled by default) would also cause this issue with post-processing and the alpha channel. It looks like you're on the latest version of Unity and com.unity.xr.visionos, so that shouldn't be an issue. If Alpha Processing is enabled and you don't see any issues in the visionOS tab under Project Settings > XR Plug-in Management > Project Validation, please submit a bug report with your project attached (even if it's something trivial like a blank project with package samples imported). That way we can identify and address the specific issue you're facing. Finally, if you would like to keep this conversation going on Discussions, it will help to break out a new thread, since this one can easily get lost among other general questions/comments about PolySpatial 2.0.

Thank you for your feedback. We are doing our best to prioritize the most critical issues and unblock developers who want to build with Unity on visionOS. It can be very frustrating to face issues like a blank screen or other major blockers, and I'm glad to hear that you were able to work through most of them. Especially when working with preview and pre-release versions, it helps to always update to the latest in case we've already solved the bug you're running into. Indeed, there have been a number of changes in the past few months that improve how things work for visionOS on Unity 6 "out of the box," and we're still working on more.

For things that still don't work out of the box, or any common pitfalls, keep an eye on our FAQ page and the Release Notes thread (especially the Known Issues sections). For visionOS, as with other XR platforms, the Project Validation feature in the Editor is also essential for catching project settings or configurations that are known to cause issues. If you haven't already, check what you see under Project Settings > XR Plug-in Management > Project Validation within the visionOS platform tab. If you see any errors there, you should be able to use Fix All to get your project configured correctly.

Thanks again for reaching out and good luck! Be sure to spin up a new thread if you’re still having trouble.

Thanks so much for the reply; I will continue these questions on another thread. I did manage to get it working in the end, but I still cannot get HDR and passthrough working at the same time. I tried your Alpha Processing option, but it seems it's still not working. I will try to put together some sample files later on, but if you have any suggestions, feel free to reply in the other thread. Thanks again, and keep up the great work.

https://discussions.unity.com/t/does-hdr-work-with-passthrough-for-visionos2-metal/1521642

I upgraded my visionOS project from v1.3.9 to v2.0.4. I am able to build the PolySpatial SpatialUI sample scene successfully, but the app does not run. Attached are the versions I am using and screenshot references. I'm wondering if I am missing something for these latest versions; I've tried restarting, and I'm following this manual.

Specifications
Unity: v6000.0.21f1
PolySpatial: v2.0.4
PolySpatial visionOS: v2.0.4
PolySpatial XR: v2.0.4
Xcode: v16.0
Apple visionOS app mode: RealityKit with PolySpatial
Vision Pro: v2.0

It looks like you need to switch to the “Unity-VisionOS” scheme (versus the currently selected “GameAssembly”) and build/run that. The fact that a random scheme is selected in the Xcode build is something we’re aware of, and hope to fix.


I’m curious what, if anything, will be migrated back into the XR Interaction Toolkit package for general usage?

For a specific example, if one wanted to use the gaze+pinch interaction using indirect pinches on another platform, will there be built-in support for that eventually?

Thank you for the quick fix!

I have a follow-up question: I am building a mixed reality app that can load different scenes, and some of these scenes require rendering a 360° stereoscopic image as the environment.

I've been following this thread and found that the recommended way to do this would be to try out Hybrid mode. I followed this manual to implement Hybrid mode, and I am able to run the PolySpatial SpatialUI sample scene with an unbounded Volume Camera, but when I try to load the Apple visionOS XR Plugin Metal Sample - URP scene with a Metal Volume Camera, the scene does not render.

Is there something I am missing in my implementation of Hybrid mode, or is there an easier way to render a 360° stereoscopic image on visionOS?

dyld[934]: Symbol not found: _$s10RealityKit19EnvironmentResourceC0A10FoundationE15equirectangular8withNameACSo10CGImageRefa_SSSgtKcfC

When I upgraded my Unity to 6.0 and used the latest version, 2.0.0-pre.11, I encountered this error after installing the application.

My Vision Pro system version is the official 2.0 release.

Are you using Xcode 16?

My Xcode version is 16.0 (16A242d). Additionally, my app runs normally on the simulator.
My Unity version is 6000.0.22f1, and I have upgraded the plugin to version 2.0.4, but I still encounter this issue, which prevents my application from launching.

dyld[2800]: Symbol not found: _$s10RealityKit19EnvironmentResourceC0A10FoundationE15equirectangular8withNameACSo10CGImageRefa_SSSgtKcfC
  Referenced from: <E5EDA158-B97D-3E69-978D-79A78AFC924E> /private/var/containers/Bundle/Application/DBFCCBB6-FFC6-43B1-9BC2-FFE0646DB287/MyProject.app/MyProject
  Expected in:     <F6C19A1D-3C71-3284-BAB4-8398EDDD393A> /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation