Thanks for the fast answer…
Yes, Xcode 16 Beta 4, and yes, I build only to open in Xcode… but it won't let me build…
Hm… from your previous screenshot, I’m surprised to see iOS in the path Library/Bee/artifacts/iOS/AsyncPluginsFromLinker. Are you trying to build with an iOS build profile?
Not that I know of… everything should be for visionOS only. I have searched for where that could be set, but I can’t find a solution.
Hi,
We’re looking into this - it seems like the Samples folder isn’t being treated as actual samples within the PolySpatial Extension package, and therefore isn’t showing up in the editor. Within your project folder, could you try accessing <Project_Folder_Location>/Library/PackageCache/com.unity.polyspatial.extensions/Samples~/? That folder should contain the StereoRendererURPAsset and StereoRenderer/Scenes. Simply drag the contents of Samples~/ into Assets/Samples (create the folder if it doesn’t exist) in your editor’s Project tab to make them accessible to your editor.
Additionally, for these samples, please add layer 20 to Project Settings -> Ignored Objects Layer Mask. You may have to name layer 20 first before it can be added to the Ignored Objects Layer Mask list.
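If you want to catch this from a script rather than checking the editor UI, here is a small sketch (not from the thread; the class name is mine) that warns at startup if layer 20 hasn’t been named yet, using Unity’s standard LayerMask.LayerToName API:

```csharp
using UnityEngine;

public class Layer20Check : MonoBehaviour
{
    void Start()
    {
        // LayerMask.LayerToName returns an empty string for unnamed layers.
        if (string.IsNullOrEmpty(LayerMask.LayerToName(20)))
            Debug.LogWarning("Layer 20 is unnamed; name it under Project Settings -> Tags and Layers before adding it to the Ignored Objects Layer Mask.");
    }
}
```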
It should look something like this:
Sorry for the inconvenience! I’ve notified my team and hopefully this’ll be sorted out by next release.
Hi,
In addition to the listed steps, you will need to go to your Universal Render Pipeline Asset and disable Quality -> HDR. I’ve notified my team about this, and it should hopefully be added as a project validation step in the next release, or called out in the documentation.
You can also switch the app mode from Metal to Hybrid, and that should also allow the pass-through in the sample to work.
Hope that helps!
Hi,
Please refer to the post directly above for solutions on how to get passthrough working in the samples: 📌 visionOS Release Notes 1.3 and 2.0-pre.9 - #27 by vcheung-unity.
As for input, you may need to ensure your project has imported XR Interaction Toolkit (com.unity.xr.interaction.toolkit) and XR Hands (com.unity.xr.hands) to get input working. Ensure there are no missing scripts in the project as well - for me, without XR Interaction Toolkit, the Main scene was missing the XR Ray Interactor component on the Primary Ray Interaction game object.
Thanks a lot, vcheung - it works as you described!
After I named layer 20 “IgnoreLayer”, an error appeared whenever I clicked a button in the scene. After I added layer 20 to the Ignored Objects Layer Mask, the same error appeared, and no balloons were displayed at all. Please see the photos for details:
Bummer. I hope Apple will provide access down the road…
I updated the existing visionOS XR Plugin in my 2022.3.33f1 project to 2.0-pre.9, but I couldn’t see visionOS under XR Plug-in Management. I also got a “CS1503: Argument 1: cannot convert from ‘UnityEditor.XR.VisionOS.Analytics.VisionOSEditorAnalyticsEvent’ to ‘UnityEngine.Analytics.IAnalytic’” error. Are there any packages that need to be updated as well?
Which editor are you currently on? PolySpatial 2.0 is only compatible with Unity 6 Preview.
In the InputField (TMP) component, changing the Line Type property to Multi Line prevents the system keyboard from appearing. This bug still hasn’t been fixed in version 1.3.1.
Really glad to see the new release up with so many features!
In reference to the discussion here:
I found a really thorough description of the pros, cons, and drawbacks of combining Metal + RealityKit for pass-through (hybrid app):
https://docs.unity3d.com/Packages/com.unity.polyspatial.visionos@2.0/manual/PolySpatialHybridApps.html
Really great summary and technical details. However, I’m still not sure I understand what works vs. what doesn’t when you exclusively want to use Metal pass-through (one unbounded Metal volume), with no RealityKit at all and no hybrid mode.
The documentation about using Metal only still references VR mode rather than pass-through mode:
build fully immersive experiences for VisionOS, including virtual reality games or fully virtual environments.
(Metal-based Apps on visionOS | PolySpatial visionOS | 2.0.0-pre.9)
So I’m restating my earlier questions from another thread:
- Can you confirm that you can use PolySpatial 2.0-pre.9 + Unity 6.0 + visionOS 2.0 beta to develop an unbounded-volume, pass-through, Metal-only app (no RealityKit at all)?
- If so, what works and what doesn’t compared to RealityKit mode? (Where can I find a page describing the feature limitations of Metal pass-through?)
Thanks in advance.
I’m using Unity 6000.0.12f1. It works normally when I create a new project, but it doesn’t work for my existing Metal project.
Hi,
This error is less a PolySpatial-specific one and more a general one - in order to switch between scenes (even within the editor), you’ll have to add those scenes to your build settings, accessible by going to File -> Build Profile. Switch to the visionOS platform if you haven’t already, select Open Scene List, and drag the scenes you want to access from the Project view into that list.
The end result should look like this:
After that, you should be able to switch scenes during Play mode in the editor and in a build.
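Once the scenes are in the Scene List, switching between them at runtime is a one-liner with Unity’s SceneManager; a minimal sketch (the scene name passed in is a placeholder for whichever scene you added to the list):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneSwitcher : MonoBehaviour
{
    // Loads a scene by name. The scene must appear in the build's
    // Scene List, or LoadScene will log an error at runtime.
    public void LoadByName(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```

Hook this method up to a UI Button’s OnClick event (or call it from any script) to trigger the switch.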
Hi,
For an app set to the Metal Rendering with Compositor Services app mode, you will be able to set the Metal Immersion Style (also found in Project Settings -> XR Plug-in Management -> Apple visionOS) to Automatic, Mixed, or Full. Automatic and Mixed are functionally the same for Metal. PolySpatial and RealityKit will not be needed if an app is set to Metal app mode.
If you’ve set the Metal Immersion Style to Mixed, you can still flip between pass-through and no pass-through by setting the Background Type on your (regular) Camera component.
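As a rough illustration of that toggle, here is a hedged sketch (not from the thread; the camera reference and class name are placeholders) that assumes the usual Unity behavior where a solid, fully transparent background reveals pass-through under Mixed immersion, while a skybox background gives a fully virtual view:

```csharp
using UnityEngine;

public class PassthroughToggle : MonoBehaviour
{
    public Camera mainCamera;   // assign your scene's main camera
    bool passthrough;

    public void Toggle()
    {
        passthrough = !passthrough;
        if (passthrough)
        {
            // Solid color with zero alpha lets the device pass-through show.
            mainCamera.clearFlags = CameraClearFlags.SolidColor;
            mainCamera.backgroundColor = Color.clear;
        }
        else
        {
            // Skybox background renders a fully virtual environment.
            mainCamera.clearFlags = CameraClearFlags.Skybox;
        }
    }
}
```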
Try importing the samples in the com.unity.xr.visionos package, particularly the InputSystem UI or Main scenes, to get a better sense of what pass-through looks like in Metal app mode. You’ll need to fix up some properties to get the sample working (see above), but once it is, you’ll be able to toggle pass-through on and off at runtime.
The main downsides of Metal app mode are the lack of native RealityKit rendering (i.e., getting a consistent look with other apps), the inability to launch an app into Vision Pro’s Shared Space, the inability to use volumes, and the inability to use the PolySpatial debugging tools (Play to Device, shader debugging, Record and Play, etc.). You also won’t be able to use the Progressive Metal Immersion Style, so you won’t be able to rotate the Digital Crown on Vision Pro to control immersion.
Hope this helps! I’m also planning on adding the information in this post to the Metal docs and adding some more details to the visionOS Platform Overview page, to make the process less opaque.
Thanks so much, @vcheung-unity, for all the technical details - exactly what I was looking for!
No more questions at this point. I will test and get back to you if I have additional requests.
Thank you vcheung, problem solved! have a nice week.
I recently upgraded the visionOS PolySpatial packages in my Unity project from 2.0.0-pre.3 to 2.0.0-pre.9. Sadly, after upgrading, all the TextMeshPro elements are pink (missing shader) when I build to the headset. It works fine in the editor. I am on Unity 6000.0.10f1.
Things I’ve tried so far with no success:
- Adding the TMP shaders to the Always Included Shaders list
- Making a completely clean project with just PolySpatial + samples; the issue persisted
- Clicking the “Import TMP Essentials” button in settings
- Restarting Unity
Is this a known bug, or is there a setup step or setting I am missing? When I make a build, I see these logs about shaders being stripped.
I also get this error sometimes: “Unable to update following assemblies:Packages/com.unity.polyspatial/Runtime/EditorAssembly/Unity.PolySpatial.Core.dll (Name = Unity.PolySpatial.Core, Error = 131) (Output: /var/folders/_b/b5v97l9d7m59pt1qwm19ntp40000gn/T/tmpd09b742.tmp)”
Thanks so much!
Have you tried specifically reimporting Packages/PolySpatial/Resources/Shaders/TextSDFSmoothstep.shadergraph (right-click -> Reimport)? That shader failing to import is usually the cause of this issue.
Yes! The associated shader’s sphere preview was pink, and reimporting fixed it in the editor, but in the build it is sadly still pink!