📌 Feature Highlights for PolySpatial 2.0 Pre-Release

The latest 2.0 pre-release packages are now available, containing numerous features that help you create more immersive experiences and achieve greater visual diversity. The short video above shows off several of these new features; we encourage you to give it a watch!

App Mode Changes

Due to changes in OS capabilities, our App Mode nomenclature has changed as follows:

  • Virtual Reality - Fully Immersive Space is now called Metal Rendering with Compositor Services
  • Mixed Reality - Volume or Immersive Space is now called RealityKit with PolySpatial
  • Windowed - 2D Window remains unchanged

In addition, this release introduces a new Hybrid app mode, which allows fully immersive experiences to toggle dynamically between Metal and RealityKit-based rendering at runtime.

To help you navigate these development pathways, we've improved clarity across our documentation to better describe these distinct app modes.

New Features for PolySpatial 2.0

The following new features are available for RealityKit with PolySpatial apps:

  • Stereo render targets, which allow developers to render stereo content for RealityKit using the full Unity feature set to achieve effects like portals and floating holograms.
  • Multiple volumes and volume cameras for RealityKit-based apps to enable new multitasking capabilities as well as multiple simultaneous scene views (maps, inventories, UI, etc.).
  • Blendshapes that enable richer, more expressive characters and a wider range of geometric applications, including deformations and natural-looking animations (see the code sketch after this list).
  • DOTS Entities graphics to bring full DOTS support to visionOS and enable more diverse visual experiences.
  • Networked AR Data, which brings support for AR input sources like planes, images, and meshing to Play to Device.
  • Shader debugging to help you create and customize shaders more efficiently.
  • Bake to Texture Particles, a new particle replication mode with improved runtime performance (but higher runtime shader compilation cost).
  • Text quality improvements for TextMeshPro taking advantage of newly available shader capabilities, such as the derivative node.
  • Hardware support for 3D textures, cubemaps, and texture arrays, improving reliability and robustness for these asset types, which were previously supported through software emulation.
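
For instance, blendshape weights can be driven from script using the standard Unity API. The following is a minimal sketch; the serialized references and the blendshape name "Smile" are placeholders for illustration, not names from the PolySpatial samples.

```csharp
using UnityEngine;

// Drives a single blendshape on a SkinnedMeshRenderer over time.
// The blendshape name "Smile" is a placeholder; use whatever your mesh defines.
public class BlendshapeDriver : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer skinnedMesh;
    [SerializeField] string blendshapeName = "Smile";

    int blendshapeIndex = -1;

    void Start()
    {
        blendshapeIndex = skinnedMesh.sharedMesh.GetBlendShapeIndex(blendshapeName);
        if (blendshapeIndex < 0)
            Debug.LogWarning($"Blendshape '{blendshapeName}' not found on {skinnedMesh.name}");
    }

    void Update()
    {
        if (blendshapeIndex < 0)
            return;

        // Oscillate the weight between 0 and 100 (Unity blendshape weights are percentages).
        float weight = Mathf.PingPong(Time.time * 50f, 100f);
        skinnedMesh.SetBlendShapeWeight(blendshapeIndex, weight);
    }
}
```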

New Features for visionOS 2

Metal Rendering:

  • Metal rendering with passthrough, so you can build passthrough content while leveraging Unity graphics features that are compatible with Metal out of the box.
  • ARKit Environment Probes provide a cubemap of the real-world environment for reflective surfaces in apps using Metal rendering.
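
As a rough illustration, the sketch below assumes these probes surface through AR Foundation's AREnvironmentProbeManager, as on other ARKit platforms; the component wiring is an assumption for illustration rather than a prescribed setup from this release.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs the environment probes ARKit is currently tracking.
// Assumes an AREnvironmentProbeManager is present in the scene (e.g. on the XR Origin).
public class EnvironmentProbeLogger : MonoBehaviour
{
    [SerializeField] AREnvironmentProbeManager probeManager;

    void Update()
    {
        foreach (var probe in probeManager.trackables)
        {
            // Each probe supplies a cubemap of the surrounding environment,
            // which drives reflections on Metal-rendered content.
            Debug.Log($"Environment probe {probe.trackableId} at {probe.transform.position}");
        }
    }
}
```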

RealityKit with PolySpatial:

  • Resizable volumes that help you achieve greater control over the scale within bounded volumes.
  • Dynamic lights and shadows for greater immersion and realism in Mixed Reality experiences (a short code sketch follows this list).
  • Custom hover effects for greater visual control of hover interactions when a user is gazing at objects.
  • Improved support for shaders that enable a greater range of visual effects, including a "frosted glass material" that matches the transparency effect of visionOS system windows, as well as depth write, depth test, blending mode, and render face properties.
  • Billboard component that enables content that is always oriented towards the viewer for improved visibility and readability.
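
The dynamic lights item above uses the standard Unity Light component. A minimal sketch of authoring such a light from script follows, assuming nothing beyond core Unity APIs; which light types and shadow modes actually replicate to RealityKit is governed by the package documentation, not by this snippet.

```csharp
using UnityEngine;

// Adds a real-time point light with soft shadows to the object this is attached to.
// This is standard Unity setup; how it is replicated to RealityKit is described
// in the PolySpatial package documentation.
public class DynamicLightSetup : MonoBehaviour
{
    void Start()
    {
        var pointLight = gameObject.AddComponent<Light>();
        pointLight.type = LightType.Point;
        pointLight.range = 5f;
        pointLight.intensity = 2f;
        pointLight.shadows = LightShadows.Soft;
    }
}
```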

All modes except Windowed Apps:

  • Immersion Changed API, which provides a callback when users rotate the Digital Crown to change the amount of immersion in progressive-mode apps.

Additional package samples, including an additional scene for hover effect customization, have also been added to demonstrate these features.

Guidelines for Pre-release Packages:

While pre-release packages have stable features and APIs, there is ongoing work focused on performance improvements and overall stabilization. As such, users should not use pre-release packages for production. Production projects should use our latest 1.x packages, which will continue to receive additional performance improvements and bug fixes.

For more details on the latest changes, please refer to the pinned documentation and detailed release notes.

Thanks for the update!
If Xcode 16 beta is needed for PolySpatial 2.0, that means none of these features are available for the current version of visionOS and can only ship when visionOS 2 releases at the end of the year, correct?

Hey Luis, we recommend users use our 1.x packages for production since 2.x packages are still in pre-release alongside visionOS 2, which is in beta.

The packages should come out of pre-release around the same time frame as the full release of Unity 6, hopefully close to when visionOS 2 is released as well.

In this update, can we put shaders into an AssetBundle (rather than including them in the prebuilt app)?

I have a box collider on an object and a script alongside it that implements IPointerClickHandler, IPointerDownHandler, etc. In PolySpatial v1.2.3 these events would fire when the object was interacted with. On v1.3.1, I'm not getting anything.
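
For context, a minimal version of the setup described above looks roughly like this (standard UnityEngine.EventSystems interfaces; the handler bodies are placeholders):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal version of the setup described: a collider plus a script implementing
// the standard pointer event interfaces.
[RequireComponent(typeof(BoxCollider))]
public class ClickLogger : MonoBehaviour, IPointerClickHandler, IPointerDownHandler
{
    public void OnPointerClick(PointerEventData eventData)
    {
        Debug.Log($"Pointer click on {name}");
    }

    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log($"Pointer down on {name}");
    }
}
```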

Build error in Xcode: /Users/visionOSTemplate-2.0.0-pre.9/Build/MainApp/UnityVisionOSSettings.swift:45:52 Contextual closure type '(ImmersionUpdateContext) → Void' expects 1 argument, but 2 were used in closure body

I just upgraded to PolySpatial 2.0 pre-9. When I build the content packaged with Unity 6, I get an error "B(l) ARM64 branch out of range (-136260476 max is +/-128MB)" in Xcode 16 beta 4. I packaged the PolySpatial sample scene. Before, everything was fine when I used the pre-3 version and Xcode 15 to package. After upgrading to pre-9 and Xcode 16, this error is always reported.

After updating to 1.3.1, it's now spamming an error that it can't access the LeftHand mesh (from the XR Hands package) when converting to a PolySpatial mesh and that it has isReadable = false. Setting it writable and setting the GameObject with the mesh inactive didn't help; I had to remove it completely from the scene.

The PolySpatial 2.0-pre.9 sample crashes at runtime on AVP.

So far I have been unable to get passthrough with Metal rendering working on the device; instead I just see the camera background color (even though I have set opacity to 0 as mentioned in the samples).
It's also not working when building the samples themselves.

Tested with 6000.0.12f1 and 6000.0.6f1.
The flow was:
Create a new URP project
Install the package and import the samples from the package
Disable Render Graph as mentioned in the notes of this post
Make sure Metal is selected in the plug-in settings and that the mode is set to Mixed/Automatic
Build

Hey there! This should be fixed by checking the IL2CPP Large Exe Workaround box in Project Settings → XR Plug-in Management → Apple visionOS.

Can you try enabling that and rebuilding?

Cheers!

Hey there!

There are several known issues that could result in crashes listed under the Known Issues section of the 2.x release notes, along with workarounds; could you take a look to see if any of these apply to your scenario? Also, are you on the latest Xcode 16 beta 4 release? Previous Xcode 16 beta versions are not compatible with the PolySpatial 2.0.0-pre.9 packages and could result in crashes like the one you're facing.

If you're still having issues and seeing crashes, would you mind filing a ticket?

Cheers!

In addition to the above, you can also try Product -> Clean Build Folder to clear Xcode's cache. If that still doesn't work, please also try either building to a new build folder or deleting the build folder and rebuilding. I've found that sometimes Xcode holds on to the previous versions' symbols and needs a refresh.

Hello, thanks for the support. I need to get the 2.0.0 template working because my application is based on it, and it already comes with many things installed. However, I have followed the instructions to the letter and I can't get it to work. Is there any detailed tutorial for dummies? I would appreciate any comments. Thanks a lot.

Hey there! It seems you are having issues making a build. Are you on the latest beta of Xcode (beta 4)? Also, you should select Build instead of Build and Run. Hopefully updating to Xcode beta 4 fixes your issue.

[stereo render target sample missing]
Hi everyone, when I update the "PolySpatial" package to 2.0.0-pre.9 and import the samples (Unity PolySpatial Samples, 121.68 MB) from the package, the "Import" button never changes to "Reimport".

And in the folder Project/Assets/Samples/PolySpatial, there are 16 sample folders but no Stereo samples. Inside the Project/Packages/PolySpatial and PolySpatial Extensions folders, there are also no Stereo samples.

As the official manual (Stereo Render Targets | PolySpatial visionOS | 2.0.0-pre.9) mentions, Samples/StereoRenderer/Settings/StereoRendererURPAsset and PolySpatial Extensions/Samples/StereoRenderer/Scenes should exist. But I couldn't find Samples/StereoRenderer or PolySpatial Extensions/Samples even in the official template project (visionOS Project Template - Google Drive).

Does anyone know what I'm missing? Thanks in advance for your help.

What I already checked:
1. Unity: 6000.0.12f1
2. PolySpatial 2.0.0-pre.9
3. PolySpatial Extensions 2.0.0-pre.9
4. PolySpatial visionOS 2.0.0-pre.9
5. PolySpatial XR 2.0.0-pre.9
6. Apple visionOS XR Plugin 2.0.0-pre.9

Hi,

Can you post the entire error message? Usually the true problem is buried deeper in the error message, and the "Burst compiler failed running" line is a red herring.

Some things to try: I've seen that happen on occasion when I update Xcode versions. Usually opening Xcode and agreeing to the license will fix this issue. You can also open the command line and use the command xcode-select -s <Path_To_Your_Xcode> to ensure that the Xcode command line tools have been set up.

Besides that, if you don't need Burst, you can also disable Burst compilation by going to Project Settings -> Burst AOT Settings -> Enable Burst Compilation and unchecking it.

Hope that helps!

Is there any support for full screen / post processing FX for passthrough?

Thank you guys for the great work! I'm building the InputSystem UI scene from the "Apple visionOS XR Plugin" sample. However, I couldn't set it up to get passthrough or press any buttons. Am I missing anything in the setup? I'm using Unity 6000.0.12f1 and Xcode 16 beta 4.

No. This is not possible on visionOS since we don't have access to the passthrough texture. All we can do is write color with alpha = 0, and the passthrough video will show underneath what Unity is rendering. The system composites our rendered frame with the passthrough video in a layer outside of the application.
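
Concretely, "write color with alpha = 0" amounts to clearing the camera to a fully transparent color. A minimal sketch (your project's camera setup, especially with URP, may differ):

```csharp
using UnityEngine;

// Clears the camera to a fully transparent color so the system passthrough video
// shows through wherever the rendered frame has alpha = 0.
[RequireComponent(typeof(Camera))]
public class PassthroughClearColor : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f); // alpha = 0
    }
}
```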