📌 visionOS Template Update

We’ve updated the visionOS template with an unbounded (Immersive Space, Mixed Reality) scene that shows off a wide range of ARKit-enabled features alongside the XR Interaction Toolkit.

The goal of the template is to show how to set up and combine a variety of the features available on visionOS, and to provide a solid starting project for anyone building something for visionOS with PolySpatial.

Here’s a preview video showing the full template starting from the bounded scene.

There’s a spatial UI onboarding prompt that introduces you to the scene, and a persistent menu that lets you configure the objects and return to the bounded scene.

ARKit planes are rendered with a custom shader and have colliders. Rigidbodies are tracked to the user’s hand, allowing you to tap any horizontal surface to place an object.
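As a rough illustration of the hand-tracked rigidbody approach (not the template’s actual code), a kinematic Rigidbody can be driven from an XR Hands joint pose each physics step so its collider can hit the plane colliders. Component and field names here are assumptions:

```csharp
// Hypothetical sketch: keep a kinematic Rigidbody glued to a hand joint so its
// collider can trigger tap-to-place against ARKit plane colliders.
// Assumes the XR Hands package (XRHandSubsystem); names are illustrative.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandJointFollower : MonoBehaviour
{
    [SerializeField] XRHandJointID m_JointID = XRHandJointID.IndexTip;
    XRHandSubsystem m_Subsystem;
    Rigidbody m_Rigidbody;

    void Start()
    {
        m_Rigidbody = GetComponent<Rigidbody>();
        m_Rigidbody.isKinematic = true; // driven by tracking, not by physics forces

        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            m_Subsystem = subsystems[0];
    }

    void FixedUpdate()
    {
        if (m_Subsystem == null || !m_Subsystem.rightHand.isTracked)
            return;

        var joint = m_Subsystem.rightHand.GetJoint(m_JointID);
        if (joint.TryGetPose(out Pose pose))
            m_Rigidbody.MovePosition(pose.position); // kinematic move still reports collisions
    }
}
```

Using `MovePosition` on a kinematic Rigidbody (rather than setting `transform.position`) keeps the physics engine aware of the motion so collisions with the plane colliders are detected.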

The placeable totems are complex interactables with a guide that snaps to placeable surfaces. The physics objects are simple interactables that will interact with any detected plane.
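One simple way to implement a snap guide like the totems’ (a sketch under assumptions, not the template’s implementation) is to raycast down from the held interactable against the plane colliders and show the guide only on horizontal planes:

```csharp
// Illustrative sketch: project a placement guide onto the nearest horizontal
// ARKit plane below the interactable. Assumes AR Foundation's ARPlane is on
// the plane collider objects; field names are hypothetical.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class PlacementGuide : MonoBehaviour
{
    [SerializeField] Transform m_Guide;      // visual snap indicator
    [SerializeField] LayerMask m_PlaneLayer; // layer used by the ARKit plane colliders

    void Update()
    {
        if (Physics.Raycast(transform.position, Vector3.down, out RaycastHit hit, 5f, m_PlaneLayer)
            && hit.collider.TryGetComponent(out ARPlane plane)
            && plane.alignment == PlaneAlignment.HorizontalUp)
        {
            m_Guide.gameObject.SetActive(true);
            m_Guide.SetPositionAndRotation(hit.point, Quaternion.identity);
        }
        else
        {
            m_Guide.gameObject.SetActive(false);
        }
    }
}
```

Checking `ARPlane.alignment` filters out walls and ceilings so the guide only snaps to surfaces you can actually place on.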

All of these features can be found in the template project, available here. Note that it has the same name as the previous template, since it works with the latest 1.0.3 release packages.

We hope you find this useful and are open to feedback for future updates.


Looks amazing, thanks!


Hey, congratulations on the release!
Just wanted to let you know that the blue particles (the trails of moved objects) don’t seem to render correctly in the Simulator.

Most ARKit-enabled features do not work in the Simulator.

Correct; the Simulator doesn’t provide spatial data, so that’s expected.

Congrats on the new release!!
Just tried it out; tap-to-place content does not seem to work in the unbounded scene.

Am I missing something?

The collider is attached to a joint on the user’s hand. Try doing an open-palm tap on the surface (not the spatial tap gesture).


Thanks for the heads-up. Which particle system specifically, and where is it located?

It works, thanks 🙂


The template looks amazing. I noticed that some of the UI and 3D elements have baked diffuse lighting (or baked AO), which helps make it look really awesome. However, for some reason Unity only allows this type of baking on static objects, which is inconvenient. I suppose the PolySpatial team used an external 3D authoring tool like Maya / 3ds Max?
Is there any chance that Unity would make it possible to bake diffuse lighting into non-static GameObjects?


It looks like this is in a sample scene we ship from the hands package in the template, but not in the actual template scenes, which are SampleScene and SampleSceneUnbounded. I’ll try to clean this up with the next release, thanks for flagging it!

Correct, the texture maps were authored outside of Unity.

It’s something we’ve been looking into while exploring visual equivalence between Unity and RealityKit, but not something we currently support. You can submit an idea to the product roadmap for it.


When the new SampleSceneUnbounded scene is opened in the Unity Editor with the platform set to visionOS, there are quite a few validation errors related to UI layout items.

While they don’t seem to have any effect on being able to run on device or in the Simulator, what is considered best practice for handling these types of items? Should “Fix All” be used, which basically just disables the offending components? Should the offending components be removed from the GameObjects altogether? Or should they be replaced with something else?

Looks super interesting. We don’t own an AVP yet but I’ve read up on it. Could someone maybe help me understand what’s happening here?

  • The “Expand View” button seems to switch from bounded to unbounded mode, correct? So there could have been other bounded volumes in the user’s space, and they would have become invisible when switching to unbounded? (Or do they get killed?)
  • I get that there’s no hand/head tracking in shared mode, so does this become available when switching to unbounded? Is this how the place-objects gesture is recognized, by tracking the position of the hand? Would it be possible to directly touch the objects in unbounded mode as well?


Hey, when I deploy the template project to the Apple Vision Pro I get this trial version? I’m wondering if this is due to having a trial version of Unity Pro? I have never had anything like this happen before with Unity; I’ve had Oculus headsets all the way back to the Samsung Gear VR. Seems super odd.


Hey, thanks for pointing these out. A lot of these validation rules are no longer valid and can be ignored. I’ll see about removing them in the next package release.


Correct, the OS fades out all content in the shared space (bounded apps) when switching to unbounded (immersive space). When switching back to bounded mode, the shared-space apps come back in the same position they were previously in.
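For reference, switching between bounded and unbounded modes in a PolySpatial app (the “Expand View” behavior) is typically done by swapping the VolumeCamera’s window configuration. This is a hedged sketch; the type and property names are assumptions from the PolySpatial API and may differ between package versions:

```csharp
// Hypothetical sketch: toggling a PolySpatial VolumeCamera between a bounded
// and an unbounded window configuration asset. Names are assumptions and may
// differ in your PolySpatial version.
using Unity.PolySpatial;
using UnityEngine;

public class ViewModeToggle : MonoBehaviour
{
    [SerializeField] VolumeCamera m_VolumeCamera;
    [SerializeField] VolumeCameraWindowConfiguration m_BoundedConfig;
    [SerializeField] VolumeCameraWindowConfiguration m_UnboundedConfig;

    // Hook these up to UI buttons such as "Expand View" and the persistent menu.
    public void ExpandView() => m_VolumeCamera.WindowConfiguration = m_UnboundedConfig;
    public void ReturnToBounded() => m_VolumeCamera.WindowConfiguration = m_BoundedConfig;
}
```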

Correct, in unbounded mode we enable world-sensing permissions and hand-tracking permissions to enable device (head) position and ARKit hands (hand tracking).

Yes, direct pinch is also enabled in unbounded mode. We leverage the spatial tap gesture and filter input to allow both indirect and direct pinch for all interactables.
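The filtering described above can be sketched with PolySpatial’s spatial pointer input, following the pattern from the package’s input samples. Treat the enum and method names as assumptions that may vary by package version:

```csharp
// Hedged sketch: reading spatial-pointer touches and filtering by pointer kind
// so both gaze-and-pinch (indirect) and direct pinch reach interactables.
// Based on the PolySpatial input sample pattern; names may differ by version.
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

public class PinchInputFilter : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            SpatialPointerState state = EnhancedSpatialPointerSupport.GetPointerState(touch);

            // Accept both indirect (gaze + pinch) and direct pinch interactions.
            if (state.Kind == SpatialPointerKind.IndirectPinch ||
                state.Kind == SpatialPointerKind.DirectPinch)
            {
                Debug.Log($"Pinch interaction at {state.interactionPosition}");
            }
        }
    }
}
```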


That is odd, I haven’t seen this personally. Can you try relaunching the project and signing out of and back in to the Hub? The video makes it look like the PolySpatial packages aren’t properly registered.


I did try that. I’m not sure, because the trial is assigned within an org, and I’m not sure if the project is registered. I also tried uninstalling and reinstalling the project, both the templates and Unity Hub. I think it’s something within the account that is sending the wrong signal.

my email is sljward@outlook.com with my account

Hey, I tried three different licenses and two different users with different licenses, and nothing would work. I ended up purchasing it to test and it still would not work. I’m going to set up a different profile on the Mac to see if that works, and then possibly buy a new Mac.

Let me know if there might be any other solutions.