Switching between mixed and virtual reality while playing

Please tell me, am I correct in understanding this?

Unity PolySpatial Shared Mode corresponds to a visionOS volumetric window (https://developer.apple.com/documentation/SwiftUI/WindowStyle/volumetric).

Unity PolySpatial Exclusive Mode corresponds to a visionOS mixed ImmersiveSpace (https://developer.apple.com/documentation/swiftui/immersionstyle/mixed).

With Unity PolySpatial:
Unity converts Unity objects into visionOS RealityKit content.
So with PolySpatial there is a chance to transition between:
volumetric window <-> mixed ImmersiveSpace
volumetric window <-> full ImmersiveSpace
mixed ImmersiveSpace <-> full ImmersiveSpace

But without PolySpatial:
Without Unity PolySpatial, which means we don't use RealityKit, Unity uses CompositorServices in a full ImmersiveSpace and renders with Metal.

I think it's hard for Unity to support RealityKit and CompositorServices at the same time, so transitioning between modes without PolySpatial is hard.

Hi Dan, has there been any consideration of supporting VR within PolySpatial (using RealityKit components and the RealityKit renderer)? This would give us the functionality we want (eye gaze/hover component). Is there any way to render a skybox while in MR mode using PolySpatial?

If you want me to submit this as feedback somewhere, please let me know the best place to do so.


Hi @logansitu

Nothing prevents you from adding sufficient virtual content to obscure the physical scene - whether a skybox or otherwise. There’s a bit of a question about how to place virtual content with respect to the real world (if you want any intermingling at all) but I think all the tools you need for this are already available. But did you have particular Unity features in mind that would further help with this approach?

I’ll also note the “Stereo Render Targets” feature (slated for 2024) may prove valuable here, as it would allow you to render that skybox in real time using Unity’s full suite of rendering features. This would further augment the above approach, allowing you to use your existing VR shaders, graphical effects, etc. rather than converting to Shader Graph and working around feature limitations. Hope that helps!


On the product roadmap, in the upper right, there is a “+ Submit new idea” button.

You can place an inverted sphere around the user, but it will have a visual affordance where it clips the real world (it fades out).
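For anyone trying the inverted-sphere approach in Unity: a minimal sketch of a component that flips a sphere mesh inside out at runtime, so it renders from within (the script name and setup are illustrative, not an official API; it assumes a GameObject with a sphere MeshFilter scaled large enough to surround the user):

```csharp
using UnityEngine;

// Sketch: make a sphere visible from the inside by reversing its
// triangle winding and flipping its normals.
// Attach to a sphere GameObject that surrounds the user.
public class InvertSphere : MonoBehaviour
{
    void Start()
    {
        // .mesh returns an instance copy, so editing it won't
        // affect the shared sphere asset.
        var mesh = GetComponent<MeshFilter>().mesh;

        // Reverse triangle winding so the inward-facing side is drawn.
        var tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
        {
            (tris[i], tris[i + 2]) = (tris[i + 2], tris[i]);
        }
        mesh.triangles = tris;

        // Flip normals so lighting behaves when viewed from within.
        var normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
        {
            normals[i] = -normals[i];
        }
        mesh.normals = normals;
    }
}
```

An alternative is a material whose shader culls front faces instead of back faces, which avoids modifying the mesh, but mesh inversion keeps the material setup simple under PolySpatial's shader constraints.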

I think support for the progressive immersion style, which presents a skybox that the user controls with the Digital Crown, could also be a potential solution.


The switch to AR/MR mode is basically something we already do on mobile devices with AR Foundation. Why isn't that a possibility here? (Even if it would still limit functionality like eye-tracking data, it would allow us to use existing content.)

So this is still not available, right? I wonder if there could be some hacky workaround, since apparently the progressive immersion mode on the AVP kind of supports this?

I’m also wondering how games like Synth Riders are achieving this effect, since they have both “VR” and “MR” modes. I wonder if they’re literally creating a giant inverted box with a skybox texture for the VR mode.

Currently, blend shape animations are somehow only supported in virtual mode.

Being able to transition between mixed mode and a virtual mode that supports blend shape animations would be great!

Mainly that it would still lack blend shape animations.

How does one interface with the Digital Crown in PolySpatial? Is that reading available for us to fade our own skybox?

That currently isn’t supported with PolySpatial. visionOS enables this through the progressive style of immersive spaces; currently Unity supports the full style (VR) and the mixed style (unbounded MR).

I recommend submitting an idea to the product roadmap for this.

Is there any way to play stereoscopic 180/360 videos in mixed reality since the switch to fully immersive is not supported?

I’ll answer myself here, and apologies if this is slightly off topic. To get a stereo 180 video working in immersive mode I did the following:

  • Get a stereo (SBS) video and convert it to MV-HEVC using a service like https://spatialgen.ai/ or Mike Swanson's Blog • Spatial Video Tool
  • Using the visionOS video component, it can display this MV video in 3D
  • To project it properly in 180°, create a sphere in Blender, invert the normals, and edit the UVs (UV unwrap), halving the X scale
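If you'd rather skip the Blender step, the same mesh edits can be sketched at runtime in Unity (the script name is illustrative; it mirrors the steps above and assumes a sphere whose UVs wrap the full 360° horizontally, like a default UV sphere):

```csharp
using UnityEngine;

// Sketch mirroring the Blender steps above: invert the sphere so the
// video renders on its inside, and halve the horizontal UV scale as
// described in the post, so the 180° frame maps onto the sphere.
public class Video180Sphere : MonoBehaviour
{
    void Start()
    {
        var mesh = GetComponent<MeshFilter>().mesh; // instance copy

        // Reverse triangle winding so the sphere is visible from inside
        // (equivalent to inverting normals in Blender).
        var tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
            (tris[i], tris[i + 2]) = (tris[i + 2], tris[i]);
        mesh.triangles = tris;

        // Halve the X scale of the UVs, matching the UV edit above.
        var uv = mesh.uv;
        for (int i = 0; i < uv.Length; i++)
            uv[i].x *= 0.5f;
        mesh.uv = uv;
    }
}
```

Depending on how your sphere was unwrapped, you may need to offset the UVs as well so the video lands on the hemisphere in front of the viewer.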