Unity as a Library on visionOS

I tried several ways to work around the PolySpatialSceneDelegate.sceneDidDisconnect issues (including setting the scene delegate to a different class, and selectively proxying only Unity-related scenes to the PolySpatialSceneDelegate) - it always ends up force-unwrapping nil sooner or later.
In its current state, this makes Unity unusable as a library alongside SwiftUI in most scenarios (any time we want to dismiss a window or an immersive space from Swift).

Any ideas if these issues will be addressed before the Vision Pro release? Or is there a chance that the PolySpatialRealityKit can be made public (vs compiled in a library)?

Thank you! I was able to fix our issue using your workaround. I can confirm that it seems like it doesn’t break anything.

We’re not facing the issue with PolySpatialSceneDelegate.sceneDidDisconnect at all. I am able to close our SwiftUI windows without any errors.

1 Like

In case it’ll help others, I managed to work around the sceneDidDisconnect crash by passing PolySpatialRealityKit.PolySpatialSceneDelegate.self as the scene delegate (inside configurationForConnecting:) only when we open the Unity window (this meant setting a global var somewhere before calling openWindow on the Unity window).
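A minimal sketch of that approach, under assumptions: `isOpeningUnityWindow` is a hypothetical global flag (not part of any PolySpatial API), and your real app delegate may differ (e.g. if you subclass UnityPolySpatialAppDelegate):

```swift
import UIKit
import PolySpatialRealityKit

// Hypothetical global flag: set to true just before calling
// openWindow on the Unity window, so only that scene gets the
// PolySpatial delegate.
var isOpeningUnityWindow = false

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     configurationForConnecting connectingSceneSession: UISceneSession,
                     options: UIScene.ConnectionOptions) -> UISceneConfiguration {
        let config = UISceneConfiguration(name: nil,
                                          sessionRole: connectingSceneSession.role)
        if isOpeningUnityWindow {
            // Unity-related scene: hand it to the PolySpatial delegate.
            config.delegateClass = PolySpatialRealityKit.PolySpatialSceneDelegate.self
            isOpeningUnityWindow = false
        }
        // Pure SwiftUI scenes keep the default delegate, which avoids the
        // nil force-unwrap in sceneDidDisconnect when they are dismissed.
        return config
    }
}
```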

2 Likes

That’s basically what we’re doing, too.

Most things are working, but there seems to be a strange issue: a sceneDidDisconnect crash occurs when you’ve switched to an immersive space (unbounded mode) and back to a volume (bounded mode) at runtime and then close the volume using the OS controls/dismissWindow, or when switching to the immersive space again. When just closing the volume without having switched to an immersive space at runtime, everything works completely fine.

As everything else works as expected, would it be possible to add a check before unwrapping that value @mtschoen @v_vuk? At least from what we can see in the error message, we hope a solution would be quite straightforward, and it would at least prevent a crash even if there might be other side effects. As we’re recreating the whole scene/volume the next time we launch our Unity part, we think this approach might be enough in our case. Thanks!

1 Like

@mtschoen We are also interested in getting access to this feature (company, industry); it is extremely important and relevant for us.

Relevance:

  • access to native 2D UI, operating entirely in the Swift layer (same as the usual iOS/Android setup),
  • access to the app shell,
  • easier access to native capabilities (e.g. SharePlay, any other framework/kit),
  • familiarity/ease of use/similarity to iOS/Android UAAL (code re-usability, re-use mechanisms, etc.).

@timonweide-sap: were you finally able to get a demo running end to end?

Other people in the thread: UaaL is currently a topic on the roadmap; more votes on this feature would be great :wink:
https://portal.productboard.com/unity/77-unity-visionos-roadmap/tabs/272-mr-immersive-features

5 Likes

@Raphael-PTC yes, we’ve been able to get a full-blown demo using SwiftUI and Unity up and running. There are some hiccups left and there’s definitely some work to do, but in general all four points you mentioned are kind of possible, although they might require additional work or concepts, as visionOS is obviously significantly different from iOS when it comes to some important concepts (volumes, spaces, etc.).

1 Like

Thanks @timonweide-sap, this is great to hear!

1 Like

I got this working in a UaaL wrapper project. It’s pretty straightforward, and as mentioned, the iOS workflow gets you 90% of the way there.
Besides adding the static library libPolySpatial_xrsimulator.a:
If you are still getting an import PolySpatialRealityKit error when compiling your wrapper target, check in the wrapper’s Build Settings that the search paths “Import Paths”, “Header Search Paths”, and “Library Search Paths” each have a reference to /PathToUnityBuiltProjectDir/Unity-VisionOS/Libraries/com.unity.polyspatial.visionos/Plugins

Also, I had to create a reference to the Data folder in my wrapper project and enable its Target Membership.

The only other thing to watch out for: make sure Frameworks > UnityFramework in the wrapper project is not shown in red (an incorrect reference); if it is, repoint the reference.

It would be very useful if our Unity friends soon added a walkthrough of the UaaL build/setup in Xcode to the PolySpatial docs. Anyway, I’m sure it’s all incoming :slight_smile:

Hope this helps

Hey Guys,

Hit another snag on my UaaL visionOS journey: EXC_BAD_ACCESS on scene mount.
Note I was successful running the Unity framework/scene via UaaL with a simple project, i.e. with no other linked library dependencies.
Moving on to a project with other deps (e.g. I have a FlatBuffers dependency from another library), I run into trouble on scene init:

    unityFramework.runEmbedded(withArgc: Int32(args.count), argv: argv, appLaunchOpts: nil)

The last logs before bail out are:

WARNING: RGBA Compressed ASTC6X6 sRGB format is not supported, decompressing texture

The call stack halts on a null pointer at _storage in FlatBuffers.ByteBuffer.Storage.
Could linking FlatBuffers in my wrapper project be conflicting with the Unity framework? If so, what’s the recommended workaround?

Any ideas would be appreciated!

Anyone got UaaL working on 0.7.1?
Starting with a simple/vanilla “Poly Test” app wrapper (i.e. no deps/code other than the imported, untweaked Unity-built scene bootstrap files).

I’m able to successfully build, but I’m hitting an early runtime nil in PolySpatialRealityKitAccess.getApiData().

Note I added “Data” to the Poly Test App

Any ideas would be appreciated

Same error here when using 0.7.1. I double-checked whether there are any new project settings or other changes that might be related to that issue, but I haven’t been able to find anything to fix this.

Did anybody find a solution for that issue?

1 Like

I have created IN-65644, which describes the issue and provides a reproduction project.

For us this is a release blocker at the moment. It would be great if you could have a look at it @v_vuk @mtschoen @IsaacsUnity

1 Like

Our use case is to have all of our UI in mixed reality SwiftUI.
We’ve run into issues with Reality Composer Pro and are investigating using Unity, but would like to keep our SwiftUI and use what we have already spent months developing.

So let’s say we make a WindowGroup with:

    var body: some Scene {
        mainScene
        WindowGroup(id: "SwiftUICanBeNice") {
            ANiceView()
        }.windowStyle(.plain)
    }

Our initial thought was to mess with PolySpatialSceneDelegate by overriding public func scene from PolySpatialRealityKit, but I’d rather not touch that or UnityPolySpatialAppDelegate.

What would be the best place to call openWindow(id: "SwiftUICanBeNice")?

Anyone have any ideas?

If you want your SwiftUI window to be the default window that shows up when the app launches, just move the WindowGroup to the top of the scene. If you want to open the window at runtime from Unity, have a look at how communication works in Unity as a Library on iOS; you can use the same options on visionOS. Unity - Manual: Building plug-ins for iOS should be a good starting point, as I don’t think there’s another way to open a non-Unity window from C#.
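One way that bridge could look, as a sketch under assumptions (the `OpenSwiftUIWindow` entry point and `openNiceWindow` notification name are hypothetical, not part of any Unity or PolySpatial API; only the `"SwiftUICanBeNice"` window id comes from the thread): expose a C-callable Swift function that posts a notification, and have the SwiftUI layer observe it and call openWindow.

```swift
import Foundation
import SwiftUI

// Hypothetical notification name; pick your own.
extension Notification.Name {
    static let openNiceWindow = Notification.Name("OpenNiceWindow")
}

// C-callable entry point. From C# you would declare it with
// [DllImport("__Internal")] private static extern void OpenSwiftUIWindow();
// and call OpenSwiftUIWindow() from your Unity code.
@_cdecl("OpenSwiftUIWindow")
func openSwiftUIWindow() {
    NotificationCenter.default.post(name: .openNiceWindow, object: nil)
}

// In the SwiftUI layer, observe the notification and open the window.
struct RootView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Color.clear
            .onReceive(NotificationCenter.default.publisher(for: .openNiceWindow)) { _ in
                openWindow(id: "SwiftUICanBeNice")
            }
    }
}
```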

Hi folks, just wanted to chime in and make sure everyone is aware that Unity as a Library is not yet a supported feature when using PolySpatial to develop immersive MR apps for visionOS, nor has it been extensively tested.

There is some work to do, and clearly some issues to look at, in order to make this happen. Our roadmap is the source of truth for what’s coming next, and will be updated once support for Unity as a Library is baked into our development plans. Several of you also mentioned support for Swift UI, which we’re looking into as well.

As for the latest 1.0 release, you should not expect issues related to Unity as a Library to be fixed.

Thank you for clarifying! Our assumption aligns with your explanation, and we will proceed to implement the workaround.

I totally get this. But since Unity’s own UI features on visionOS are somewhat “suboptimal” (compared to native SwiftUI), this eventually got us to the decision to go native for now, although it means a much steeper learning curve. We’d still love to use Unity for the 3D rendering parts, though, especially when it comes to VR elements. I personally think the combination of SwiftUI and Unity should be a top priority.

3 Likes

Any news about this, as the feature seems to be highly requested on the roadmap with more than 20 requests?

We also believe that it is a basic feature that should be a top priority and is a blocker issue for many of us.

2 Likes

cross-posting, also discussed here: Use Unity as a library in visionOS apps - #7 by Raphael-PTC

+1 for getting this added and the blocking issue resolved.

Agreed, “going native” is the way we went. Communication between the UI/3D layers is handled elegantly in SwiftUI/RealityKit, and the APIs are well documented!
PolySpatial looks promising, but things need to get much more stable and further along in terms of support + documentation.