What problem are you talking about? If it's the pink text issue, please make sure you're using the latest PolySpatial packages (2.0.0-pre.11) and that you've tried reimporting the text shader graph.
Does it work in non-Unity visionOS apps? It's not clear to me from the documentation for that function that it's meant to hide the hand gesture UI; it only mentions hiding the Home and SharePlay indicators. You might want to submit feedback to Apple via their Feedback Assistant requesting a way to hide the hand gesture UI.
Edit: It looks like this needs to be set on the ImmersiveSpace. For example, I can edit MainApp/UnityVisionOSSettings.swift in an Xcode project generated from Unity to add the function to the immersive space, making it look like this:
ImmersiveSpace(id: "Unbounded", for: UUID.self) { uuid in
    PolySpatialContentViewWrapper(minSize: .init(1.000, 1.000, 1.000), maxSize: .init(1.000, 1.000, 1.000))
        .environment(\.pslWindow, PolySpatialWindow(uuid.wrappedValue, "Unbounded", .init(1.000, 1.000, 1.000)))
        .onImmersionChange() { oldContext, newContext in
            PolySpatialWindowManagerAccess.onImmersionChange(oldContext.amount, newContext.amount)
        }
    KeyboardTextField().frame(width: 0, height: 0).modifier(LifeCycleHandlerModifier())
} defaultValue: { UUID() }
    .upperLimbVisibility(.automatic)
    .immersionStyle(selection: .constant(.automatic), in: .automatic)
    .persistentSystemOverlays(.hidden)
If I do that, then the menus are indeed hidden (although when you make the gesture, it pops up a little window telling you that they're hidden, which somewhat defeats the purpose). I believe we can add an option to add that function to immersive spaces as part of the build process; I'll add that to the list for a future release.
Are you saying it's just not possible to add post-processing directly to the video feed? Is it possible to do post-processing on Unity content with passthrough? Like a glowing ball with Bloom enabled that then gets composited with the passthrough video?
Yes. Because Unity shaders can't read from the passthrough texture, we can't do post-processing on it. You can still draw on top of the passthrough layer with transparency, which is alpha-blended onto the passthrough image, so some primitive film grain or other screen-space effects may still work.
Yes. You can still use post-processing effects based on Unity-rendered content, like bloom on a bright virtual object. You just can't apply a bloom filter to real-world lights, for example.
Perfect! I got the Metal Passthrough scene working but haven't gotten it working with PP yet. But I'm incredibly relieved to hear that it's theoretically possible.
Has anyone succeeded with this setup? I tried upgrading to pre.11, but I didn't find the reimport visionOS Package (visionOS/Resources/Shaders) selection, so I just reimported the installed TextMesh Pro in my Assets folder. The result on the AVP is that the text turns from a pink box to a black box.
In the asset folder tree under Project → Packages → PolySpatial, highlight the Resources folder, right-click (Control-click on macOS), and choose Reimport (or specifically right-click and reimport TextSDFSmoothstep.shadergraph in that folder).
We replace the standard TextMeshPro shaders with this shader at runtime, so just reimporting the TMP shaders won't work.
Thanks for your reply. I couldn't view the experimental package content in Unity. After clicking Reimport on the PolySpatial package, no progress bar appeared. Am I doing this part right?
Hi there! I have a project that is set up as a RealityKit app with PolySpatial, and during my VR experience the user is supposed to traverse across an area. To reduce the motion sickness effect, I want to transition the immersive view into a windowed view using passthrough, similar to how turning the Digital Crown shrinks down the immersive space.
With PolySpatial the passthrough texture is visible when there is nothing rendered; however, I want to keep every game object in place and instead "mask off" an area to render. I have tried various experiments with shaders that clear the frame buffer for that area using Z-testing and ColorMask 0, but none of these render properly on the Vision Pro. I am using the latest 2.0 releases of both visionOS and PolySpatial.
What would be the best way of going from a fully immersive space like this to a windowed space and back again, in the most smooth and elegant way possible? My other thought was to switch to Metal, have the user turn the Digital Crown to dial down the immersive space, and then check the immersion change callback to trigger my camera move once they dial the level past a certain point. But I would like to avoid requiring user interaction, and the idea of needing to build through Xcode every time I need to preview my scene sounds awful.
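For what it's worth, the threshold idea can be sketched by extending the onImmersionChange handler in the generated MainApp/UnityVisionOSSettings.swift. This is only a rough illustration: the 0.5 threshold and the idea of notifying Unity from this spot are my own assumptions, not documented PolySpatial API.

// Hedged sketch (Swift): detect when the Digital Crown drops
// immersion below a threshold inside the generated handler.
.onImmersionChange { oldContext, newContext in
    PolySpatialWindowManagerAccess.onImmersionChange(oldContext.amount, newContext.amount)
    // Assumed threshold check: fires once when immersion crosses
    // below 50%. The value 0.5 is a placeholder, not an API constant.
    if let old = oldContext.amount, let new = newContext.amount,
       old >= 0.5, new < 0.5 {
        // Trigger the camera move here, e.g. by sending a message
        // back into Unity through whatever bridge your app uses.
    }
}

Note that hand-edits to this file are overwritten on the next Unity build, so this is more of a proof of concept than a workflow.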
Thanks,
Ben
Thank you for the update on the 2.0 pre-release packages. The new features, especially the enhancements to RealityKit with PolySpatial and the introduction of the Hybrid app mode, sound exciting. I appreciate the clarity in the updated documentation and look forward to exploring these new capabilities.
That should work if the contents of the package are there, but yes, you should see a progress bar (Importing/Compiling Shader), and importantly, you should be able to see the package contents (including the Resources folder). It seems like something may have gone wrong with installing the PolySpatial package(s). Maybe try removing them and reinstalling?
That's great, many thanks… I have a question, maybe it's about another topic, but it's related to ImmersiveSpace and this same Swift file (MainApp/UnityVisionOSSettings.swift). By editing this same file, or some other one for the AVP, is there a way to make the keyboard always appear in front of the user (no matter the camera position) when selecting an input text field?
Are there any workarounds for getting the Scene view to look like Play mode in the Game view when using the PolySpatial Lighting node? I would say my two biggest gripes that make it difficult to develop for Apple Vision Pro are:
- You have to restart the Play to Device app on the headset every time you remove it.
- If you are using Unity scene lighting and need to use the PolySpatial Lighting node to apply lightmaps etc., you cannot see how your scene looks until you press Play.
Can we not just take the performance hit on our development machine and constantly feed that PolySpatial node with lighting data?
Thanks
Sorry, I'm not aware of any way to do this. However, if you do find a way to do this in SwiftUI, please let us know, because we would love to incorporate it into the UnityVisionOSSettings.swift file that we generate. AFAIK, this is on Apple's end, so perhaps filing a feedback item with them about it would help.
Yes, that's something we're aware of. It happens because PolySpatial in general doesn't run outside of play mode/builds, so we're not passing the required properties to the shader. The likely fix (now that we have the ability to add extension settings, like the Unlit Tone Mapping option) is to switch from the PolySpatial Lighting node to using a Lit target with an extension option to use Unity lighting (light maps, etc.) in the MaterialX output. The PolySpatial Lighting functionality has been on the back burner since visionOS 2.0 adds its own support for dynamic lighting, but I can see why this would still be important for light maps, etc. Anyway, just letting you know that it's still on our radar.
Have you submitted a bug report for this? This should not be the case. Is the issue that the app doesn't accept new connections, or that it breaks existing connections, or both?
Thanks for your reply. I've been doing some tests but still without success. Sometimes the keyboard would appear slightly to one side at the bottom, at a position of roughly X = -0.3, Y = 0.5, Z = 0.01 (more or less), but I don't think that was because of what I was trying to do. What we do know is that in a Bounded Volume it does appear correctly.
Another solution that I know works is recentering the app with the Digital Crown button: after selecting an input field, that puts the keyboard in the center of the view. I also tried to simulate that recentering through code, but it didn't work for me, so at the moment our H4TCH app shows a small note telling the user to recenter the keyboard with the Digital Crown button.