Is it possible to mix the stereo passthrough camera feed in when running in VR/Fully Immersive App mode? I have scoured the documentation for this but can’t find any references on how to do it in Unity.
The concerts in AmazeVR do this and it’s a great effect.
Not at this time, no. It's something we're exploring internally, but currently the way we support fully immersive apps and the way we support mixed reality apps are separate.
Fully immersive apps render with Metal via Compositor Services, while mixed reality content (which shows the passthrough video) renders with RealityKit.
To be clear, it is not possible to combine passthrough video and fully immersive VR content at the same time. We are working on a way to switch between VR mode (Compositor Services) and MR mode (rendered with RealityKit) while the app is running. Even when this capability is available in Unity, passthrough video will not be available in VR mode with Metal rendering. This is a restriction imposed by the operating system: whenever a CompositorLayer is in use, passthrough is disabled (with the exception of hands/arms and other people "poking through" the Metal-rendered compositor layer).
You can see this restriction in effect if you build and run the Xcode visionOS App project template. When you switch the toggle to show the immersive space, passthrough is disabled.
I wrote in feedback to Apple as well, but what is the actual limitation? Showing passthrough is what we do on iOS, with all our Metal rendering goodness. Does it just work completely differently than the ARKit stuff on iOS?
You’ll have to ask Apple for a definitive answer there. I’ve been told that the decision was made to protect user privacy, but I don’t see any official documentation to that effect.
At this week's Apple WWDC it was announced that passthrough will be available via Metal rendering.
I am curious whether that will just work without any further effort on your end, or do we need to wait for you to update the PolySpatial packages before we can use it in our applications?
If so, a rough ETA would be awesome.
Hi, I am currently getting a clearly unintentional "passthrough" effect when I render a second camera on top of the normal "XR Origin" camera.
I am on visionOS 2 and Unity 2022.3.51f1, so no Unity 6 yet. The camera is a completely normal Unity camera game object that was used for another, unrelated non-XR feature and was left in the XR scene by chance. But instead of showing what it renders (which I can see in the editor in Play mode), that rectangular area shows me the real world when I build and deploy to the Vision Pro…
The camera object has all default settings (even the skybox is set to the default Unity one, and it is shown correctly in the main XR camera). The other settings are the usual: 60 FOV, culling mask set to default except for one custom layer removed (only because of the aforementioned feature; most of the scene is on the default layer anyway). The only non-default parameter is the priority, set to 2 so it renders on top of the main camera.
Also, the URP settings should be untouched from the defaults.
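For reference, the overlay camera setup described above can be sketched in code like this. The component and layer names here are illustrative, not taken from the actual project:

```csharp
using UnityEngine;

// Sketch of the accidental overlay camera described above (Unity 2022.3 + URP).
// "CustomLayer" is a placeholder for the one layer removed from the culling mask.
public class OverlayCameraSetup : MonoBehaviour
{
    void Start()
    {
        var overlay = GetComponent<Camera>();
        overlay.clearFlags = CameraClearFlags.Skybox;            // default Unity skybox
        overlay.fieldOfView = 60f;                               // default FOV
        overlay.cullingMask = ~LayerMask.GetMask("CustomLayer"); // everything except one custom layer
        overlay.depth = 2f; // URP exposes this as "Priority"; 2 draws on top of the XR camera
    }
}
```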
Now, I know this is not the most accurate description. But I imagine someone must already have run into this behavior, and since the passthrough data is apparently there even in fully immersive mode, I am wondering whether there is any new hope for passthrough in fully immersive mode, even with Unity 6…
This sounds reminiscent of the issue described in this thread, and if I had to guess, the issue is probably with the alpha value being written to the frame buffer for the overlay camera. What happens if you change the overlay’s background to a solid color with alpha = 255 or to Uninitialized?
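The suggestion above can be tried from script as well as in the Inspector. A minimal sketch, assuming the overlay camera is the object this component is attached to:

```csharp
using UnityEngine;

// Force the overlay camera to clear to a fully opaque solid color, so no
// zero-alpha pixels are written to the frame buffer (which visionOS may
// otherwise interpret as "show passthrough here").
public class OpaqueOverlayBackground : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 1f, 0f, 1f); // alpha = 1.0, i.e. 255
    }
}
```

In URP, the equivalent Inspector setting is the camera's Background Type (Solid Color or Uninitialized) under Environment.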
I skimmed through the thread you linked and the problems might be related (you can definitely tell better than I can), but I am not sure my problem has anything to do with shaders. My setup is literally two cameras (unless I am missing something in my scene).
Let's see if I can find the time to reproduce this in a simplified scene; in the meantime, I am attaching screenshots of what I see in my scene. I think the screenshot is taken from only one eye, because when I view it in the headset the overlaid camera appears "doubled" (like when you hold an object very close to your eyes), and each eye sees different camera footage, giving the 3D perception.
The green-ish border you see is the solid color I used. Keep in mind that I don’t have anything special set in the camera, you can check the last picture for how it looks in the editor.
Just to add some more info to my confused report: I went to Edit → Project Settings → XR Plug-in Management → Project Validation. It told me I was missing the depth-perception component on my camera. I added it, and now even the overlaid camera renders the scene just like the main XR camera (so there is no passthrough anymore). RIP my passthrough hopes, but at least now I am getting more expected results XD
I had passthrough working in Unity 2022.3.56f1 in VR – Fully Immersive mode simply by setting the camera's clear flags to Solid Color with color = (0, 0, 0, 0). That was with the built-in render pipeline, though. Now I've switched to URP to get better resolution, and it has broken the passthrough effect. I would be interested to know if anyone has figured it out.
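For anyone wanting to try the same trick, here is a minimal sketch of the built-in-pipeline setup described above. Whether this actually produces passthrough is version- and pipeline-dependent, as noted; it reportedly stops working under URP:

```csharp
using UnityEngine;

// Built-in render pipeline trick: clear the main camera to fully transparent
// black so the visionOS compositor can show passthrough wherever nothing
// opaque is drawn. Not an officially supported path.
public class TransparentClearPassthrough : MonoBehaviour
{
    void Start()
    {
        var cam = Camera.main;
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = new Color(0f, 0f, 0f, 0f); // RGBA = (0, 0, 0, 0)
    }
}
```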