Show Passthrough camera feed in Virtual Reality (Fully Immersive) App

Is it possible to mix the stereo passthrough camera feed in when running in VR/Fully Immersive App mode? I have scoured the documentation for this but can’t find any references on how to do it in Unity.

The concerts in AmazeVR do this and it’s a great effect.


Not at this time, no. It's something we're exploring internally, but currently the way we support fully immersive apps and the way we support mixed reality apps are separate.

Fully immersive apps use Metal with CompositorServices to render, while mixed reality content (which shows the passthrough video) uses RealityKit to render.
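For anyone wondering what those two paths look like in code, here's a rough sketch of the split (the IDs and closures are illustrative; `CompositorLayer`, `RealityView`, and `ImmersiveSpace` are the actual visionOS APIs, but the configuration details are an assumption on my part):

```swift
import SwiftUI
import CompositorServices
import RealityKit

@main
struct ExampleApp: App {
    var body: some Scene {
        // Fully immersive path: Metal rendering via CompositorServices.
        // While this layer is active, the system disables passthrough.
        ImmersiveSpace(id: "VRSpace") {
            CompositorLayer { layerRenderer in
                // Hand the LayerRenderer off to your Metal render loop here.
            }
        }

        // Mixed reality path: RealityKit rendering, passthrough visible.
        ImmersiveSpace(id: "MRSpace") {
            RealityView { content in
                // Add RealityKit entities to the scene here.
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}
```

The two spaces can't be blended: the Metal one replaces the camera feed entirely, which is the restriction being discussed in this thread.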


Thanks for the quick response @DanMillerU3D


I highly recommend submitting feedback to Apple using Feedback Assistant mentioning that you want access to the camera texture like on iOS.


To be clear, it is not possible to combine passthrough video and fully immersive VR content at the same time. We are working on a way to switch between VR mode (CompositorServices) and MR mode (rendered with RealityKit) while the app is running. Even when this capability is available in Unity, passthrough video will not be available in VR mode with Metal rendering. This is a restriction imposed by the operating system. Whenever a CompositorLayer is in use, passthrough is disabled (with the exception of hands/arms and other people "poking through" the Metal-rendered compositor layer).

You can see this restriction in effect if you build and run the Xcode visionOS App project template. When you switch the toggle to show the immersive space, passthrough is disabled.
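If you want to reproduce that toggle behavior outside the template, it boils down to something like the sketch below (view name and space ID are placeholders; `openImmersiveSpace`/`dismissImmersiveSpace` are the real SwiftUI environment actions):

```swift
import SwiftUI

struct ImmersiveToggleView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace
    @State private var showImmersive = false

    var body: some View {
        Toggle("Show Immersive Space", isOn: $showImmersive)
            .onChange(of: showImmersive) { _, isOn in
                Task {
                    if isOn {
                        // While a fully immersive space is open, the system
                        // hides the passthrough camera feed automatically.
                        await openImmersiveSpace(id: "ImmersiveSpace")
                    } else {
                        await dismissImmersiveSpace()
                    }
                }
            }
    }
}
```

Nothing in app code re-enables passthrough here; the OS makes that decision based on the immersion style of the open space.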

I wrote feedback to Apple as well, but what is the actual limitation? Showing passthrough is what we do on iOS, with all our Metal rendering goodness. Does it just work completely differently than the ARKit stuff on iOS?

You’ll have to ask Apple for a definitive answer there. I’ve been told that the decision was made to protect user privacy, but I don’t see any official documentation to that effect.