Stereo video player from side by side source in visionOS

I have a side by side video source coming from a remote stream. Is there a way that I can programmatically split that video stream and display each side on its corresponding eye on Polyspatial?

Yes, though you will have to use the standard Unity VideoPlayer, rather than the PolySpatial-specific VisionOSVideoComponent. The latter doesn’t support using custom (shader graph) materials, due to limitations in RealityKit.

If you use a VideoPlayer that renders to a RenderTexture, however, you can feed that RenderTexture into a shader graph and use the Eye Index node to offset the UV coordinates, presenting a different half of the frame to each eye.

That said, I believe the VisionOSVideoComponent also supports some standard stereo formats directly; you can refer to these previous threads for more information.


Hi, a little off topic, but may I ask what kinds of video sources are supported on Vision Pro?

If you’re using VideoPlayer to render to a RenderTexture, see this page in the Unity documentation. Whatever it says for iOS is likely to apply to visionOS as well.

If you’re using VisionOSVideoComponent, it supports whatever AVPlayer supports; I don’t have a definitive list, though there’s an old one for macOS here. Most people seem to use QuickTime (“.mov”) with HEVC.