PolySpatial Video Component HLS Support?

The PolySpatial Video Component uses AVPlayer, and AVPlayer supports HLS streaming playback. Will the PolySpatial Video Component also be able to perform HLS streaming playback?


Hi,

Currently, there are no plans to support HLS streaming playback via the PolySpatialVideoPlayer component. The PolySpatialVideoPlayer should eventually be deprecated in favor of the normal Unity VideoPlayer component, hopefully with broader coverage of what the normal VideoPlayer already supports, but it will still be limited to existing VideoPlayer features.

It’s very, very unfortunate that PolySpatialVideoComponent is being deprecated.

PolySpatialVideoComponent was able to play MV-HEVC (Spatial Video) stereoscopically.
Additionally, Apple announced stereoscopic HLS distribution of MV-HEVC at WWDC.

Will Unity VideoPlayer support MV-HEVC (Spatial Video) and even stereoscopic rendering in the future?
If there are no plans, we strongly request that you continue to support PolySpatialVideoComponent.

Hi,

When the swap happens, the backend should be kept intact - so it’ll continue to use AVPlayer as the driving force on the RealityKit platform. The goal here is to maintain existing functionality and make it easier to add additional video player functionality (like RenderTexture output/render mode) on PolySpatial, so the ability to play MV-HEVC should carry over when PolySpatialVideoPlayer is deprecated.

That being said, those additional video player features will be limited to what the normal Unity VideoPlayer already supports - so there are no plans to add HLS support in PolySpatial, unfortunately.

Sorry for the confusion - hopefully that clarifies things!

Hi,

Sorry to resurface an old topic, but just wanted to update you on this situation - PolySpatial supports both PolySpatialVideoComponent (now renamed to VisionOSVideoComponent) and Unity’s normal VideoPlayer component. Each has different advantages and disadvantages, as described in the PolySpatial VideoPlayer documentation. Notably, the VisionOSVideoComponent still works by piloting an AVPlayer on the backend, so its functionality should not have changed.

I am confused about the naming - does this mean that MV-HEVC is supported in PolySpatial?

Can PolySpatialVideoComponent support streaming or downloading the video from a URL? Or does it have to be in the StreamingAssets folder already?

VisionOSVideoComponent (PolySpatialVideoComponent) does support MV-HEVC. There are plans to make a demo to showcase this, but at any rate there should be more concrete announcements about it soon.

Streaming from a URL is not supported with VisionOSVideoComponent, though; the video clip still needs to be in the StreamingAssets folder.


Hmm, just tried mp4 and mov files, and for some reason they are not being imported properly when they are in the StreamingAssets/PolySpatialVideoClips folder …

They import properly in another folder though?

You’ll need to import the video clips into a non-StreamingAssets folder so you can access the video clip reference in the editor, and then copy and paste that video clip into StreamingAssets/VisionOSVideoClips so it can be accessed in the build. Please refer to the PolySpatial video player docs for more info.

Ah thanks for updating the docs

A limitation of the current system is that the clip must be manually copied into a ../StreamingAssets/VisionOSVideoClips/ folder to display and render properly on visionOS. This folder must be relative from the project folder - for example, the full folder path may be Assets/StreamingAssets/VisionOSVideoClips/ . Create this folder if it does not exist. Ensure that the clip is not just moved into this folder, but copied into it, so that there are two instances of it - one referenced by the VisionOSVideoComponent and one under the StreamingAssets folder. Refer to Special folder names and Streaming Assets for more information about the Unity StreamingAssets folder.
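
For anyone who’d rather automate that copy step, here’s a rough editor-script sketch. The menu item name is made up, the folder path is the one from the docs quoted above, and the plain file copy leaves the original imported clip (and its reference on the VisionOSVideoComponent) untouched - treat it as a starting point, not an official workflow:

```csharp
#if UNITY_EDITOR
using System.IO;
using UnityEditor;
using UnityEngine;
using UnityEngine.Video;

public static class CopyClipToStreamingAssets
{
    // Hypothetical menu item: select an imported VideoClip in the Project window,
    // then run this to copy the underlying file into Assets/StreamingAssets/VisionOSVideoClips/.
    [MenuItem("Assets/Copy Video Clip To VisionOSVideoClips")]
    static void CopySelectedClip()
    {
        var clip = Selection.activeObject as VideoClip;
        if (clip == null)
        {
            Debug.LogWarning("Select an imported VideoClip first.");
            return;
        }

        // Project-relative path of the imported clip (e.g. Assets/Videos/MyClip.mp4).
        var sourcePath = AssetDatabase.GetAssetPath(clip);

        // Folder name taken from the PolySpatial docs; adjust if your setup differs.
        var targetFolder = "Assets/StreamingAssets/VisionOSVideoClips";
        Directory.CreateDirectory(targetFolder);

        // Plain file copy: the original asset keeps its reference on the
        // VisionOSVideoComponent, and this copy ships inside the build.
        var targetPath = targetFolder + "/" + Path.GetFileName(sourcePath);
        File.Copy(sourcePath, targetPath, overwrite: true);
        AssetDatabase.Refresh();

        Debug.Log($"Copied {sourcePath} to {targetPath}");
    }
}
#endif
```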

Is the video supposed to display as a spatial video (MV-HEVC) or simply a 2D video?

  1. On device it’s just a black windowed screen (even though it is on a GameObject’s render material in the scene)
  2. In the Unity Editor Game view it plays as a 2D video on the GameObject’s material

Are additional settings needed for the MV video to play on device?

In the Editor and in the simulator, it’ll be 2D. MV-HEVC videos won’t appear spatial until viewed on device. A black windowed screen usually means that VisionOSVideoComponent couldn’t find the video clip or that the clip isn’t set to PlayOnAwake - are there any error messages? Try a non-MV-HEVC clip as well, just to rule out any problems with the specific clip you are trying to use.

Is there a way to use multiple clips? It seems like PlayOnAwake being a requirement prevents that from working.

Or, since PlayOnAwake is required for all clips, should these GameObjects all be instantiated very far away so they aren’t rendered or picked up by the AudioListener?

Also, does it only work in unbounded scenes, or are bounded volumes supported as well?

What do you mean by multiple clips? If you want to switch clips, then you can just assign a new clip to VisionOSVideoComponent.

PlayOnAwake just ensures that the clip plays as soon as the object is enabled and active, the equivalent of calling Play() on the video component at start of runtime.

You can set PlayOnAwake to false if you want finer control over playback, and call Play(), Stop(), and Pause() as needed (see the sketch at the end of this post).

VisionOSVideoComponent should work for both unbounded and bounded volumes.
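
Here’s a minimal sketch of what that looks like in a script. The Unity.PolySpatial namespace and the clip property name are assumptions on my part (the thread only mentions assigning a clip and calling Play()/Stop()/Pause()), so check them against the API in your package version:

```csharp
using Unity.PolySpatial;     // assumed namespace for VisionOSVideoComponent
using UnityEngine;
using UnityEngine.Video;

public class ClipSwitcher : MonoBehaviour
{
    // Both clips must also be copied into Assets/StreamingAssets/VisionOSVideoClips/,
    // as described earlier in the thread.
    [SerializeField] VideoClip introClip;
    [SerializeField] VideoClip mainClip;

    VisionOSVideoComponent video;

    void Start()
    {
        video = GetComponent<VisionOSVideoComponent>();

        // With PlayOnAwake disabled in the Inspector, playback is driven manually.
        video.clip = introClip;   // 'clip' property name is an assumption
        video.Play();
    }

    // Call this (e.g. from a UI button) to swap to the second clip.
    public void PlayMainClip()
    {
        video.Stop();
        video.clip = mainClip;
        video.Play();
    }
}
```

So if you don’t want anything playing at startup, there’s no need to park the objects far away - just leave PlayOnAwake off and call Play() when you actually need the clip.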

Is there a way to make the MV-HEVC video immersive using the VisionOSVideoComponent - similar to the option in the iOS Photos app?

Also, is the 3D on MV-HEVC playback on PolySpatial for Vision Pro only supported from the center? If so, is it possible to simply have it play back as an overlay?

Any news on this? Did you manage to do it?