Playing 360 videos with the VideoPlayer

Hi everyone!

I finally took a bit of time to produce a simple 360 VideoPlayer example project with instructions.

For the Impatient

In 5.6, the VideoPlayer doesn’t offer workflow helpers to streamline the use of 360 videos, so everything must be done by hand. There are a handful of manual steps, all explained below in enough detail for you to understand exactly what is happening and why.

What’s a 360 Video?

What we call a 360 video can take many forms. One of the most common forms is a movie with a 2:1 aspect ratio, often 3840x1920, that uses an equirectangular projection to unwrap a sphere into a rectangle, as described here: Equirectangular projection - Wikipedia

When viewed flat, without being mapped back onto a sphere, these videos look distorted, especially at the top and bottom, which correspond to the sphere’s poles.

Creating a Projection Screen Inside Unity

As mentioned above, in order for the 360 video to look right, it must be viewed when applied to the inside of a sphere. This can be done in Unity but typically poses two (solvable) problems:

  • One cannot directly apply a texture to the inside of a sphere
  • Even once that is overcome with some normal-inversion tricks, applying a texture inside a sphere leaves visible artifacts at the poles.

Both of these problems can be overcome with a relatively simple shader; see the details further down. So our projection “room” will be composed of a camera sitting at the center of a sphere that uses this shader.

Looking Around

Since our projection screen wraps the camera completely, the camera can only look at a small portion of its inside surface at any given time. The demo project includes a script that lets you control the camera orientation using the mouse.

Future VR integration might automate this so one doesn’t have to add these inspection tools to the scene per se. This is conceptually similar to the Scale control at the top of the Game View.
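
The demo’s MouseFollow.cs isn’t reproduced here, but a minimal mouse-look script along the same lines could look like this (the sensitivity value and clamping range are my own choices, not necessarily what the demo uses):

```csharp
using UnityEngine;

// Attach to the Main Camera: rotates it with the mouse.
public class SimpleMouseLook : MonoBehaviour
{
    public float sensitivity = 3f; // degrees per mouse-axis unit (arbitrary choice)

    float yaw;
    float pitch;

    void Update()
    {
        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;
        pitch = Mathf.Clamp(pitch, -89f, 89f); // avoid flipping over the poles

        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```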

Playing The Video

Once you have the camera (with its mouse control) and the sphere-screen prepared, playing back a video in this scene is just a matter of importing a 360 video into Unity and drag-and-dropping the resulting Video Clip onto the sphere.

This drag-and-drop is a workflow helper that will automate for you:

  • The creation of the Video Player
  • The initialization of the Video Player’s Render Mode to “Material”
  • The selection of the _MainTex texture parameter in the first material of the sphere’s renderer as the target for the video playback.
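
If you prefer to wire this up from script instead of via drag-and-drop, the same three steps can be sketched like this (a sketch against the 5.6 VideoPlayer API; the clip field is assumed to be assigned in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach to the sphere: replicates the drag-and-drop setup in code.
public class SetupSphereVideo : MonoBehaviour
{
    public VideoClip clip; // assign the imported 360 clip in the Inspector

    void Start()
    {
        // 1. Create the Video Player.
        var vp = gameObject.AddComponent<VideoPlayer>();

        // 2. Set the Render Mode to "Material".
        vp.renderMode = VideoRenderMode.MaterialOverride;

        // 3. Target the _MainTex parameter of the sphere's renderer.
        vp.targetMaterialRenderer = GetComponent<Renderer>();
        vp.targetMaterialProperty = "_MainTex";

        vp.clip = clip;
        vp.Play();
    }
}
```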

Future VR integration in the Video Player might augment the offered Render Mode options to recognize that the video is 360 footage and add options for the interaction with the camera such as projection type, field of view, etc. It might also free you from having to create a sphere and render directly into the scene.

Doing It From Scratch

The Unity package downloaded earlier has a working example of all the pieces assembled together. But if you want to redo each step for yourself to better understand how everything interacts, here is a step-by-step list of operations:

  • Create a new scene
  • Copy the 360 equirectangular shader (360Equirectangular.shader) into the Assets folder. Have a look inside to familiarize yourself with the simple math involved.
  • Create a new Material named InsideSphere.
  • Change the InsideSphere material’s shader to 360/Equirectangular so it uses the shader you imported earlier.
  • Create a Sphere using the GameObject->3D Object->Sphere menu.
  • In the Sphere’s Mesh Renderer, set the material to use the InsideSphere material created above.
  • Set the Main Camera Position to 0, 0, 0 so it sits in the center of the sphere.
  • Copy the MouseFollow.cs script into your Assets folder.
  • Drag-and-drop the MouseFollow script onto the Main Camera so the mouse movements will control the camera orientation.
  • Copy the 360_test_foggy_park_001.mp4 video into the Assets folder.
  • Drag and drop the Video Clip onto the Sphere in the Hierarchy view. This will add a VideoPlayer component to the Sphere with the clip already set as its source, and that targets the current game object’s renderer.

If the 360 video clip has audio in it and you want to hear it:

  • Set the VideoPlayer’s Audio Output Mode to Audio Source
  • Add an AudioSource component on the sphere
  • Drag and drop the Audio Source (by clicking on the Audio Source title label) into the Audio Source field in the VideoPlayer editor.
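
The same audio wiring can also be done from script; a minimal sketch:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attach to the sphere that already carries the VideoPlayer.
public class WireVideoAudio : MonoBehaviour
{
    void Start()
    {
        var vp = GetComponent<VideoPlayer>();
        var source = gameObject.AddComponent<AudioSource>();

        // Route the video's audio through the AudioSource.
        vp.audioOutputMode = VideoAudioOutputMode.AudioSource;
        vp.SetTargetAudioSource(0, source); // audio track 0 of the clip
    }
}
```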

The Equirectangular Shader

The shader used to texture the video onto the inside of the sphere (360Equirectangular.shader) performs a standard cartesian-to-spherical coordinate conversion.

The fragment’s normal is used, instead of its coordinates, to produce texture coordinates, which is exactly the inverse of the operation that was done when transferring the pixels from the 360 camera into a rectangular frame. The calculation is simplified by normalizing the sphere radius to 1, since ultimately we only want the inclination and azimuth angles so they can be converted back to [0, 1] values for texture sampling. This saves a few divisions in the calculation.
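
The core of that conversion can be sketched in a few lines of shader code (this mirrors the approach used by Unity’s panoramic skybox shader; the exact function in 360Equirectangular.shader may differ):

```hlsl
inline float2 ToRadialCoords(float3 coords)
{
    float3 n = normalize(coords);        // sphere radius normalized to 1
    float latitude = acos(n.y);          // inclination, in [0, pi]
    float longitude = atan2(n.z, n.x);   // azimuth, in [-pi, pi]
    // Remap both angles to [0, 1] texture coordinates.
    float2 sphereCoords = float2(longitude, latitude) * float2(0.5 / UNITY_PI, 1.0 / UNITY_PI);
    return float2(0.5, 1.0) - sphereCoords;
}
```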

The trick that performs the normal inversion hinted at earlier in this post is taken care of by the Cull Front directive at the top of the shader pass. This is one way to have the objects rendered “inside out” as described here: Unity - Manual: ShaderLab: commands

Final Note

I’ll probably convert this into a blog post when Unity 5.6 is officially released. Feel free to make suggestions on how this could be improved to be clearer or more useful.

Dominique
A/V developer at Unity


I ran the demo you set up.
The shader seems to be problematic: it shows the video as one big green or light brown sphere.
Whenever the material’s shader is set to Standard, the video’s texture shows up (though of course not mapped correctly).

What platform did you try this on? I’ve done this on OSX in case it helps. I’ll try it on Windows when I have time.

Hi again,

I updated the content. On Windows, the project’s Graphics API needed to be set to DX9, and I updated the bundled movie; the original one didn’t play on my Windows machine, and we’ll figure out why later on.

Sorry for the trouble!

Dominique

No worries!
Happy to report the issues, I was indeed using a Windows 10 machine.
Got my fingers crossed for the compatibility and stability of the video player on all platforms.

Thanks for implementing the new player. I’ve been testing the 360 video playback with my own shader and code and it’s working well. I’d like to stream the content from a URL for one of my projects and I have some questions.

  1. Will there be any way to retrieve the video metadata (length, height, width, etc.) when streaming from a URL?
  2. When seeking while streaming, does the player need to download the video before it can seek forward? If the video is an hour long, will the player need to buffer an hour of video before it can seek that far ahead?

Many Thanks,
Morfaine

Hi Morfaine!

You have two good questions and I think I can help with both.

When streaming from a URL, the video information becomes ready when the preparation is completed. The VideoPlayer has API entry points exactly for the purpose of waiting for the information to become ready. Of course, this information is ready immediately when what you are playing is a VideoClip, but not for a URL as you have found out. Here’s an example of what you could do:

using UnityEngine;
using UnityEngine.Video;

public class WaitForInfo : MonoBehaviour {

    void Start()
    {
        var vp = GetComponent<VideoPlayer>();
        vp.prepareCompleted += Prepared;
        vp.Prepare();
    }

    void Prepared(VideoPlayer vp)
    {
        // Once prepared, the texture and frame information are available.
        var width = vp.texture.width;
        var height = vp.texture.height;
        var duration = vp.frameCount / vp.frameRate;

        // Fake entry point...
        UpdateTheUserInterfaceWithThePlayerInfo(width, height, duration);

        // Play immediately if this is what you want.
        vp.Play();
    }
}

Note that for now there is no notification about buffering, so while you wait for the information to become available (which could take arbitrarily long for a URL), all you can do is implement some form of spinning cursor with a timeout to avoid waiting forever. We’ll improve this in a subsequent release.
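
One way to sketch such a timeout, assuming hypothetical ShowSpinner/HideSpinner calls on the UI side and an arbitrary 30-second limit:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Video;

public class PrepareWithTimeout : MonoBehaviour
{
    public float timeoutSeconds = 30f; // arbitrary limit

    IEnumerator Start()
    {
        var vp = GetComponent<VideoPlayer>();
        vp.Prepare();
        // ShowSpinner(); // hypothetical UI call

        // Poll isPrepared each frame until done or timed out.
        float elapsed = 0f;
        while (!vp.isPrepared && elapsed < timeoutSeconds)
        {
            elapsed += Time.deltaTime;
            yield return null;
        }

        // HideSpinner(); // hypothetical UI call
        if (vp.isPrepared)
            vp.Play();
        else
            Debug.LogWarning("Video preparation timed out.");
    }
}
```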

As for seeking while streaming: this part of the implementation is handled by the native libraries we use, so we have no control over it. Some of the implementations we have seen use mechanisms such as HTTP range requests to fetch arbitrary parts of the file without downloading it completely. It also depends on how the server supplying the content is implemented, so you will have to experiment to find out. We will probably gain more information about this over time and be able to share it in per-platform notes at some point.

One thing I can tell you, however, is that our software VP8 implementation will download the whole file before playing back. This implementation is used on all platforms except Android. We’ll of course improve this later on, but at least here you know exactly what to expect!

Hope this helps,

Dominique
A/V developer at Unity

Many thanks for the reply Dominique.

I was expecting the VideoPlayer.clip variable in the VideoPlayer instance to be updated to a new VideoClip instance containing all the metadata when the isPrepared flag was true. I see from your example code that I need to query the VideoPlayer’s texture and frame variables instead. Thanks for the heads up.

Thanks for the info. Will you be implementing HLS or DASH support in the new VideoPlayer? That would help with the seek features I’m interested in.

Can you include a video with sound? I would like to see a 360 video working with spatial audio correctly set up please

Very good point Matt. This video is a placeholder for actual original content being produced; I’ll make sure the next version has a usable audio track in it.

Dominique


How would you go about doing this for stereo content in VR where one part of the video needs to go to one eye specifically and another part of the video needs to go to the other eye specifically?

I am having trouble figuring out how to essentially cut the frame into two sections that are then rendered on the proper eye cameras.

Good question, and good idea for a follow-up demo scene.

I haven’t tried this myself, but this blog post explains how to do it with the MovieTexture, essentially using the tiling features of texture parameter inputs: http://bernieroehl.com/360stereoinunity/

So here, you would

  1. create one sphere per eye
  2. have the VideoPlayer produce its output into a RenderTexture (it has a render mode for this).
  3. use the 360 Equirectangular shader described in my demo on each sphere
  4. use the RenderTexture as input to both spheres, with each sphere’s texture parameter tiling controls set as described in the blog post I am pointing to.
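
A rough sketch of step 4, assuming a top-bottom stereo layout (left eye in the top half of the frame); the half-frame split values are my assumption, not a requirement of any particular footage:

```csharp
using UnityEngine;

// Assign the two spheres and the VideoPlayer's target RenderTexture in the Inspector.
public class StereoSphereSetup : MonoBehaviour
{
    public Renderer leftEyeSphere;     // sphere seen only by the left-eye camera
    public Renderer rightEyeSphere;    // sphere seen only by the right-eye camera
    public RenderTexture videoTexture; // target of the VideoPlayer's Render Texture mode

    void Start()
    {
        // Both spheres sample the same RenderTexture...
        leftEyeSphere.material.mainTexture = videoTexture;
        rightEyeSphere.material.mainTexture = videoTexture;

        // ...but tiling/offset restricts each one to half of the frame.
        leftEyeSphere.material.mainTextureScale = new Vector2(1f, 0.5f);
        leftEyeSphere.material.mainTextureOffset = new Vector2(0f, 0.5f); // top half
        rightEyeSphere.material.mainTextureScale = new Vector2(1f, 0.5f);
        rightEyeSphere.material.mainTextureOffset = new Vector2(0f, 0f);  // bottom half
    }
}
```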

Post your results here if you have time to try this out!

Dominique

I’m getting this error when importing the mp4:

I couldn’t find any 360-specific codecs, so I downloaded the VLC 360 player and updated my video codecs with K-Lite, but I still have the error. It seems to be related to the audio track only. I can play the video with Windows Media Player, VLC, etc., but there is no audio.

  • Windows 7 64bit
  • Unity 5.6.0f1

Any ideas?

I’m going through the Equirectangular Shader, but here’s the problem I’m seeing right now.

  • The airshow.mp4 gave me encoding errors, see previous post.

  • I rendered a 360 video with VR Panorama 360 PRO Renderer (https://www.assetstore.unity3d.com/en/#!/content/35102) and imported the MP4 into Unity, I get an encoding error about video track and 1cva.

  • I uploaded the video to youtube (
    https://www.youtube.com/watch?v=e9H8Xqbsl1I
    ) and then I used Youtube’s option to “Download MP4”. This file works in Unity (no errors) but the scene does not look correct (upside down, etc.) and rotation does not work correctly:

  • I downloaded the “Surfers Video” from VLC: Index of /~jb/Builds/360/

  • This video gave me no errors either, and the audio even plays in Unity, but I have the same upside-down and no-rotation problem:

Thanks for any help!

I’ve got it working using basically an Unlit > Texture shader and everything works great in the editor:

The only problem now is that when I export to Android the “Video Player” only shows up as black, so I don’t see the video.
Again, all looks and works great in Editor, but on Android it will only show a black texture. Thanks!

I’ve got this to work on Windows using DX11 by changing the shader’s frag function. The argument should be v2f instead of float3. It still works on OSX as well.

fixed4 frag (v2f i) : SV_Target
{
    float2 equirectangularUV = ToRadialCoords(i.normal);
    return tex2D(_MainTex, equirectangularUV);
}

Does the new Video Player also support HLS or DASH streams for 360?

Selzier, if you check the properties of the file Surfers_360.mp4, you’ll see ‘width of frame=2048, height=1024’.
But if you check the properties of airshow.mp4, you see ‘width of frame=, height=’. This metadata is absent from the file, and that is why Unity can play some files and not others.

I tried to use the video given in the sample code (airshow.mp4), but it simply doesn’t work; something is wrong with the file.
I tried using a video I downloaded from elsewhere, but for some reason it shows upside down in Unity, and when I build for Android it doesn’t show at all.

I have managed to get it to work using 5.5 with a simple sphere and texture but not here.

Guys… I’m using Unity 5.6.0b9, and all I want to know is whether the new video player in Unity 5.6 supports Android or not. I’ve tried it several times, but the Unity documentation says it currently only supports Standalone and WebGL, and Unity has recognized this as a bug: Unity Issue Tracker - [Android] VideoPlayer not playing video

So, to confirm: does Unity 5.6.0b9 support video playback on Android?

Regards
sai.