Playing a video on a holographic "window" on the HoloLens

After having the usual search around here and the rest of the internet, I couldn’t find anything that would solve my problem, so I thought I’d try posting here.

Here’s my problem: I am learning how to make a holographic application for the Microsoft HoloLens, and I want a video to play on a “window” (currently modelled with Unity UI elements in a Canvas > Panel > RawImage hierarchy) when the application is launched. The window shouldn’t follow the user’s gaze, so the user can still look around and away from it if they wish.
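
(For context, the kind of placement I mean is a world-space canvas parked a couple of metres in front of the camera at start-up and then left alone. Below is a simplified sketch, not my exact setup; the class name, field names and distance are just illustrative.)

using UnityEngine;

// Simplified sketch: put a world-space canvas in front of the camera once,
// then leave it alone so the user can look around and away from it.
public class PlaceWindow : MonoBehaviour
{
    [Tooltip("World-space canvas acting as the video window")]
    public Canvas windowCanvas;

    [Tooltip("Distance from the camera at start-up, in metres")]
    public float distance = 2f;

    void Start()
    {
        // World Space is what keeps the UI from being glued to the camera.
        windowCanvas.renderMode = RenderMode.WorldSpace;

        Transform cam = Camera.main.transform;
        windowCanvas.transform.position = cam.position + cam.forward * distance;
        windowCanvas.transform.rotation = Quaternion.LookRotation(cam.forward);
    }
}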

Right now I can’t quite get that to work. I can either have the video playing full screen with no audio, by using Unity’s VideoPlayer and changing the Render Mode to “Camera Near Plane”, or have the audio playing with a script adapted from a Stack Overflow post I found (see below). As far as I can tell, when the scripting approach is used the video is playing (hence the audio), but I just can’t get it to display on the canvas/raw image. Here’s the adapted script:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class PlayVideo : MonoBehaviour
{

    [Tooltip("Raw image to show the video images")]
    public RawImage image;

    [Tooltip("Video to play")]
    public VideoClip videoToPlay;

    private VideoPlayer videoPlayer;
    private VideoSource videoSource;

    //Audio
    private AudioSource audioSource;

    // Use this for initialization
    void Start()
    {
        Application.runInBackground = true;
        SetUpVideo();
    }

    void OnPrepareCompleted(VideoPlayer videoPlayer)
    {
        videoPlayer.Play();
        audioSource.Play();
    }

    private void SetUpVideo()
    {
        //Add VideoPlayer to the GameObject
        videoPlayer = gameObject.AddComponent<VideoPlayer>();

        //Add AudioSource
        audioSource = gameObject.AddComponent<AudioSource>();

        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        //We want to play from video clip not from url
        videoPlayer.source = VideoSource.VideoClip;

        //Set video To Play then prepare Audio to prevent Buffering
        videoPlayer.clip = videoToPlay;

        //Set Audio Output to AudioSource
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;

        //Assign the Audio from Video to AudioSource to be played
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        image.texture = videoPlayer.texture;

        //Add callbacks for when video is prepared
        videoPlayer.prepareCompleted += OnPrepareCompleted;

        videoPlayer.Prepare();
    }
}

Any pointers or help would be appreciated! Thanks!

You need to choose “API Only” for the video render mode, not “Camera Near Plane”.
Also, assign your Audio Source to the relevant field so Unity can initialize it at startup.
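
In script form that’s roughly the following (an untested sketch adapted from your SetUpVideo()/OnPrepareCompleted(); the class name is just for the example). With API Only the player decodes into its internal texture, and that texture only exists once the video has prepared, so that’s where I’d hand it to the RawImage:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class PlayVideoOnRawImage : MonoBehaviour
{
    public RawImage image;
    public VideoClip videoToPlay;

    private VideoPlayer videoPlayer;
    private AudioSource audioSource;

    void Start()
    {
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        audioSource = gameObject.AddComponent<AudioSource>();

        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        // Render into the player's internal texture instead of onto a camera plane.
        videoPlayer.renderMode = VideoRenderMode.APIOnly;

        videoPlayer.source = VideoSource.VideoClip;
        videoPlayer.clip = videoToPlay;

        // Route the video's audio track through the AudioSource.
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        videoPlayer.prepareCompleted += OnPrepareCompleted;
        videoPlayer.Prepare();
    }

    void OnPrepareCompleted(VideoPlayer vp)
    {
        // The texture is only created once the player has prepared,
        // so this is the earliest point it can be given to the RawImage.
        image.texture = vp.texture;
        vp.Play();
        audioSource.Play();
    }
}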

Setting the video render mode to API Only seems to have done the trick; I later found out that the texture isn’t available through videoPlayer.texture in the other render modes… silly me! Also, just a note for anyone reading this in the future: the audio source is already initialised and wired up in SetUpVideo() (via SetTargetAudioSource), since the audio comes from the video clip that is assigned in the editor through the videoToPlay public field.

Thanks for your help!

Glad you got it working. I’m using an almost identical method for my game, so I’ve had some time to experiment :slight_smile: