Skipping to video frame not working on Android

I’ve implemented a video recorder and want to save a preview image. For this, I’d like to take the middle frame of the video. This works perfectly on macOS and iOS, but not on Android.

Here’s my code:

void ExtractVideoPreviewTexture() {
    videoPlayer.sendFrameReadyEvents = true;
    videoPlayer.frameReady += OnVideoPreviewFrameReady;
    videoPlayer.Pause();
}

void OnVideoPreviewFrameReady(VideoPlayer source, long frameIdx) {
    if (frameIdx == 0) {
        // Skip to middle to prevent initial artifacts in preview
        var middleFrame = (int) (source.frameCount / 2);
        if (middleFrame > 0) {
            Debug.Log($"Skipping to middle frame {middleFrame}");
            videoPlayer.frame = middleFrame;
            // videoPlayer.Pause(); Adding this won't work either
            return;
        }
    }
    videoPlayer.frameReady -= OnVideoPreviewFrameReady;
    // ... copy texture ...
    videoPlayer.frame = 0;
}

The problem is that the “frameReady” event fires only for the initial frame 0, never for the middle frame.

(I omitted the prepare step from the code; it happens earlier. The player is configured not to auto-play, and the render mode is set to API Only.)

Shameless bump.

Hi!

This certainly looks like it should work, and I’m not seeing anything like this in currently known issues. It’d be interesting if you could submit a bug so we can see what’s going on.

In the meantime, instead of relying on Pause(), maybe you can seek to the midpoint and then (when the seekCompleted event is fired) invoke Play(). This will get you into the “normal” code path for frameReady events, and you can call Stop() as soon as you’ve received the one frame you want.
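Roughly, the sequence I mean would look like this (just a sketch, untested; the handler names are made up):

```csharp
// Seek first, Play() once the seek completes, Stop() after the first frame.
void ExtractPreview(VideoPlayer player) {
    player.sendFrameReadyEvents = true;
    player.seekCompleted += OnSeek;
    player.frame = (long)(player.frameCount / 2);
}

void OnSeek(VideoPlayer player) {
    player.seekCompleted -= OnSeek;
    player.frameReady += OnFrame;
    player.Play(); // enters the normal frameReady code path
}

void OnFrame(VideoPlayer player, long frameIdx) {
    player.frameReady -= OnFrame;
    player.Stop(); // only this one frame was needed
    // ... copy player.texture here ...
}
```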

But this is only a workaround; hope this helps nevertheless,

Dominique Leroux
A/V developer at Unity

Thanks for your feedback, will try it out!

@DominiqueLrx I tried your approach, but the result is not really satisfying. This VideoPlayer API is pretty weird and full of unexpected pitfalls.

Here’s the task again I want to accomplish:

  • Once the video is prepared, seek to its middle
  • Then, extract the frame and copy it to a texture (to be saved as a video preview)

Seems simple, but isn’t.

If I just seek to the frame and then try to grab the texture when seekCompleted fires, the frame isn’t available yet. So I learned that I have to wait for frameReady, too. I first tried Pause() for this, but it behaved differently across my devices: on my Xiaomi Mi 9 it worked, but on a Samsung S10 the frameReady event fired twice for the same frame (for whatever reason). This is particularly interesting because my frameReady handler removes its event subscription immediately, so to my understanding a second (unwanted) event should not be possible.

So I followed your suggestion more closely and used Play() instead of Pause(). This kinda works, but the problem is that Play() takes over 2 seconds to actually fire the frameReady event, which leads to an unacceptable delay in our UI. Pause() fired nearly instantly but, as I said above, was unreliable.

Is there a reason why Play() takes that long, even though the seek has already completed? The seek itself finishes in 0.2 s, but the Play() call takes 2.5 s until frameReady fires.

As an alternative: is there any other way to wait for frameReady without actually playing the video back? Something like “seeking to the frame while the player stays paused”? That is what I actually need.

(Note that I also muted the audio track before playing, to avoid unwanted noise. I haven’t benchmarked the effect of this.)

Yes, providing a unified facade over all these different platforms’ and devices’ own personalities is a constant battle…

Totally, and I invite you again to submit a bug with the method you are using so we can replicate the problem and fix it. You shouldn’t have to pile up all these workarounds like you sadly have to here.

This is expected behaviour. We distinguish between the movie having been decoded up to the wanted point, and the actual pixels having reached the texture.

Agreed. Please make this part of your bug report if possible (or file a separate bug, if you prefer). But apart from being weird, I don’t understand what makes this unreliable. Is it because the pixels were not present in the first call to frameReady? Can you share what parameters are received in the two separate invocations of frameReady? (E.g., if the frame index is invalid in the first invocation, maybe you can use this to detect that more frameReady events are expected for this frame.) Not saying this is expected at all: just trying to see if we can find a short-term workaround.
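To capture what the two invocations receive, a logging sketch like this might help (hypothetical handler; log whatever else seems relevant):

```csharp
void OnVideoPreviewFrameReady(VideoPlayer source, long frameIdx) {
    // Log everything we get so the two invocations can be compared
    Debug.Log($"frameReady: frameIdx={frameIdx}, " +
              $"player.frame={source.frame}, " +
              $"texture={(source.texture != null ? "valid" : "null")}");
}
```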

Here, I’d really have to take a look at your logic. It could be that the VideoPlayer starts buffering from 0, then sees the seek and re-buffers from the new seek point. It could also be that there are very few keyframes in the video and that seeking ahead like you do forces complete decoding in the background. But if that were the case, the time between Pause() and seekCompleted would also be long.

First of all, thanks for your detailed explanations, highly appreciated!

The problem here is that I use the event to trigger follow-up events in my own code. They do the actual processing of the image and hand it over to the next processing steps in my app, which eventually uploads the image to an S3 backend. After all of these steps are done, the video player and all intermediate content used in the processing steps (textures etc.) are destroyed to save resources.

If the event fires twice, the second run goes wild, because the first run has already cleaned everything up. Hence, the only solution would be a weird hack that checks whether the handler has already been called and ignores the second run, hoping that the wanted data is already available in the first invocation.

Here’s my full code:

    public void PrepareVideoForPlayback() {
        Benchmark("Prepare Video Player for " + media.path);
        videoPlayer.url = media.path;
        videoPlayer.source = VideoSource.Url;
        videoPlayer.renderMode = VideoRenderMode.APIOnly;
        videoPlayer.prepareCompleted += OnVideoPrepareCompleted;
        videoPlayer.Prepare();
    }

    void OnVideoPrepareCompleted(VideoPlayer source) {
        Benchmark("Video Player prepared");
        videoPlayer.prepareCompleted -= OnVideoPrepareCompleted;
        videoPlayer.sendFrameReadyEvents = true;
        videoPlayer.seekCompleted += OnVideoSeekCompleted;
        var middleFrame = (int) (videoPlayer.frameCount / 2);
        if (middleFrame > 0) videoPlayer.frame = middleFrame;
    }

    void OnVideoSeekCompleted(VideoPlayer source) {
        Benchmark("Video seek completed");
        videoPlayer.seekCompleted -= OnVideoSeekCompleted;
        videoPlayer.frameReady += OnVideoPreviewFrameReady;
        videoPlayer.SetDirectAudioMute(trackIndex: 0, mute: true);
        videoPlayer.Play();
    }

    void OnVideoPreviewFrameReady(VideoPlayer source, long frameIdx) {
        Benchmark($"Frame {frameIdx} ready");
        videoPlayer.frameReady -= OnVideoPreviewFrameReady;
        videoPlayer.Pause();
        ExtractVideoPreview(source);
        videoPlayer.frame = 0;
        videoPlayer.SetDirectAudioMute(trackIndex: 0, mute: false);
        OnVideoPreviewTextureExtracted();
    }

    void ExtractVideoPreview(VideoPlayer source) {
        Benchmark("Extracting preview texture");
        RenderTexture renderTexture = source.texture as RenderTexture;
        media.texture = new Texture2D(renderTexture.width, renderTexture.height);
        RenderTexture.active = renderTexture;
        media.texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        media.texture.Apply();
        RenderTexture.active = null;
    }

This sequence was the only one that worked on all devices (and I tried a lot of combinations), but it leads to the unwanted 2 s delay before frameReady actually fires.

I am also having an issue on Android where the playhead is not updating; it works fine in the editor. I’m not using the API route. All I’m doing is setting the time and calling Pause() in Update(), then letting the VideoPlayer render to the render texture. Works fine in the editor, but on Android it never updates the render texture.