I’m using the native video player in Unity 2018, using a Render texture created at runtime and a raw image.
Everything is fine in the Editor and in any standalone build. However, in a WebGL build the video renders black, although the audio is audible. I recently migrated from Unity 5.6 to Unity 2018; it was fine in the previous version.
Has anyone had the same problem, or is there anything new in Unity that has changed the way videos work in WebGL?
Thanks in advance
I resolved my problem. In my case, it was caused by calling the VideoPlayer’s API before the GameObject I placed it on was activated. This seemed to work (I could hear the audio), but the video was not visible. As soon as I moved the VideoPlayer to another GameObject that was active right away, everything worked as expected. It is rather counter-intuitive that it partly worked (playing audio but not video), which made me think there was a different problem with my setup.
The underlying issue was playing the video before the target RenderTexture and the video itself were prepared. Assign targetTexture to a RenderTexture instance (be careful here with its dimensions and depth), then call video.Prepare(). Bind a handler to the prepareCompleted event and only play the video from there.
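The steps above can be sketched roughly like this. It's a minimal illustration, not the original poster's code; the class and field names are made up, and the 1920x1080 dimensions are an assumption you should match to your clip:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Make sure the GameObject holding this is active before playback starts,
// assign a RenderTexture, call Prepare(), and only Play() from prepareCompleted.
public class PreparedVideoPlayback : MonoBehaviour
{
    public VideoPlayer video; // assumed to be assigned in the Inspector
    public RawImage screen;   // the RawImage that displays the video

    void Start()
    {
        // Dimensions and depth matter; a depth of 0 is fine for video.
        var rt = new RenderTexture(1920, 1080, 0);
        video.renderMode = VideoRenderMode.RenderTexture;
        video.targetTexture = rt;
        screen.texture = rt;

        video.prepareCompleted += OnPrepared;
        video.Prepare();
    }

    void OnPrepared(VideoPlayer source)
    {
        // Only start playback once the video and texture are ready.
        source.Play();
    }
}
```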
The video rendering issue in my WebGL build is resolved on my end. This works in Unity 2019.4 LTS. Note: I observed that mp4 video files render in a WebGL build, but wmv files do not. If anyone has results with other file formats, please update here.
I have a similar issue: the render texture remains black but the sound plays. I have some Image Targets with a video player prefab attached; when an Image Target is tracked, the video needs to play.
It seems this issue was fixed in Unity 2019.4 LTS. You should probably open your own thread with your target OS, Editor version, and so on. Does it happen in your build or only in the Editor? Are you sure OnTargetFound is called?
Another solution: on the same GameObject, move the RawImage component above the VideoPlayer in the Inspector. That way, when the GameObject is activated, the RawImage is initialized first and the VideoPlayer then starts the video properly.
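If you prefer not to rely on component ordering, the same idea can be enforced from a script: make sure the RawImage is enabled before playback begins. This is a hypothetical alternative sketch, not from the post above, and the field names are invented:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Explicitly enable the display surface before starting playback,
// so the RawImage exists when the first video frame arrives.
public class PlayAfterImageEnabled : MonoBehaviour
{
    public RawImage rawImage;       // assign in the Inspector
    public VideoPlayer videoPlayer; // assign in the Inspector

    void OnEnable()
    {
        rawImage.enabled = true; // bring up the RawImage first
        videoPlayer.Play();      // then start the video
    }
}
```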
I had an issue where the ‘RawImage’ worked fine in the editor, but when building for Android, it showed up black while the video played with audio.
As context, I’m making a VR app using the Oculus Integration SDK and needed to play a series of videos. I followed the steps exactly as shown in the tutorial video, and to solve my problem I created the RenderTexture through code and assigned it to the RawImage. Here’s the code:
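The original snippet does not appear in this post, so here is a minimal sketch of the approach described: creating the RenderTexture at runtime and assigning it to both the VideoPlayer and the RawImage. The class and field names are illustrative, and it assumes a VideoClip is already assigned so its dimensions can be read:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Create the RenderTexture in code instead of assigning an asset in the Editor.
public class RuntimeVideoScreen : MonoBehaviour
{
    public VideoPlayer videoPlayer; // with a VideoClip already assigned
    public RawImage rawImage;

    void Start()
    {
        var clip = videoPlayer.clip;

        // Size the RenderTexture from the clip itself (width/height are uint).
        var rt = new RenderTexture((int)clip.width, (int)clip.height, 0);

        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = rt;
        rawImage.texture = rt;

        // Prepare first; play once the video is ready.
        videoPlayer.prepareCompleted += vp => vp.Play();
        videoPlayer.Prepare();
    }
}
```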