Hello everyone,
I am working with an edited version of the WebApp sample (specifically the bidirectional sample) provided in the Unity Render Streaming package. The goal is to stream a video file from the WebApp to Unity over WebRTC. On the Unity side, I am using the ReceiverSample.cs script to receive the video stream and apply it as a texture. The stream is received successfully (confirmed by logs), but I cannot display it anywhere outside of the canvas, and the view eventually turns completely black.
I tried following this video: https://www.youtube.com/watch?v=UiDlTmi8bcQ and also asked ChatGPT for help, to no avail.
I would kindly like to request a step-by-step explanation of how to properly render the received video stream on a sphere or skybox instead of on a square RawImage in the canvas.
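For concreteness, this is the kind of adaptation I am attempting. It is only a sketch: `VideoStreamReceiver` and its `OnUpdateReceiveTexture` event come from the Unity Render Streaming package (the sample wires that event to a RawImage), while the sphere/renderer wiring is my own untested guess:

```csharp
using UnityEngine;
using Unity.RenderStreaming;

// Sketch: assign the received WebRTC texture to a sphere's material
// instead of the sample's RawImage. "SphereVideoReceiver" and the
// serialized field names are hypothetical; only VideoStreamReceiver
// and OnUpdateReceiveTexture are from the package itself.
public class SphereVideoReceiver : MonoBehaviour
{
    [SerializeField] private VideoStreamReceiver receiveVideoViewer;
    [SerializeField] private Renderer sphereRenderer; // MeshRenderer on the sphere

    private void Awake()
    {
        // The package raises this event whenever the receive texture
        // is (re)created, passing the new Texture.
        receiveVideoViewer.OnUpdateReceiveTexture += OnUpdateReceiveTexture;
    }

    private void OnUpdateReceiveTexture(Texture texture)
    {
        // Swap the sphere's main texture for the received one.
        sphereRenderer.material.mainTexture = texture;
    }
}
```

My assumption is that the sphere would also need an unlit material (e.g. Unlit/Texture) and inverted normals if the camera sits inside it, but I am not certain whether that is the right approach, which is why I am asking for a proper step-by-step.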