Problem: multiple receivers watching one broadcaster's shared screen

Hello,

I currently have some questions about the sample scenes, the Broadcast and Receiver scenes in particular.

This is my environment’s information:
Unity Editor Version: 2019.4.21f1 (LTS)
RenderStreaming version: 3.0.1-preview
Environment: Windows 10 64bit, NVIDIA GeForce GTX 1650 Ti

So, let’s jump into the basic scenario first. (I’m using Photon for networking)

I have a scene that contains 5 people; 1 of them is the Host (Master Client), who has the right to share his/her screen with the others. I found that the Receiver and Broadcast samples are the two that can be applied to my project. However, there is a small problem: whenever the 3rd person joins the room and I try to create a connection through SingleConnection on that client, there is always an error saying InvalidOperationException: Sequence contains no matching element.

In terms of the webapp, I simply served it as shown on the GitHub page, without any changes, and ran it in WebSocket mode. Basically, there is no problem on the web server side.

So here's my setup in Unity. I did not change anything in the package and left it as is:

[Screenshot attachment: upload_2021-9-11_20-34-0.png]

As you can see above, there are 3 primary GameObjects:
Streaming Camera: Holds the Render Texture that is sent through the network
Camera Streamer

[Screenshot attachment: upload_2021-9-11_20-37-31.png]

Streaming Renderer (Sender): Similar to the RenderStreaming GameObject in the Broadcast scene, but without the Audio and Input from Browser
Render Streaming
Broadcast

Streaming Renderer (Receiver): Similar to the RenderStreaming GameObject in the Receiver scene, but without the Audio and Input from Browser
Render Streaming
Single Connection
Receive Video Viewer
Streaming Receiver(*)

* Streaming Receiver: Based on the idea of ReceiverSample script

using System.Collections;
using Unity.RenderStreaming;
using UnityEngine;
using UnityEngine.UI;

public class StreamingReceiver : MonoBehaviour
{
    #region Constants

    private const string _ID_CONNECTION = "<mywebserver>";

    #endregion

#pragma warning disable 0649

    #region Private Serialized Fields

    [Header("Render Streaming")]
    [SerializeField]
    private SingleConnection connection;
    [SerializeField]
    private ReceiveVideoViewer videoViewer;

    [Header("UI")]
    [SerializeField]
    private RawImage remoteVideo;

    #endregion

#pragma warning restore 0649

    #region Private Fields

    private string connectionId;

    #endregion

    #region MonoBehaviour Callbacks

    private void Awake()
    {
        // Show each received frame on the RawImage
        videoViewer.OnUpdateReceiveTexture +=
            texture => remoteVideo.texture = texture;
    }

    private void Start()
    {
        if (string.IsNullOrEmpty(connectionId))
        {
            connectionId = System.Guid.NewGuid().ToString("N");
        }

        connectionId = _ID_CONNECTION;
        StartCoroutine(Connect());
    }

    #endregion

    IEnumerator Connect()
    {
        // Give the WebSocket time to connect before creating the connection
        yield return new WaitForSeconds(4f);

        connection.CreateConnection(connectionId, true);
    }
}

As you can see, I'm using a Coroutine because there is no callback from RenderStreaming that lets me know when the WebSocket has actually connected to the web server.
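A slightly more defensive version of this workaround would be to keep retrying CreateConnection until the first frame actually arrives, instead of relying on a single fixed delay. This is only a sketch of the idea, replacing Awake and Connect in the script above; I am assuming (but have not verified) that re-issuing CreateConnection while signaling is still connecting is harmless:

    // Sketch: retry until the viewer reports a texture.
    // Assumption: repeating CreateConnection before the WebSocket is
    // ready only logs the "WS is not connected" error and is otherwise safe.
    private bool _receiving;

    private void Awake()
    {
        videoViewer.OnUpdateReceiveTexture += texture =>
        {
            _receiving = texture != null;  // a first frame means the connection worked
            remoteVideo.texture = texture;
        };
    }

    IEnumerator ConnectWithRetry()
    {
        yield return new WaitForSeconds(4f);      // give the WebSocket time to open

        while (!_receiving)
        {
            connection.CreateConnection(connectionId, true);
            yield return new WaitForSeconds(2f);  // retry until video arrives
        }
    }

It avoids the guessing game of picking one "safe" delay, at the cost of some noisy error logs while the socket is still opening.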

Q1. Do I have to tweak something inside the RenderStreaming script to get notified when it completes the connection?

Q2. Can the Broadcast connect to multiple receivers at a time? If yes, could you briefly give me instructions on getting it right, or at least point me to documentation that could solve the problem, please?

Q3. Am I on the right track to achieve the expected result?

Finally, I really appreciate any help, and I'm sorry if I made any mistakes in this post.

Well, I figured it out anyway.

My mistake was not noticing that the connectionIds actually need to be different from each other in the code… I hardcoded the id with my web server string, in the _ID_CONNECTION constant at the top of the script, as you can see…
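For reference, the fix is simply to delete that overwrite so every client keeps its own generated id. A sketch of the corrected Start method:

    private void Start()
    {
        if (string.IsNullOrEmpty(connectionId))
        {
            // Each client must use its own unique connection id
            connectionId = System.Guid.NewGuid().ToString("N");
        }

        // Removed: connectionId = _ID_CONNECTION;
        // That line overwrote the unique id with the same string on every client.
        StartCoroutine(Connect());
    }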

Now my app runs smoothly; it works like a charm. Even though I didn't get any help here, if anyone needs help with the same use case, it would be my honor to help you guys.

So, Q2 and Q3 are solved. I am still confused about Q1; if there is any method other than tweaking the base code, that would be nice.

Bump
Q1. Do I have to tweak something inside the RenderStreaming script to get notified when RenderStreaming completes its connection? How do I know the correct time to call CreateConnection from the SingleConnection on the Receiver? When I create a connection without waiting for the WebSocket to complete its connection, an error pops up that says:

Signaling: WS is not connected. Unable to send message

I want to ask if there is any way to handle this.
Thank you.

Some users have posted the same issue on GitHub.
We have a plan to make a sample explaining how to handle multiple users in Unity Render Streaming.
https://github.com/Unity-Technologies/UnityRenderStreaming/issues/435

Thanks for your reply. I could handle that problem by using the PUN 2 package combined with Cinemachine Virtual Camera. I can actually stack up to 20 CCUs at a time without any problem, or maybe my use case just doesn't trigger the same issue. However, I still want to know whether there is any way to detect when the connection between RenderStreaming and the web server completes. It runs on another thread, maybe, so it's a bit challenging for me to get the exact moment when RenderStreaming finishes its connection.

Anyway, thank you!

Can I ask why you need Photon if everybody is going to see the same screen? Do you need it in order to enable chat through Photon Voice?

I will answer your question in the conversation section. I'm sorry for being late. I hope my information will be useful for you.
