Hello,
I currently have some questions about the sample scenes, in particular Broadcast and Receiver.
Here is my environment information:
Unity Editor Version: 2019.4.21f1 (LTS)
RenderStreaming version: 3.0.1-preview
Environment: Windows 10 64bit, NVIDIA GeForce GTX 1650 Ti
So, let’s jump into the basic scenario first. (I’m using Photon for networking)
I have a scene that contains 5 people; 1 of them is the Host (Master Client), who has the right to share his/her screen with the others. I found that the Receiver and Broadcast samples are the two that can be applied to my project. However, there is a problem when the 3rd person joins the room: the error InvalidOperationException: Sequence contains no matching element is thrown whenever I try to create a connection through SingleConnection on the 3rd person's client.
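For context, this is roughly how I decide which role each client plays once it joins the Photon room. ScreenShareRole is just an illustrative name from my project, and the two GameObject references point at the sender and receiver objects described below:

    using Photon.Pun;
    using UnityEngine;

    public class ScreenShareRole : MonoBehaviourPunCallbacks
    {
        [SerializeField] private GameObject sender;   // "Streaming Renderer (Sender)" described below
        [SerializeField] private GameObject receiver; // "Streaming Renderer (Receiver)" described below

        public override void OnJoinedRoom()
        {
            // Only the Master Client broadcasts its screen; everyone else receives.
            bool isHost = PhotonNetwork.IsMasterClient;
            sender.SetActive(isHost);
            receiver.SetActive(!isHost);
        }
    }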
As for the web app, I simply serve it as shown on the GitHub page, without any changes, and run it in websocket mode. Basically, there is no problem on the web server side.
So here's my setup in Unity; I did not change anything in the package and left it as is:

As you can see above, there are 3 primary GameObjects:
- Streaming Camera: holds the Render Texture that is sent over the network
  - Camera Streamer
- Streaming Renderer (Sender): similar to the RenderStreaming GameObject in the Broadcast scene, but without the Audio and the Input from Browser
  - Render Streaming
  - Broadcast
- Streaming Renderer (Receiver): similar to the RenderStreaming GameObject in the Receiver scene, but without the Audio and the Input from Browser
  - Render Streaming
  - Single Connection
  - Receive Video Viewer
  - Streaming Receiver (*)

(*) Streaming Receiver: based on the idea of the ReceiverSample script:
using System.Collections;
using Unity.RenderStreaming;
using UnityEngine;
using UnityEngine.UI;

public class StreamingReceiver : MonoBehaviour
{
    #region Constants
    // Fixed connection id shared by every client.
    private const string _ID_CONNECTION = "<mywebserver>";
    #endregion

#pragma warning disable 0649
    #region Private Serialized Fields
    [Header("Render Streaming")]
    [SerializeField]
    private SingleConnection connection;
    [SerializeField]
    private ReceiveVideoViewer videoViewer;

    [Header("UI")]
    [SerializeField]
    private RawImage remoteVideo;
    #endregion
#pragma warning restore 0649

    #region Private Fields
    private string connectionId;
    #endregion

    #region MonoBehaviour Callbacks
    private void Awake()
    {
        // Display the received video texture on the RawImage whenever it updates.
        videoViewer.OnUpdateReceiveTexture +=
            texture => remoteVideo.texture = texture;
    }

    private void Start()
    {
        if (string.IsNullOrEmpty(connectionId))
        {
            connectionId = System.Guid.NewGuid().ToString("N");
        }
        // For now the generated id is overwritten with the fixed constant,
        // so every client ends up using the same connection id.
        connectionId = _ID_CONNECTION;
        StartCoroutine(Connect());
    }
    #endregion

    private IEnumerator Connect()
    {
        // Wait an arbitrary 4 seconds and hope the WebSocket signaling is connected by then.
        yield return new WaitForSeconds(4f);
        connection.CreateConnection(connectionId, true);
    }
}
As you can see, I'm using a coroutine because there is no callback from RenderStreaming that lets me know when the WebSocket has actually connected to the web server.
Q1. Do I have to tweak something inside the RenderStreaming script to be notified at the point when it has completed the connection?
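Ideally, I would replace the fixed delay in Connect() with an explicit notification, roughly like the sketch below. Note that signalingConnected is purely hypothetical; I could not find any such flag or event in the package, which is exactly what Q1 is about.

    // Hypothetical sketch: wait for a real "signaling connected" notification instead of 4 seconds.
    // signalingConnected does not exist in RenderStreaming 3.0.1-preview as far as I can tell;
    // it stands in for whatever callback would tell me the WebSocket is ready.
    private bool signalingConnected;

    private IEnumerator Connect()
    {
        yield return new WaitUntil(() => signalingConnected);
        connection.CreateConnection(connectionId, true);
    }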
Q2. Can the Broadcast connect to multiple receivers at the same time? If yes, could you briefly give me instructions on how to get it right, or at least point me to the documentation that covers this, please?
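My current guess for the multi-receiver case is that every receiver should create its own unique connection id instead of all of them sharing the fixed constant, i.e. something like the change below inside StreamingReceiver. This is only an assumption on my side; I have not confirmed it against the documentation.

    private void Start()
    {
        // Assumption: each receiver uses its own connection id so that the Broadcast
        // component can keep a separate peer connection per receiver.
        connectionId = System.Guid.NewGuid().ToString("N");
        StartCoroutine(Connect());
    }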
Q3. Am I going about this the right way to achieve the expected result?
Finally, I really appreciate any help, and I'm sorry if I made any mistakes in this post.

