The Video Renderer component from Microsoft's MixedReality-WebRTC library doesn't have a Source field where I can specify which source the renderer should take the video from. I want it to take the video from another component that represents the webcam, since I'm trying to make a basic WebRTC connection. I'm following the tutorial in Adding local media | MixedReality-WebRTC Documentation; in that tutorial the Video Renderer component has a Source field, but mine doesn't have one.
I also ran into this issue. This changed in the new MixedReality-WebRTC release, but the tutorial was not updated. Previously, the source was specified on the video renderer, as the tutorial shows; now the renderer no longer has that Source attribute. Instead, you specify what happens when the source stream starts or stops on the source itself.
So go to the component that holds your WebcamSource. You should see a Video Stream Started event list. Add one action to that list, select the GameObject that holds your VideoRenderer, and choose VideoRenderer.StartRendering. Do the same for StopRendering in the Video Stream Stopped list. This solved the issue for me.
Thanks! I was able to add an action to the Video Stream Started and Video Stream Stopped lists in the WebcamSource. What I added were functions inside a script, but they are empty because I don't know what to put in them yet. My understanding is that these functions will be invoked after the WebcamSource starts or stops, respectively; correct me if I'm wrong.
Then I tried creating another script containing VideoRenderer.StartRendering, but I got errors. I also tried something I saw in Microsoft's documentation, StartRendering(IVideoSource source), but I got errors again. I think I'm confused because I don't know what triggers what. For example, does VideoRenderer.StartRendering trigger the Video Stream Started event in the WebcamSource? Also, I don't know how to set up VideoRenderer.StartRendering; could you show part of your code, please?
Ok, I made some changes and I think I'm getting closer. Below is my code. But I'm still getting an error in the part where I pass the source to the VideoRenderer.StartRendering() function: error CS1503: Argument 1: cannot convert from 'Microsoft.MixedReality.WebRTC.Unity.WebcamSource' to 'Microsoft.MixedReality.WebRTC.IVideoSource'. What is the IVideoSource type? What would be an example of something of that type?
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Microsoft.MixedReality.WebRTC.Unity;
using System.Diagnostics;

public class WebCam_Script : MonoBehaviour
{
    public GameObject LocalVid;

    // Start is called before the first frame update
    void Start()
    {
    }

    // Update is called once per frame
    public void Update()
    {
        if (Input.GetKeyDown("space"))
        {
            print("space key was pressed");
            StopWebCameo();
        }
    }

    public void StartWebCameo()
    {
        print("WebCam Started??");
        StartRendeo();
    }

    public void StartRendeo()
    {
        WebcamSource wcam = LocalVid.GetComponent<WebcamSource>();
        VideoRenderer.StartRendering(wcam); // <-- error CS1503 here
    }

    public void StopWebCameo()
    {
        //WebcamSource wcam = GetComponent<WebcamSource>();
        //VideoRenderer.StopRendering(wcam);
    }
}
Create a new script, add methods to start and stop rendering, and pass the Source property of your Microsoft.MixedReality.WebRTC.Unity.WebcamSource (which implements IVideoSource) as the source.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class MyScript : MonoBehaviour
{
    public Microsoft.MixedReality.WebRTC.Unity.VideoRenderer videoRenderer;
    public Microsoft.MixedReality.WebRTC.Unity.WebcamSource webcamsource;

    public void startVideoStream()
    {
        videoRenderer.StartRendering(webcamsource.Source);
    }

    public void stopVideoStream()
    {
        videoRenderer.StopRendering(webcamsource.Source);
    }
}
Attach this new script to some GameObject and assign your VideoRenderer and WebcamSource as references.
Then go to your WebcamSource component, attach your script, and assign the respective methods to the Video Stream Started and Video Stream Stopped lists.
This should give no errors and should make the tutorial work.
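If you prefer to skip the inspector wiring, the same hookup can be done from code. This is only a sketch, assuming a MixedReality-WebRTC 2.x-style Unity API where WebcamSource exposes the Video Stream Started / Video Stream Stopped UnityEvents mentioned above and passes the IVideoSource to each listener (the component name AutoWireRenderer is made up for this example):

```csharp
using UnityEngine;
using Microsoft.MixedReality.WebRTC.Unity;

// Hypothetical helper: wires the webcam source's stream events to the
// renderer at startup, so no inspector event configuration is needed.
public class AutoWireRenderer : MonoBehaviour
{
    public WebcamSource webcamSource;
    public VideoRenderer videoRenderer;

    void Awake()
    {
        // VideoStreamStarted/VideoStreamStopped pass the IVideoSource,
        // which matches StartRendering/StopRendering's parameter type.
        webcamSource.VideoStreamStarted.AddListener(videoRenderer.StartRendering);
        webcamSource.VideoStreamStopped.AddListener(videoRenderer.StopRendering);
    }

    void OnDestroy()
    {
        webcamSource.VideoStreamStarted.RemoveListener(videoRenderer.StartRendering);
        webcamSource.VideoStreamStopped.RemoveListener(videoRenderer.StopRendering);
    }
}
```

Attach it to any GameObject and drag the WebcamSource and VideoRenderer into the two fields; the behavior should then match the inspector wiring described above.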
Thanks @unity_H9rj9tkQr45s3Q. That worked perfectly for the local peer. Then I worked on the remote peer and tried to establish the WebRTC connection, but the remote video was not displayed (the WebRTC connection seems to be fine; the offer and answer messages look good), so I wonder if my remote peer setup is wrong. I did pretty much the same as for the local peer. First I created this receiver script:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Microsoft.MixedReality.WebRTC.Unity;
using System.Diagnostics;

public class ReceiverScript : MonoBehaviour
{
    public Microsoft.MixedReality.WebRTC.Unity.VideoRenderer videoRenderer;
    public Microsoft.MixedReality.WebRTC.Unity.VideoReceiver videoReceiver;

    public void startVideoStream()
    {
        videoRenderer.StartRendering(videoReceiver.VideoTrack);
    }

    public void stopVideoStream()
    {
        videoRenderer.StopRendering(videoReceiver.VideoTrack);
    }
}
Then I attached it to the same GameObject that contains the previous script (the local video script), and assigned as references a new VideoRenderer and a VideoReceiver that were created on a RemoteVideoPlayer object.
But it doesn't work. Maybe the problem is that my Peer Connection component is missing some information? I don't know what else to add to that component.
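In case it helps anyone hitting the same wall: on the remote side the same event wiring applies, and the renderer should only start once the remote track actually arrives. A sketch, assuming the VideoReceiver component exposes the same Video Stream Started / Video Stream Stopped UnityEvents as the local sources (check your component in the inspector before relying on this):

```csharp
using UnityEngine;
using Microsoft.MixedReality.WebRTC.Unity;

// Hypothetical wiring for the remote peer: start/stop rendering when the
// remote video stream starts or stops, instead of calling StartRendering
// manually before a remote track exists.
public class RemoteAutoWire : MonoBehaviour
{
    public VideoReceiver videoReceiver;
    public VideoRenderer videoRenderer;

    void Awake()
    {
        videoReceiver.VideoStreamStarted.AddListener(videoRenderer.StartRendering);
        videoReceiver.VideoStreamStopped.AddListener(videoRenderer.StopRendering);
    }
}
```

It is also worth checking that the Peer Connection component has a media line whose Receiver is set to this VideoReceiver; if the receiver is never associated with a transceiver, the remote track never pairs with it and nothing is rendered even though the offer/answer exchange looks fine.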