Using WebRTC package as a client for Unity Render Streaming

Hello,

Apologies if this is a stupid question, but is the WebRTC package appropriate to be used as a client for remotely viewing another Unity app, using Unity Render Streaming? We would like to use it for Edge Rendering, i.e. render a scene on a PC then view it on an Android/iOS device running our client app.

We can’t just use a browser because we want to add extra details on the client, and make the whole process invisible to the user.

If so, what is the sample in the WebRTC package closest to being able to do that (I don’t see any that obviously set up a remote URL)?

Thanks,
David

Regarding the WebRTC package, mobile device support has been in progress since last month. iOS platform support is already published in version 2.3, and Android support will be released in February or March.

Hi Kazuki, thanks very much for your reply.

The Android support coming is great news - we are looking for a cross-platform solution.

I was more asking if this is an appropriate use of WebRTC - is it already compatible with Unity Render Streaming out of the box? Which of the WebRTC sample scenes would be a good starting point for that?

Currently, we are preparing a new version of Unity Render Streaming that supports the iOS platform. It should be released in February.
The “VideoReceive” scene in the WebRTC samples is the simplest demonstration of video streaming, but it only works locally; you need to implement the signaling process yourself.

https://docs.unity3d.com/Packages/com.unity.webrtc@2.3/manual/index.html#samples
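To make a local-only sample talk across machines, the signaling layer just has to ferry three kinds of messages between the peers: an SDP offer, an SDP answer, and ICE candidates. The Python sketch below is purely illustrative, not Unity Render Streaming’s actual protocol: the envelope fields (`type`, `connectionId`, `data`) are assumptions, and the in-memory queue stands in for a real WebSocket or HTTP channel.

```python
import json

def make_message(msg_type, connection_id, payload):
    """Wrap a signaling payload in a routable envelope (illustrative shape)."""
    return json.dumps({"type": msg_type, "connectionId": connection_id, "data": payload})

class InMemorySignaling:
    """Stand-in for the transport: one message queue per peer."""
    def __init__(self):
        self.queues = {"sender": [], "receiver": []}

    def send(self, to_peer, message):
        self.queues[to_peer].append(message)

    def poll(self, peer):
        return [json.loads(m) for m in self.queues[peer]]

signaling = InMemorySignaling()

# 1. The streaming side creates an SDP offer and posts it to the receiver.
signaling.send("receiver", make_message("offer", "conn-1", {"sdp": "v=0 ..."}))
# 2. The viewing side replies with its own SDP answer.
signaling.send("sender", make_message("answer", "conn-1", {"sdp": "v=0 ..."}))
# 3. Both sides exchange ICE candidates until a network route is found.
signaling.send("receiver", make_message("candidate", "conn-1", {"candidate": "candidate:..."}))

offer_types = [m["type"] for m in signaling.poll("receiver")]
print(offer_types)  # ['offer', 'candidate']
```

Whatever transport you pick (the Render Streaming web app, your own WebSocket server, even HTTP polling), the payloads it relays are exactly these: offer, answer, candidates.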

Ok, that’s good to know, thank you.

I’m looking for a bit of guidance on how to do exactly that (implement the signaling). Half of it will already be handled by Unity Render Streaming’s web server, right? So I would need to know what to fetch and push in order to start receiving the video stream from there?

First, I suggest you try the demo project.
https://github.com/Unity-Technologies/UnityRenderStreaming/blob/develop/com.unity.template.renderstreaming-hd/Packages/com.unity.template.renderstreaming-hd/Documentation~/en/tutorial.md

If you want to use the signaling server in Unity Render Streaming, I recommend reading the documentation.
https://docs.unity3d.com/Packages/com.unity.renderstreaming@2.2/manual/webapp.html
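In the web app’s HTTP signaling mode, a client creates a session, posts its offer, and then polls for the answer. The sketch below shows the general shape of that loop with a stubbed server; the endpoint names mentioned in the comments (`PUT /signaling`, `POST /signaling/offer`, `GET /signaling/answer`) are assumptions drawn from the webapp docs and should be checked against your package version, and `StubServer` is a hypothetical stand-in, not the real web app.

```python
class StubServer:
    """Hypothetical stand-in for the web app's HTTP signaling endpoints."""
    def __init__(self):
        self.answers = []

    def put_signaling(self):
        # Real client: PUT /signaling -> returns a session id.
        return {"sessionId": "session-123"}

    def post_offer(self, session_id, sdp):
        # Real client: POST /signaling/offer. Here the stub immediately
        # queues a canned answer as if a peer had responded.
        self.answers.append({"sdp": "v=0 (answer)", "datetime": 1})

    def get_answer(self, session_id, from_time):
        # Real client: GET /signaling/answer, filtered by timestamp.
        return [a for a in self.answers if a["datetime"] > from_time]

server = StubServer()
session = server.put_signaling()["sessionId"]
server.post_offer(session, "v=0 (offer)")

# Poll until an answer arrives; a real client would sleep between polls.
answers = []
for _ in range(10):
    answers = server.get_answer(session, from_time=0)
    if answers:
        break
print(answers[0]["sdp"])  # v=0 (answer)
```

Once the answer (and candidates) come back through this channel, you hand them to the `RTCPeerConnection` on the Unity side and the media flows peer-to-peer.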

Hello, I see there aren’t any official builds with Android support, but there is an experimental branch. Can I use it somehow?

EDIT: I managed to use the experimental plugin, but on my android device I get an exception “Unable to load dll ‘webRTC’: the specified module could not be found”. Is there anything I can do about it?

@aksik

  • In Project Settings, set Player → Android → Other Settings → Scripting Backend to IL2CPP.
  • In Project Settings, set Player → Android → Other Settings → Target Architectures to ARM64.
  • Make sure you have the libwebrtc.aar file in Plugins/Android.

Unity Render Streaming currently only supports ARM64.

That worked! Thank you so much!

Will there also be WebGL support, sooner or later?

Currently, we don’t have a plan to support WebGL.
But it would be possible to change the priority if we get requests from more users.

I would like to lodge a request for WebGL support… 🙂

Please test it if you are interested.
https://github.com/Unity-Technologies/com.unity.webrtc/pull/478

Hi,
I want to follow up on dtaddis’s request here.
I have more or less the same requirement and I hope someone can guide me.
My goal is to create an app where client A runs AR and streams it to client B. At the same time, client B should be able to stream its camera to client A, with bidirectional audio between both clients. Client A should receive events from client B. Client B would be a browser or WebGL, client A iOS/Android.
The Unity WebRTC package basically seems to match the requirements, but I can’t tell whether it is compatible with AR, and the signaling process is missing. Looking through Unity Render Streaming, I’m confused and not 100% sure whether bidirectional video/audio between browser and iOS/Android works.
Does anyone know more about these two packages and can guide me to the right solution?

Hi,
Could you please let me know whether AR screen sharing is possible using WebRTC in Unity for Android, and whether it’s available.
I am stuck on the signaling server part of WebRTC; could someone provide an implementation example that I could use as a starting guide?

Is it possible to create an Android game, play it on one Android phone, and stream it to another phone?

Thanks! I’ll test it. WebGL support would be great!

Can I stream a camera from one Unity project and see the rendered image in another Unity project? I can’t figure out how to make render streaming work and keep seeing “Signaling: WS connection closed, code: 1006” appear repeatedly.

I’ve searched for hours and can’t find any explanation or documentation that shows how to get any of the examples working.

You need to launch the web app with the “-w” option to use the WebSocket protocol.
https://docs.unity3d.com/Packages/com.unity.renderstreaming@3.1/manual/webapp.html

I want to do almost the same thing as @simu3105 but haven’t found any tutorial/sample on the web. I want to create a cross-platform video chat app between iOS/Android and the browser using WebRTC. I have already implemented the browser side and the Unity side of video streaming with WebRTC, but I am confused about how to connect the two using a signaling server / Unity Render Streaming.

@simu3105 did you solve your problem? If yes, then can you provide any solution on how you did this? Thanks.