How to send to RTMP endpoint

We’re trying to send the captured audio/video from Unity Render Streaming through to an RTMP endpoint (YouTube, Twitch, etc.). However, it’s been tough finding anything that doesn’t require external resources that can’t be compiled into our Unity build.

Our previous solution was to use three different plugins to get A/V out through NDI and stream with OBS, but the plugins all have different support levels and now have versions out of sync. We’ve also tried several Asset Store packages, but they either have too much processor overhead or leave the audio and video out of sync.

So, we’re hoping to find a solution via Unity Render Streaming. Anyone know of any plugins or solutions out there, or have a lead on how we might be able to, say, attach the Unity Render Streaming streams to the FFmpeg DLL to send to an RTMP endpoint…?

Thanks in advance for any time + thoughts!


You can use SRS to convert WebRTC to RTMP; please see Integration with Media Server · Issue #33 · Unity-Technologies/com.unity.webrtc · GitHub

I appreciate the share @winlin_srs and I saw that, but it looks like you still need to have the SRS web app running in Docker, as opposed to having the solution built into a Unity build? I was also having a hard time finding the RTMP conversion info/example to test, though I saw it briefly mentioned in the original post…

If you can give me any info + point me in the right direction to address those two issues, I’d love to pursue this further!

Please follow the guide GitHub - ossrs/srs-unity: WebRTC Samples with SRS SFU server for Unity step by step, and please join the SRS Community Discord to discuss with us.
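For the RTMP conversion part specifically, the relevant setting is the rtc_to_rtmp bridge in the SRS config. A minimal sketch, based on the SRS 4.0 sample configs (check the repo above for the current syntax):

```
listen              1935;       # RTMP
rtc_server {
    enabled     on;
    listen      8000;           # WebRTC over UDP
    candidate   $CANDIDATE;     # set to the server's public IP
}
vhost __defaultVhost__ {
    rtc {
        enabled     on;
        rtc_to_rtmp on;         # bridge incoming WebRTC streams to RTMP
    }
}
```

SRS can then push the resulting RTMP stream on to YouTube/Twitch (e.g. via the vhost forward directive).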

Unfortunately, the SRS solution won’t work for us, since it requires an external resource/server.

If anyone has any further insights into how we can solve this, please do let me know. We’re looking to hire a freelancer and would be grateful for any insights that help us minimize the time that freelancer spends working on a solution from scratch.

I can give some insight here, not sure if it’s going to be helpful to you though.

You mentioned platforms like YouTube and Twitch. Both platforms now have ingestion endpoints for WebRTC streams, though in neither case are they officially documented. Both use WebRTC as an ingestion protocol to allow users to stream directly from a browser (e.g. https://youtube.com/webcam).

I’ve done some experiments streaming to Twitch directly using WebRTC and posted the results here. My verdict was that it’s not yet robust enough. I got it to work on Android pretty reliably with a specific version of the WebRTC package. But on Windows I ran into the issue that Twitch requires a specific profile level of the H.264 codec, and Unity’s WebRTC package doesn’t give you fine control over which profile level is used (at least as far as I can tell). I’ve created an issue for this here on GitHub:

But I do think for streaming directly from an app to the ingestion platform without any server in between, WebRTC might be the best approach as opposed to RTMP.
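For reference, the closest thing to a knob that I’m aware of in the package is filtering the codec capabilities by profile-level-id before negotiation. A rough sketch (the API names are from com.unity.webrtc; whether the encoder actually honors the preference is exactly what the issue above is about):

```csharp
using System.Linq;
using Unity.WebRTC;

public static class CodecPreferences
{
    // Sketch: keep only H.264 capabilities matching a given profile-level-id
    // (e.g. "42e01f" for Constrained Baseline 3.1) and prefer those.
    public static void PreferH264Profile(RTCRtpTransceiver transceiver, string profileLevelId)
    {
        var codecs = RTCRtpSender.GetCapabilities(TrackKind.Video).codecs
            .Where(c => c.mimeType == "video/H264"
                     && c.sdpFmtpLine != null
                     && c.sdpFmtpLine.Contains($"profile-level-id={profileLevelId}"))
            .ToArray();

        if (codecs.Length > 0)
            transceiver.SetCodecPreferences(codecs);
    }
}
```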


This is really great info, thanks @julienkay! I’ll try this out as well, but also take your word for it that it might not be ready.

Ahh I see, so this solution still requires spinning up a web application to do the conversion @julienkay?

No web app involved. It goes straight from the app to the streaming platform. The YouTube link was just to show they have their WebRTC backend working in production.

So you just put their (new) WebRTC endpoints into the existing Unity WebRTC plugin using Unity Render Streaming and it worked? From the video I thought you made a custom solution or that it required a web app/server 🙂

I guess I’m not clear on where you get those endpoints, since the article you linked to talks about running an AWS web server.

For Twitch, I used https://ingest.twitch.tv/ingests, which returns the available endpoints in JSON format (this link if you want to see the endpoints for your region in the browser). Then do the typical WebRTC offer/answer handshake with that endpoint.
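Fetching that list from Unity could look roughly like this (a sketch; the field names ingests and url_template are what the endpoint returned when I last checked, not a documented contract):

```csharp
using System;
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class TwitchIngestList : MonoBehaviour
{
    [Serializable] class Ingest { public string name; public string url_template; }
    [Serializable] class IngestResponse { public Ingest[] ingests; }

    IEnumerator Start()
    {
        using (var req = UnityWebRequest.Get("https://ingest.twitch.tv/ingests"))
        {
            yield return req.SendWebRequest();
            if (req.result != UnityWebRequest.Result.Success)
                yield break;

            // Parse the JSON body and log each ingest endpoint.
            var data = JsonUtility.FromJson<IngestResponse>(req.downloadHandler.text);
            foreach (var ingest in data.ingests)
                Debug.Log($"{ingest.name}: {ingest.url_template}");
        }
    }
}
```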

As I said, the API endpoints are not really officially documented. And since signaling is not part of WebRTC, both platforms implement their own HTTP-based signaling process. I had to reverse engineer that part using the available browser apps: https://youtube.com/webcam for YouTube and https://stream.ivs.rocks/ for Twitch (which uses the same backend as the AWS IVS solution). YouTube’s signaling process seemed a bit more complex the last time I looked at it, so I didn’t pursue it further at the time, but for Twitch it was relatively straightforward.


Sorry, I missed that part of your question. I actually don’t use the Render Streaming package at all; I use the WebRTC package directly and implement the signaling handshake with the streaming platform as outlined above.

Interesting… but then how are you capturing the audio/video being sent out?

Sorry for all the questions, but I’m unclear as to how you can send to one of the RTMP endpoints listed for Twitch using WebRTC? Will Unity’s WebRTC actually connect/handshake with an RTMP endpoint?

That’s part of the WebRTC package more than Render Streaming. The WebRTC package lets you capture any camera, which is what the Render Streaming package also uses, since it’s built on top of the WebRTC package.
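In code, the capture side is roughly this (a sketch with com.unity.webrtc; details like needing the WebRTC.Update() coroutine or an explicit WebRTC.Initialize() call vary by package version):

```csharp
using Unity.WebRTC;
using UnityEngine;

public class CaptureExample : MonoBehaviour
{
    RTCPeerConnection pc;

    void Start()
    {
        // Depending on the package version, this coroutine is needed to pump
        // rendered frames into the video encoder every frame.
        StartCoroutine(WebRTC.Update());

        pc = new RTCPeerConnection();

        // Capture the main camera as a video track (resolution is arbitrary here).
        VideoStreamTrack videoTrack = Camera.main.CaptureStreamTrack(1280, 720);
        pc.AddTrack(videoTrack);

        // Capture audio from an AudioSource on the same GameObject.
        var audioTrack = new AudioStreamTrack(GetComponent<AudioSource>());
        pc.AddTrack(audioTrack);
    }
}
```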

No worries. Yeah, that’s also not documented anywhere, but every one of those RTMP endpoints has a matching endpoint for the WebRTC signaling. I.e.
rtmp://sfo.contribute.live-video.net/app/{stream_key}
has a matching endpoint
https://sfo.webrtc.live-video.net/4443/offer/
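So the handshake is roughly the following (a sketch: the Unity WebRTC calls are the package’s actual API, but the request format for the /offer endpoint is reverse engineered and undocumented, so the body, headers, and how the stream key is passed are assumptions that may have changed):

```csharp
using System.Collections;
using System.Text;
using Unity.WebRTC;
using UnityEngine.Networking;

public static class TwitchSignaling
{
    // Sketch: POST an SDP offer to the (undocumented) signaling endpoint and
    // apply the SDP answer it returns. Add your tracks to pc before calling.
    public static IEnumerator Connect(RTCPeerConnection pc, string offerUrl)
    {
        var offerOp = pc.CreateOffer();
        yield return offerOp;
        var offer = offerOp.Desc;
        yield return pc.SetLocalDescription(ref offer);

        using (var req = new UnityWebRequest(offerUrl, "POST"))
        {
            req.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(offer.sdp));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/sdp"); // assumed
            yield return req.SendWebRequest();

            var answer = new RTCSessionDescription
            {
                type = RTCSdpType.Answer,
                sdp = req.downloadHandler.text // assumes a raw SDP answer back
            };
            yield return pc.SetRemoteDescription(ref answer);
        }
    }
}
```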
