Hello everyone,
I’m currently working on a project where I need to stream the live video feed from a Unity camera in real time. I’ve been struggling to find the best approach and would greatly appreciate any guidance or suggestions.
My goal is to set up a real-time video stream from my Unity camera and send it to a specific URL. Currently I’m using FFmpeg to start the stream, but I’m struggling with how to send the packets of my camera feed. I have a RenderTexture as the output of my camera, and I’m wondering if it’s possible to leverage that RenderTexture and send its contents directly.
I have considered using networking APIs such as the .NET UdpClient class to send the video frames over UDP. However, I’m unsure about the best way to convert the RenderTexture data into a format that can be sent as UDP packets.
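To make the question concrete, here is the naive direction I had in mind, just as a sketch (untested; the address, port, and class name are placeholders I made up): read the RenderTexture back on the CPU each frame, compress it, and push it out through UdpClient in chunks small enough to fit in a datagram.

```csharp
using System;
using System.Net.Sockets;
using UnityEngine;

public class NaiveUdpFrameSender : MonoBehaviour
{
    public RenderTexture sourceTexture;      // the camera's target RenderTexture
    public string host = "127.0.0.1";        // placeholder address
    public int port = 9000;                  // placeholder port

    const int ChunkSize = 60000;             // stay under the ~64 KB UDP datagram limit

    UdpClient client;
    Texture2D readback;

    void Start()
    {
        client = new UdpClient();
        readback = new Texture2D(sourceTexture.width, sourceTexture.height,
                                 TextureFormat.RGB24, false);
    }

    void LateUpdate()
    {
        // Blocking CPU readback of the camera's RenderTexture.
        var previous = RenderTexture.active;
        RenderTexture.active = sourceTexture;
        readback.ReadPixels(new Rect(0, 0, sourceTexture.width, sourceTexture.height), 0, 0);
        readback.Apply();
        RenderTexture.active = previous;

        // Compress the frame so fewer datagrams are needed per frame.
        byte[] frame = ImageConversion.EncodeToJPG(readback, 75);

        // Split into datagrams; a real protocol would add frame/chunk headers
        // so the receiver can reassemble frames and detect loss.
        for (int offset = 0; offset < frame.Length; offset += ChunkSize)
        {
            int size = Math.Min(ChunkSize, frame.Length - offset);
            byte[] packet = new byte[size];
            Buffer.BlockCopy(frame, offset, packet, 0, size);
            client.Send(packet, size, host, port);
        }
    }

    void OnDestroy() => client?.Close();
}
```

I realize the blocking ReadPixels call and the lack of any real packetization or encoding are exactly the parts I’m unsure about.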
I have also come across WebRTC as a potential solution, but I’m not sure if it’s the right approach for my case. If anyone has experience with Unity WebRTC or other similar solutions for real-time video streaming, I would greatly appreciate your insights.
Any help or suggestions would be appreciated,
Cheers!
I ended up using a modified version of FFmpegOut by Keijiro.
Just add the Stream Camera Capture component to the camera, set the protocol and the address, and it works like a charm!
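For anyone curious, the general idea behind this kind of capture (a simplified sketch of the approach, not the actual FFmpegOut / Stream Camera Capture code; the path, arguments, and frame rate are assumptions) is to launch ffmpeg as a child process and pipe raw frames from the camera’s RenderTexture into its stdin, letting ffmpeg do the encoding and push the result to the output URL:

```csharp
using System.Diagnostics;
using UnityEngine;

public class FfmpegPipeSketch : MonoBehaviour
{
    public RenderTexture sourceTexture;               // camera output
    public string ffmpegPath = "ffmpeg";              // assumes ffmpeg is on PATH
    public string outputUrl = "udp://127.0.0.1:9000"; // placeholder destination

    Process ffmpeg;
    Texture2D readback;

    void Start()
    {
        readback = new Texture2D(sourceTexture.width, sourceTexture.height,
                                 TextureFormat.RGB24, false);

        // Raw RGB frames go in via stdin; ffmpeg encodes H.264 and streams out.
        // vflip compensates for Unity's bottom-up texture rows.
        string args =
            $"-f rawvideo -pix_fmt rgb24 -s {sourceTexture.width}x{sourceTexture.height} -r 30 -i - " +
            $"-vf vflip -c:v libx264 -preset ultrafast -tune zerolatency -f mpegts {outputUrl}";

        ffmpeg = Process.Start(new ProcessStartInfo
        {
            FileName = ffmpegPath,
            Arguments = args,
            UseShellExecute = false,
            RedirectStandardInput = true,
            CreateNoWindow = true
        });
    }

    void LateUpdate()
    {
        // Blocking readback for simplicity.
        var previous = RenderTexture.active;
        RenderTexture.active = sourceTexture;
        readback.ReadPixels(new Rect(0, 0, sourceTexture.width, sourceTexture.height), 0, 0);
        readback.Apply();
        RenderTexture.active = previous;

        byte[] frame = readback.GetRawTextureData();
        var stdin = ffmpeg.StandardInput.BaseStream;
        stdin.Write(frame, 0, frame.Length);
        stdin.Flush();
    }

    void OnDestroy()
    {
        if (ffmpeg != null && !ffmpeg.HasExited)
        {
            ffmpeg.StandardInput.Close();   // closing stdin lets ffmpeg flush and exit
            ffmpeg.WaitForExit(2000);
        }
    }
}
```

As far as I know, the real package uses asynchronous GPU readback rather than the blocking ReadPixels shown here, which is what keeps it usable in real time.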
Can you get the stream from a non-Unity source as well? I’m willing to pay for some consultation on this. PM me and we can discuss rates.
Yes, you can send the stream over UDP and then create a UDP receiver on the other end that receives the data and displays the stream.
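As a rough illustration of what that receiver can look like outside Unity (a minimal plain .NET console sketch; the port is a placeholder and must match whatever you set on the sender), you just bind a UdpClient to the same port and read datagrams as they arrive:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class UdpReceiverDemo
{
    static void Main()
    {
        // Listen on the same port the Unity sender is streaming to (placeholder).
        using (var client = new UdpClient(9000))
        {
            var remote = new IPEndPoint(IPAddress.Any, 0);
            Console.WriteLine("Waiting for packets...");
            while (true)
            {
                byte[] packet = client.Receive(ref remote);
                Console.WriteLine($"Received {packet.Length} bytes from {remote}");
                // A real viewer would hand these bytes to a decoder instead of
                // just counting them.
            }
        }
    }
}
```

Actually displaying video from those bytes depends entirely on the format the sender puts in them.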
Thank you for the response! So do I go to the fork and then make a package out of the project?
And will this work for a non-Unity receiver?
Yes, it will work with a non-Unity receiver; you can try it with OBS.
And how would I set up a Unity receiver? Thanks so far!
You need to create a UDP receiver; there’s an example here: GitHub - codemaker2015/unity-udp-demo: UDP interaction demo in Unity 3D
Set the same address and port as the stream output and you should start receiving data from it. From there I’m not sure exactly what the format is, but you will probably need to read the bytes and create a Texture from them: Unity - Scripting API: Texture2D.LoadRawTextureData
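A very rough Unity-side sketch of that idea (assuming, for simplicity, that the sender transmits plain uncompressed RGB24 frames split into datagrams; an encoded stream such as ffmpeg’s MPEG-TS output would need a decoder before LoadRawTextureData is of any use; the port, resolution, and class name are placeholders):

```csharp
using System.Net;
using System.Net.Sockets;
using UnityEngine;

public class UdpTextureReceiver : MonoBehaviour
{
    public int port = 9000;          // must match the sender (placeholder)
    public int width = 1280;         // must match the sender's resolution
    public int height = 720;
    public Renderer targetRenderer;  // something to show the texture on

    UdpClient client;
    IPEndPoint remote;
    Texture2D texture;
    byte[] frame;
    int received;

    void Start()
    {
        client = new UdpClient(port);
        remote = new IPEndPoint(IPAddress.Any, 0);
        texture = new Texture2D(width, height, TextureFormat.RGB24, false);
        frame = new byte[width * height * 3];   // one raw RGB24 frame
        targetRenderer.material.mainTexture = texture;
    }

    void Update()
    {
        // Drain whatever datagrams arrived since the last frame.
        while (client.Available > 0)
        {
            byte[] chunk = client.Receive(ref remote);
            int count = Mathf.Min(chunk.Length, frame.Length - received);
            System.Buffer.BlockCopy(chunk, 0, frame, received, count);
            received += count;

            if (received >= frame.Length)
            {
                // A full frame has been assembled: upload the raw bytes.
                texture.LoadRawTextureData(frame);
                texture.Apply();
                received = 0;
            }
        }
    }

    void OnDestroy() => client?.Close();
}
```

Packet loss or reordering will corrupt frames with a scheme this simple, so a real implementation would put a frame index and chunk offset in each datagram.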
How well will this work if I need to stream 40 cameras in real time? I need to live-stream all 40 at the same time.
Hi, thanks for sharing this tool. Could you give me a hint on how to use it with OBS? I don’t know anything about networks; here’s what I tried:
I entered a UDP address as shown in the example in the Stream Camera Capture component in Unity.

I then played the scene in Unity, opened OBS on the same PC, created a Browser video source, and entered the same UDP address in the URL box.
Clearly that doesn’t work because I’m misunderstanding things completely. I tried searching for tutorials online but couldn’t find anything useful. Any help would be greatly appreciated!
I’m trying to use Stream Camera Capture, but the installation links on the GitHub page take me to the original Keijiro Unity packages, and I can’t work out how to get the Stream Camera Capture component.
What am I missing?