How to receive Audio? How to play received Audio?

Hey,
I'm currently working with the WebRTC 2.4.0-exp.1 package and managed to send camera video from one Unity client to another Unity client.
I also looked at the DataChannel example and managed to send an audio stream to the other Unity client, but I don't know how to play the received audio stream!

Currently I have something like this:

receiveStream = new MediaStream();
receiveStream.OnAddTrack = e =>
{
    Debug.Log("On Add Track");
    if (e.Track is VideoStreamTrack track)
    {
        receiveImage.texture = track.InitializeReceiver(1280, 720);
    }
    if (e.Track is AudioStreamTrack atrack)
    {
        Debug.Log("RECEIVED AUDIOSTREAM");
    }
};

But the AudioStreamTrack type has nothing like InitializeReceiver, the way VideoStreamTrack does.

How can I assign the stream to an AudioSource?

Thanks for any help!

@mrSaig can you confirm whether you get SDP offers/answers each time you add a new track? I'm getting the session connected between two Unity PCs and the add track works; I also see the UDP packets, but the raw texture does not update. To answer your question on audio, do the following:

private void OnAudioFilterRead(float[] data, int channels)
{
    // Feed the captured output samples into the WebRTC audio track
    Audio.Update(data, data.Length);
}

OR

private void Start()
{
    AudioRenderer.Start();
}

private void Update()
{
    var sampleCountFrame = AudioRenderer.GetSampleCountForCaptureFrame();
    var channelCount = 2; // AudioSettings.speakerMode == Stereo
    var length = sampleCountFrame * channelCount;
    var buffer = new NativeArray<float>(length, Allocator.Temp);
    AudioRenderer.Render(buffer);
    Audio.Update(buffer.ToArray(), buffer.Length);
    buffer.Dispose();
}

@WayneVenter Thanks, I will try your audio solution!
If you get the track but your texture simply does not update, you probably forgot the WebRTC update call on one or both sides:

 StartCoroutine(WebRTC.Update());
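For context, a minimal sketch of where that coroutine usually lives in the 2.4-era package (assuming the setup pattern from the com.unity.webrtc samples; the class name is illustrative):

```csharp
using Unity.WebRTC;
using UnityEngine;

public class WebRtcBootstrap : MonoBehaviour
{
    private void Awake()
    {
        // Initialize the native plugin before creating any RTCPeerConnection
        WebRTC.Initialize();

        // Drives the plugin's per-frame work (encoding, texture updates, etc.)
        StartCoroutine(WebRTC.Update());
    }

    private void OnDestroy()
    {
        WebRTC.Dispose();
    }
}
```

Without the WebRTC.Update() coroutine running, received video textures will never refresh even though the connection and packets look fine.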


Hmm, your audio solutions are for recording only, right? I already have Audio.Update in my code, but my problem is the other side: audio output.

@mrSaig
Audio rendering has not been implemented yet.
Audio renderer support is planned for 2.4.0-exp.3.
https://github.com/Unity-Technologies/com.unity.webrtc#roadmap


Hi @mrSaig, thank you for this, it solved my problem. I now have video streaming working between two PCs.


@mrSaig Yes, that was for recording. Currently, to work around this problem, I use another package from FROZEN MIST https://assetstore.unity.com/packages/templates/packs/fmetp-stream-143080. It works really well, but the video bandwidth usage is insane, so I need WebRTC's VP8 encoding to reduce my bandwidth needs. I see from the release notes we will only get audio out in July/August, so my suggestion would be to try the audio encoder from Frozen Mist and stream the data over the DataChannel in WebRTC, or just directly via the WebSocket, and keep the video on WebRTC.
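A rough sketch of that workaround, pushing audio bytes over an RTCDataChannel. The encoding step is a placeholder for whatever codec you use (e.g. the FMETP encoder), and `dataChannel` is assumed to be an already-open channel on your RTCPeerConnection:

```csharp
using Unity.WebRTC;
using UnityEngine;

public class DataChannelAudioSender : MonoBehaviour
{
    public RTCDataChannel dataChannel; // assumed created and opened elsewhere

    // Unity calls this on the audio thread with the mixed output samples
    private void OnAudioFilterRead(float[] data, int channels)
    {
        if (dataChannel == null || dataChannel.ReadyState != RTCDataChannelState.Open)
            return;

        // Placeholder: compress 'data' with your audio encoder of choice here.
        // As a naive fallback, ship the raw PCM floats as bytes (high bandwidth!).
        var bytes = new byte[data.Length * sizeof(float)];
        System.Buffer.BlockCopy(data, 0, bytes, 0, bytes.Length);

        dataChannel.Send(bytes);
    }
}
```

In practice you would likely queue the samples and call Send from the main thread rather than the audio thread, and decode/play the bytes through an AudioClip on the receiving side.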

1 Like

Ah okay, thank you for the clarification! I will wait for it then :wink:

(screenshot attachment: upload_2021-6-22_15-50-19.png)
If so, can I receive audio using this method?

I would like to see an example of sending and receiving audio via AudioStream, specifically the sending part and the receiving part.

Audio renderer support has been moved to 2.4.0-exp.4.
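For anyone landing here later: once audio rendering ships, the API is expected to look roughly like the package's audio sample, with an AudioStreamTrack wrapping an AudioSource on the sender and an AudioSource playing the received track. This is a sketch based on the exp.4-era sample code, so verify it against the version you actually install; names like `sendAudioSource`/`receiveAudioSource` and `SetUpAudio` are illustrative:

```csharp
using Unity.WebRTC;
using UnityEngine;

public class AudioStreamExample : MonoBehaviour
{
    public AudioSource sendAudioSource;    // plays a clip or microphone input
    public AudioSource receiveAudioSource; // outputs the remote audio

    private RTCPeerConnection pc;          // assumed created and negotiated elsewhere
    private MediaStream receiveStream;

    private void SetUpAudio()
    {
        // Sending side: wrap a playing AudioSource in an AudioStreamTrack
        var audioTrack = new AudioStreamTrack(sendAudioSource);
        pc.AddTrack(audioTrack);

        // Receiving side: hand the incoming track to an AudioSource
        receiveStream = new MediaStream();
        receiveStream.OnAddTrack = e =>
        {
            if (e.Track is AudioStreamTrack track)
            {
                receiveAudioSource.SetTrack(track);
                receiveAudioSource.loop = true;
                receiveAudioSource.Play();
            }
        };
    }
}
```

The key difference from VideoStreamTrack's InitializeReceiver is that the received audio track is attached to an existing AudioSource, which then plays it like any other clip.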

Thank you very much for your reply. gtk2k.
