Accessing received WebRTC AudioStreamTrack audio data

Hello,

I’m using WebRTC 2.4.0-exp.5 in my project, and I’m trying to take the audio received over WebRTC and use it as the input for OVRLipSync.

My code is similar to the sample in the documentation here (Audio streaming | WebRTC | 2.4.0-exp.11), and I can receive and listen to the audio, but OVRLipSync doesn’t seem to pick anything up when I set that same AudioSource as its input.

I wanted to try to use the PCM/float data from WebRTC’s AudioStreamTrack instead, but it seems this isn’t possible. Could anyone tell me if there is a way to do this currently?
I also tried using Unity’s OnAudioFilterRead, but it doesn’t seem to get called.
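For reference, this is roughly the kind of component I’m attaching next to the AudioSource to grab the samples (the class name and the buffer handling here are my own, not part of WebRTC or OVRLipSync):

    using UnityEngine;

    // Illustrative sketch only: capture the PCM block passing through this
    // GameObject's audio chain so it can be handed to lip sync on the main thread.
    [RequireComponent(typeof(AudioSource))]
    public class ReceivedAudioCapture : MonoBehaviour
    {
        private readonly object _lock = new object();
        private float[] _latestSamples = new float[0];

        // Unity calls this on the audio thread with interleaved samples.
        private void OnAudioFilterRead(float[] data, int channels)
        {
            lock (_lock)
            {
                if (_latestSamples.Length != data.Length)
                    _latestSamples = new float[data.Length];
                data.CopyTo(_latestSamples, 0);
            }
        }

        // Called from the main thread (e.g. in Update) to fetch the newest block.
        public float[] GetLatestSamples()
        {
            lock (_lock)
            {
                return (float[])_latestSamples.Clone();
            }
        }
    }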

Thank you,

Javier

You can use OnAudioFilterRead in your case.
Please check this line in the manual:

The filter is inserted in the same order as the MonoBehaviour script is shown in the Inspector.

Also try the AudioStreamTrack.Loopback flag. It is false by default; you should set it to true.
This flag affects the line below:
https://github.com/Unity-Technologies/com.unity.webrtc/blob/develop/Runtime/Scripts/AudioCustomFilter.cs#L50
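For example, something like this when you receive the track (a rough sketch; "receivedAudioSource" is whatever AudioSource you use on the receiving side):

    // Rough sketch: "audioStreamTrack" is the received AudioStreamTrack and
    // "receivedAudioSource" is the AudioSource on the same GameObject as the
    // script that implements OnAudioFilterRead.
    audioStreamTrack.Loopback = true;   // false by default
    receivedAudioSource.SetTrack(audioStreamTrack);
    receivedAudioSource.loop = true;
    receivedAudioSource.Play();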


Thank you for the incredibly fast response.

I did more testing with OnAudioFilterRead after reading your message, but when OnAudioFilterRead is called, the data array is always all zeros.

I am setting AudioSource.loop to true before playing it, just like in the sample (and also in the Editor):

        receivedAudioSource.loop = true;
        receivedAudioSource.Play();

I also changed the order of the AudioSource and the script that contains OnAudioFilterRead on the GameObject, but it didn’t change the result. If you have any idea what I might be doing wrong, it would be really helpful.
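For reference, this is roughly the check I’m running (just a debug sketch, not production code):

    // Debug sketch: log the peak amplitude of each block to see whether
    // OnAudioFilterRead is receiving silence or real samples.
    private void OnAudioFilterRead(float[] data, int channels)
    {
        float peak = 0f;
        for (int i = 0; i < data.Length; i++)
        {
            float abs = Mathf.Abs(data[i]);
            if (abs > peak) peak = abs;
        }
        // In my case this always logs 0.
        Debug.Log($"OnAudioFilterRead peak: {peak} (channels: {channels})");
    }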

Thank you

Edit: just in case, I wanted to add that even though I am receiving and listening to the audio, the AudioSource has no AudioClip set. I think that’s probably just how the WebRTC library works, though.

I think I managed to fix my problem. I found that once I added the WebRTC plugin’s AudioCustomFilter script as a component on my GameObject, my other script’s OnAudioFilterRead started receiving real data. I don’t know why this was needed; it doesn’t seem to be mentioned in the documentation, and I only found it by luck after digging through AudioStreamTrack’s code and the scripts included in the plugin.
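In code, the workaround amounts to something like the check below. I added the component via the Inspector; doing it from a script would only work if the AudioCustomFilter class is actually visible to your assembly, which I haven’t verified:

    // Workaround sketch: make sure the plugin's AudioCustomFilter sits on the
    // same GameObject as the AudioSource and the script with OnAudioFilterRead.
    // Assumes the AudioCustomFilter type is accessible from user code.
    if (GetComponent<AudioCustomFilter>() == null)
    {
        gameObject.AddComponent<AudioCustomFilter>();
    }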


Hmm, that sounds like a bug. WebRTC should create AudioCustomFilter and attach the component to the GameObject automatically.
How can I reproduce the issue?

Here is a screenshot of the GameObject. VRAvatarWebrtcSynchronizer is receiving data from a DataChannel and also from an AudioStreamTrack.

The AudioStreamTrack is set to the AudioSource like this inside MediaStream.OnAddTrack:

    _audioSource.SetTrack(audioStreamTrack);
    _audioSource.loop = true;
    _audioSource.Play();

I tried placing OnAudioFilterRead inside VRAvatarWebrtcSynchronizer and also inside LipSyncHandler, but the data array was always all zeros, even though the audio was being received from WebRTC.

After adding AudioCustomFilter as shown in the screenshot, OnAudioFilterRead started receiving real data and I am able to use it in OVRLipSync.
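For completeness, the receiving side looks roughly like this (everything except the WebRTC calls uses names from my own project):

    // Rough reproduction sketch: handler registered on MediaStream.OnAddTrack.
    // _audioSource and the surrounding class are from my own project.
    private void OnAddTrack(MediaStreamTrackEvent e)
    {
        if (e.Track is AudioStreamTrack audioStreamTrack)
        {
            _audioSource.SetTrack(audioStreamTrack);
            _audioSource.loop = true;
            _audioSource.Play();
        }
    }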

[Screenshot of the GameObject’s components, including the added AudioCustomFilter]

Thank you for sharing the details.
I assume that the order of components in the Inspector window is what causes the issue.