I have a game running on Meta Quest 2/Pro and a companion app running on an Android smartphone for video chat. Signaling works fine and streaming is established between the devices. The stream from the Android phone can be seen in VR, and so can the video captured on the Quest. But on the Android side the received stream appears as a black image.
I use RTCPeerConnection.GetStats to check the streams' sent/received bytes and packets. The amount of data is very low for the stream that appears as a black image. When I run the game on Windows in the Unity Editor, the stream properly shows the captured video seen inside VR, and the data amount is two times higher for bytes and three times higher for packets.
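For reference, polling the stats in the Unity WebRTC package looks roughly like the sketch below. This is not my exact code; it assumes the package's coroutine-based GetStats API and its RTCOutboundRTPStreamStats/RTCInboundRTPStreamStats stat types, and the class and field names are illustrative:

```csharp
using System.Collections;
using Unity.WebRTC;
using UnityEngine;

public class StatsProbe : MonoBehaviour
{
    public RTCPeerConnection pc; // assumed to be created/assigned elsewhere

    // GetStats returns an async operation, so it has to be yielded
    // from a coroutine before the report is available.
    public IEnumerator LogByteCounts()
    {
        var op = pc.GetStats();
        yield return op;

        foreach (var stat in op.Value.Stats.Values)
        {
            if (stat is RTCOutboundRTPStreamStats outbound)
                Debug.Log($"sent: {outbound.bytesSent} bytes / {outbound.packetsSent} packets");
            else if (stat is RTCInboundRTPStreamStats inbound)
                Debug.Log($"received: {inbound.bytesReceived} bytes / {inbound.packetsReceived} packets");
        }
    }
}
```

Comparing the outbound counters on the Quest against the inbound counters on the phone is how I spotted the gap.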
My guess is that there is some problem before or during video encoding on the Quest. The cause might be a lack of resources (memory, available threads, I don't know). When I use the same code from my game in a small sample scene, streaming works (but since nothing is happening in the scene, the same image repeats and the amount of data sent is also quite low).
I’m not getting any error messages. It would be helpful if the WebRTC encoder could communicate some log info and errors, if any are occurring of course.
I previously asked for help regarding this problem in the WebRTC FAQ thread as well.
I forgot to mention this, but it seemingly occurs with all codecs. I tried various codec filtering, and also tried forcing the MediaCodec implementations over the internal ones (which, strangely, didn’t work), but the bug occurs regardless of codec. Examples are VP8, VP9, AV1, and H264.
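In case anyone wants to reproduce the codec forcing: in the Unity WebRTC package this can be done with SetCodecPreferences, roughly as below. This is a sketch assuming the package's RTCRtpSender.GetCapabilities API; the helper name is made up, and the mime type string is what you vary per codec:

```csharp
using System.Linq;
using Unity.WebRTC;

public static class CodecFilter
{
    // Restrict a transceiver to a single codec family, e.g.
    // "video/VP8", "video/VP9", "video/AV1" or "video/H264".
    public static void ForceCodec(RTCRtpTransceiver transceiver, string mimeType)
    {
        var capabilities = RTCRtpSender.GetCapabilities(TrackKind.Video);
        var filtered = capabilities.codecs
            .Where(c => c.mimeType == mimeType)
            .ToArray();
        transceiver.SetCodecPreferences(filtered);
    }
}
```

This has to be called on the transceiver before the offer is created, otherwise the SDP won't reflect the preference.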
It works when using my code in a small sample scene.
Yes, that works. I tried that and reported back in this answer. Note that I said that “the last step fails”, which was a conclusion I drew from no video showing, but I wasn’t looking at the logs at that moment. I later edited the same post to report that the steps do in fact complete; that just means there must be a different reason for the stream appearing as a black image.
Note that in the next post I report that Quest 2 to Quest Pro also doesn’t work; in that case both devices show a black image for the remote stream. So it has nothing to do with which side is the server/host, or which sends the offer or the answer.
The conditions for this bug occurring are: (a) within the context of my game and (b) running on Quest. When run on Windows, or when the same WebRTC integration is used in a small sample scene, the bug does not occur.
I made a sample scene for gtk2k, but the bug didn’t occur in it, and he confirmed that it ran fine on his end too. He also made an example of his own that works, and I have talked to another user who has it working on his Quest.
So I cannot replicate the issue outside my own game. Not sure what you can find with your own test scenes but thanks for checking anyway.
For context: I am streaming from the Quest Pro to a PC using Unity Render Streaming. On the Quest side, I have a test scene that streams from an additional camera. On the PC side, I run the web-app sample receiver.
The issue I faced is that the PC side connects to the Quest successfully, but the decoded video is all black, and the bitrate shows around 80 kbit/s, which is way too low.
I found the fix to my issue today. The “Low Overhead Mode (GLES)” (Yes, I am using GLES 3) needs to be disabled. Hope this helps.
On build .51 of my project, render streaming was working great: sending video from the Quest to the web server and viewing it in any browser. But now, on build .57 of my project, without having made any changes to the Render Streaming setup, I’m getting only audio transmitted, no longer video. Any ideas how I can troubleshoot this?
Edit: I discovered what changed: I’m no longer doing a Developer Build. I switched back to a developer build, and the video is streamed as expected. So the question is: how do I get the stream working without a Developer Build?
I have the same issue, although I am using a Pico Neo 3 Pro, so I don’t have the “Low Overhead Mode (GLES)” option in my Pico plugin settings. Still searching for a fix.
I have since given up on using WebRTC in my game so I can’t check or act upon these findings:
But I still wanted to say thanks for sharing. I can confirm that I’m using GLES3 with Low Overhead Mode as well. Not sure if I tried a developer build at the time, but I haven’t used one recently, so that might also have been a reason.
Anyway, even after removing the Unity WebRTC package, I still get WebRTC-related console messages, which seem to come from the Meta SDK. Not sure what the SDK is using WebRTC for, but as far as I know it doesn’t provide any WebRTC functionality for us devs.
I don’t think I will return to the Unity WebRTC package for now, as it isn’t production ready.
I’m experiencing the same issue. I can successfully share audio via an audio player, but the headset’s video stream does not appear on the receiving end. When I run the game in Unity Play mode, everything works fine; the problem only occurs when running on the Quest headset.
Has anyone found a solution or any insights into why this might be happening?