How to simultaneously render multiple (web)cam streams in Android

Hello everyone,
I’m trying to render two cameras of my Android device simultaneously, as 2D textures on two different raw images.
Unfortunately, it seems that the provided WebCamTexture API only allows one camera to play at a time:

WebCamDevice[] devices = WebCamTexture.devices;
rightCam = new WebCamTexture(devices[0].name, Screen.width, Screen.height);
leftCam = new WebCamTexture(devices[1].name, Screen.width, Screen.height);
rightCam.Play();
leftCam.Play(); // on Android, this stops rightCam: only one camera plays at a time

Namely, rightCam becomes inactive (rightCam.isPlaying == false) as soon as leftCam becomes active, and vice versa.

Are there any workarounds, or other plugins/SDKs I can use to make this work? I’ve found nothing online or on the forum. It seems that other people have had similar issues but haven’t found a solution.

Thank you for any suggestion/help!

Solved here: GitHub - franckies/android-multicamera-unity: Access more than one android camera at the same time from Unity through Android JNI.
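As a rough illustration of the JNI approach, a Unity script can hand the current activity to a native Android plugin and ask it to open both cameras through the Camera2 API, which (on devices that support it) is not limited to a single stream. This is only a sketch: the class name `com.example.MultiCameraPlugin` and the method `openCameras` are placeholders, not the actual API of the linked plugin.

```csharp
using UnityEngine;

public class MultiCamBridge : MonoBehaviour
{
    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        // Grab the current Unity activity so the native side can reach the camera service.
        using (var unityPlayer = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = unityPlayer.GetStatic<AndroidJavaObject>("currentActivity"))
        // Hypothetical plugin class; the real name lives in the linked repository.
        using (var plugin = new AndroidJavaObject("com.example.MultiCameraPlugin", activity))
        {
            // Ask the native side to open both cameras; frames would then be
            // delivered back to Unity textures by the plugin.
            plugin.Call("openCameras");
        }
#endif
    }
}
```

The key point is that the multi-camera limitation is worked around below the WebCamTexture layer, in native Android code reached via `AndroidJavaObject`.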