Hi!
I’m able to show the live feed on a RawImage using a WebCamTexture, but the byte[] conversion with .EncodeToPNG() (or .EncodeToJPG()) scrambles the image.
The same code works when sending an Android device’s webcam feed over to the HoloLens (the image doesn’t get scrambled).
Does anyone have an idea why only this specific resolution of 896x504 works when converting to a byte[], or is this a bug?
What I found about this resolution is that it is used on the 1st HoloLens in low power mode.
But I’m using the HoloLens 2…
Expected behavior
Encoding to a byte[] should produce the same image as is being rendered on the RawImage.
Actual behavior
Image gets scrambled.
Steps to reproduce
// Grab a webcam and feed it into a WebCamTexture
WebCamDevice[] webcamDevices = WebCamTexture.devices;
for (int i = 0; i < webcamDevices.Length; ++i)
{
    print("Webcam available: " + webcamDevices[i].name);
    webCamTexture = new WebCamTexture(webcamDevices[i].name, width, height, fps);
    if (rawImage.texture == null)
    {
        rawImage.texture = webCamTexture;
        webCamTexture.Play();
        break;
    }
}
// Using the WebCamTexture, copy its Color[] over to a Texture2D and encode it
webcamTexture = new Texture2D(webcamManager.width, webcamManager.height);
if (webcamTexture && webcamManager && webcamManager.webCamTexture)
    webcamTexture.SetPixels(webcamManager.webCamTexture.GetPixels());
byte[] byteArray = webcamTexture.EncodeToJPG();
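For reference, here is a condensed, self-contained sketch of the same pipeline (the class name and layout are mine, not from the actual project). It differs from the snippets above in two ways worth ruling out: it waits for the WebCamTexture to report a real frame size before reading pixels, and it sizes the Texture2D from the texture's actual width/height rather than the requested ones, since the device may deliver a different resolution than requested:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical repro script: capture one webcam frame and encode it to JPG.
public class WebcamEncodeRepro : MonoBehaviour
{
    public RawImage rawImage;
    private WebCamTexture webCamTexture;

    private IEnumerator Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        if (devices.Length == 0)
            yield break;

        webCamTexture = new WebCamTexture(devices[0].name, 896, 504, 30);
        rawImage.texture = webCamTexture;
        webCamTexture.Play();

        // The texture reports a placeholder size (16x16) until the first
        // real frame arrives, so wait before reading dimensions or pixels.
        while (webCamTexture.width <= 16)
            yield return null;

        // Size the snapshot from the *reported* dimensions, not the requested ones.
        var snapshot = new Texture2D(webCamTexture.width, webCamTexture.height,
                                     TextureFormat.RGBA32, false);
        snapshot.SetPixels(webCamTexture.GetPixels());
        snapshot.Apply();

        byte[] byteArray = snapshot.EncodeToJPG();
        Debug.Log("Encoded " + byteArray.Length + " bytes at "
                  + webCamTexture.width + "x" + webCamTexture.height);
    }
}
```

This only runs inside Unity (it depends on the engine's frame loop), so treat it as a sketch of the encode path rather than a standalone test.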
Unity editor version
I tried with 2020.3.14f, 16f and 18f
Mixed Reality Toolkit release version
2.7.2.0