Hello All,
I have a question that I could not solve on my own.
I just want to make a simple VR app that shows the feed from my Android camera on a RawImage using a WebCamTexture. Here are the steps I followed:
- I followed the tutorial shown here and ended up with the script below:
https://www.youtube.com/watch?v=c6NXkZWXHnc
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class textureFromWebcam : MonoBehaviour
{
    private bool camAvailable;
    private WebCamTexture backCam;
    private Texture defaultBackground;

    public RawImage background;
    public AspectRatioFitter aspectRatio;

    // Use this for initialization
    void Start()
    {
        defaultBackground = background.texture;
        WebCamDevice[] devices = WebCamTexture.devices;

        if (devices.Length == 0)
        {
            Debug.Log("no camera detected");
            camAvailable = false;
            return;
        }

        // pick the first back-facing camera
        for (int i = 0; i < devices.Length; i++)
        {
            if (!devices[i].isFrontFacing)
            {
                backCam = new WebCamTexture(devices[i].name);
                break;
            }
        }

        if (backCam == null)
        {
            Debug.Log("unable to find backCam");
            return;
        }

        backCam.Play();
        background.texture = backCam;
        camAvailable = true;
    }

    // Update is called once per frame
    void Update()
    {
        if (!camAvailable)
        {
            return;
        }
    }
}
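For context, the aspectRatio field above is not used yet; as far as I remember, the tutorial drives it from Update() to keep the camera image upright and correctly proportioned. This is only my rough reconstruction of that part (it reuses the backCam, background and aspectRatio fields from the script above, and it would replace the empty Update() there), not something I have verified on the device:

    void Update()
    {
        if (!camAvailable)
        {
            return;
        }

        // match the RawImage's aspect ratio to the camera feed
        float ratio = (float)backCam.width / (float)backCam.height;
        aspectRatio.aspectRatio = ratio;

        // flip vertically if the platform delivers a mirrored image
        float scaleY = backCam.videoVerticallyMirrored ? -1f : 1f;
        background.rectTransform.localScale = new Vector3(1f, scaleY, 1f);

        // rotate the RawImage to compensate for the camera's orientation
        int orient = -backCam.videoRotationAngle;
        background.rectTransform.localEulerAngles = new Vector3(0f, 0f, orient);
    }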
- I set the Canvas Render Mode to "World Space" so I can look around and see the RawImage floating in the scene (a minimal sketch of that setup is in the first snippet after this list).
- I tried the app in the editor and it works fine.
- I built it to my Android device with Player Settings -> XR Settings -> Virtual Reality Supported checked, using the Cardboard SDK.
- The app runs successfully on the device, but the image from my WebCamTexture does not show anything.
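To clarify the world-space step above: in my project this is done in the Inspector, but it amounts to roughly the following (a minimal sketch; the class name worldSpaceCanvasSetup and the field names are just placeholders I made up for illustration):

using UnityEngine;

public class worldSpaceCanvasSetup : MonoBehaviour
{
    public Canvas canvas;     // the canvas holding the RawImage
    public Camera vrCamera;   // the Cardboard/main camera

    void Start()
    {
        // render the canvas in the 3D scene instead of as a screen overlay
        canvas.renderMode = RenderMode.WorldSpace;
        canvas.worldCamera = vrCamera;

        // place the canvas a couple of meters in front of the camera, facing it
        canvas.transform.position = vrCamera.transform.position + vrCamera.transform.forward * 2f;
        canvas.transform.rotation = Quaternion.LookRotation(canvas.transform.position - vrCamera.transform.position);

        // scale it down so a pixel-sized RectTransform is a reasonable world size
        canvas.transform.localScale = Vector3.one * 0.005f;
    }
}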
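One thing I have not ruled out is the Android camera permission, since a WebCamTexture stays blank if the CAMERA permission was denied on the device. I am not sure this is the cause, but here is a sketch of how I could check it at runtime (assuming Unity 2018.3 or newer, where the UnityEngine.Android.Permission API exists; the class name cameraPermissionCheck is just a placeholder):

#if UNITY_ANDROID
using UnityEngine;
using UnityEngine.Android;

public class cameraPermissionCheck : MonoBehaviour
{
    void Start()
    {
        // ask for the CAMERA permission if it has not been granted yet
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Debug.Log("camera permission not granted, requesting it");
            Permission.RequestUserPermission(Permission.Camera);
        }
        else
        {
            Debug.Log("camera permission already granted");
        }
    }
}
#endif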
Does anyone know why this is happening, or is it a bug?
Thank you.