Disclaimer: I am new to Unity and only know a little about it. I have created a basic scene containing an ARSession and an AR Session Origin, whose AR Camera is tagged as Main Camera. I also have a Cube in the scene in front of the camera, as shown:
Question: What I want to do is capture the live raw camera feed that ARCore sees through my phone's camera and store each frame in a matrix that I can later process with OpenCV. Also, is it possible to apply that live feed to the Cube as a texture?
My basic code, with the libraries I am using, is below:
    // Note: I originally wrapped this class in "namespace OpenCvSharp",
    // which conflicts with the "using OpenCvSharp;" directive, so I removed it.
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using OpenCvSharp;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.Experimental.XR;

    public class Sourcing_Camera_Feed : MonoBehaviour
    {
        static WebCamTexture Cam;
        private ARSessionOrigin arOrigin;
        private ARSession arSession;

        void Start()
        {
            // Create and start the webcam texture once, instead of
            // re-checking it every frame in Update().
            Cam = new WebCamTexture();
            GetComponent<Renderer>().material.mainTexture = Cam;
            Cam.Play();
        }
    }
With the above program I was able to get the live feed as a texture in the Unity editor through my webcam, but when I run it with ARCore on my phone, I only see the first frame, like so:
I want the live raw feed, not just the first frame. Can anyone suggest code that achieves this?
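From what I've read so far, `WebCamTexture` may conflict with ARCore because ARCore owns the device camera, and the intended way to get CPU access to frames seems to be `ARCameraManager`'s `frameReceived` event together with `TryAcquireLatestCpuImage`. Below is a rough, untested sketch of what I think this would look like, based on the Unity documentation sample. It assumes ARFoundation 4.x; the `cubeRenderer` field and the OpenCvSharp `Mat` wrapping at the end are my own guesses, and the `unsafe` block requires "Allow 'unsafe' Code" to be enabled in Player Settings:

```csharp
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using OpenCvSharp;

public class CameraFeedToCube : MonoBehaviour
{
    public ARCameraManager cameraManager; // drag the AR Camera's ARCameraManager here
    public Renderer cubeRenderer;         // drag the Cube's Renderer here

    Texture2D texture;

    void OnEnable()  { cameraManager.frameReceived += OnFrameReceived; }
    void OnDisable() { cameraManager.frameReceived -= OnFrameReceived; }

    unsafe void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Grab the latest camera image on the CPU (ARFoundation 4.x API).
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        // Convert the native YUV image to RGBA32 (pattern from the Unity docs sample).
        var conversionParams = new XRCpuImage.ConversionParams
        {
            inputRect        = new RectInt(0, 0, image.width, image.height),
            outputDimensions = new Vector2Int(image.width, image.height),
            outputFormat     = TextureFormat.RGBA32,
            transformation   = XRCpuImage.Transformation.MirrorY
        };

        int size = image.GetConvertedDataSize(conversionParams);
        var buffer = new NativeArray<byte>(size, Allocator.Temp);
        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
        image.Dispose(); // release the native image as soon as possible

        // Show the live feed on the cube.
        if (texture == null)
        {
            texture = new Texture2D(conversionParams.outputDimensions.x,
                                    conversionParams.outputDimensions.y,
                                    TextureFormat.RGBA32, false);
            cubeRenderer.material.mainTexture = texture;
        }
        texture.LoadRawTextureData(buffer);
        texture.Apply();

        // My guess at wrapping the same pixel buffer in an OpenCvSharp Mat
        // (4-channel, 8-bit) without copying, for later OpenCV processing.
        var mat = new Mat(conversionParams.outputDimensions.y,
                          conversionParams.outputDimensions.x,
                          MatType.CV_8UC4,
                          new IntPtr(buffer.GetUnsafePtr()));
        // ... run OpenCV operations on mat here, before buffer is disposed ...

        buffer.Dispose();
    }
}
```

I'm not sure whether the `Mat` must be cloned before `buffer.Dispose()` is called, or whether the conversion should be throttled rather than done on every frame. Does this look like the right direction?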
Thanks in advance.