ARKit and WebCamTexture

Hi,

Is it possible to use WebCamTexture and ARKit together?

I have a script that creates a WebCamTexture that I am trying to use to take a picture of the scene. The script draws the texture in its OnGUI() method, and I just attach it to an empty GameObject. When I try the app on an iPad, I see the Unity screen and then a blank screen. Xcode logs an error mentioning NSInvalidArgumentException and AVCaptureDevice.
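Roughly, the script does something like this (simplified sketch, class name is just a placeholder):

```csharp
using UnityEngine;

// Minimal WebCamTexture test script (simplified reconstruction, not the exact code).
public class WebCamCapture : MonoBehaviour
{
    private WebCamTexture webcamTexture;

    void Start()
    {
        // Create a texture backed by the default device camera and start streaming into it.
        webcamTexture = new WebCamTexture();
        webcamTexture.Play();
    }

    void OnGUI()
    {
        // Draw the live camera feed across the whole screen.
        if (webcamTexture != null && webcamTexture.isPlaying)
        {
            GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), webcamTexture, ScaleMode.ScaleToFit);
        }
    }

    void OnDestroy()
    {
        if (webcamTexture != null)
        {
            webcamTexture.Stop();
        }
    }
}
```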

Any ideas?

Thanks

The error mentions AVCaptureDevice and an unsupported frame duration.

FYI, the whole point is to be able to take a picture of my scene.

Up

Hi Dmitry-K,

When the ARSession is running, it takes over the device’s camera, and nothing else can use the camera while the session is running.
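If the end goal is just a still image of what is already on screen, one option is to read back the frame Unity has rendered instead of opening the camera yourself. A rough sketch of that idea (class name is a placeholder; run it with StartCoroutine):

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Sketch: grab a still of the rendered AR scene without touching AVCaptureDevice directly.
public class ARScreenshot : MonoBehaviour
{
    public IEnumerator CaptureFrame(string fileName)
    {
        // Wait until rendering has finished so the ARKit camera background is included.
        yield return new WaitForEndOfFrame();

        var frame = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        frame.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        frame.Apply();

        // Save the captured frame as a PNG in the app's writable data folder.
        File.WriteAllBytes(Path.Combine(Application.persistentDataPath, fileName), frame.EncodeToPNG());
        Destroy(frame);
    }
}
```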

My responses in these other two threads should give you more insight into solving your problem.

Todd

Good Afternoon,

I have a similar, but slightly different, problem.
In my case, I’m trying to access the front camera feed using WebCamTexture in one scene, while in another scene I use ARKit normally.
ARKit works fine, but in the other scene the texture I get is blank.
Do I need to do anything special to enable the camera?
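For reference, the front-camera scene does something like this (simplified sketch, class name is a placeholder):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: request camera access and pick the front-facing device before creating
// the WebCamTexture. If the AR session from the other scene is still holding the
// camera, the texture may still come up blank.
public class FrontCameraFeed : MonoBehaviour
{
    private WebCamTexture frontCamTexture;

    IEnumerator Start()
    {
        // On iOS the first camera use triggers the permission prompt; wait for the result.
        yield return Application.RequestUserAuthorization(UserAuthorization.WebCam);
        if (!Application.HasUserAuthorization(UserAuthorization.WebCam))
            yield break;

        // Find the front-facing camera among the available devices.
        foreach (WebCamDevice device in WebCamTexture.devices)
        {
            if (device.isFrontFacing)
            {
                frontCamTexture = new WebCamTexture(device.name);
                break;
            }
        }

        if (frontCamTexture != null)
        {
            frontCamTexture.Play();
            // Show the feed on this object's material.
            GetComponent<Renderer>().material.mainTexture = frontCamTexture;
        }
    }

    void OnDisable()
    {
        if (frontCamTexture != null)
            frontCamTexture.Stop();
    }
}
```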