Is it possible to use external camera and gyro feeds with ARCore?

I was wondering whether it is at all possible to use an external (synced) video feed and gyroscope (e.g. from a Wi-Fi-connected drone) with ARCore? In particular, can those be used as drop-in replacements for the Android phone's own camera and gyroscope?

From the basic reading I’ve done, it seems like AR Foundation depends entirely on the device’s ARCore installation for most of the low-level tracking logic, but I’m not at all sure where the data hand-off points between the two are in the API.

E.g. the ARCore docs say it can run from pre-recorded video by feeding in the raw recorded data, which I would assume Unity reads off disk before forwarding to ARCore. That would imply a fairly open bi-directionality for inputs between the two systems, and potentially a nice input vector for live feeds from other sources — but I could be entirely wrong if ARCore itself loads that data from disk.

I am also curious about how the gyroscope data is encoded within the MP4, as mentioned in the docs: Google ARCore XR Plug-in | Google ARCore XR Plugin | 5.0.7

Google maintains its list of supported ARCore devices here: ARCore supported devices | Google for Developers

See Google’s docs for more information about session recording and playback: Recording and Playback overview | ARCore | Google for Developers
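To make the recording/playback path concrete, here is a minimal sketch of driving it from Unity, assuming the `ARCoreSessionSubsystem` API exposed by the Google ARCore XR Plugin (check the names against your plugin version — this is an illustration, not verified code):

```csharp
// Sketch only: recording an ARCore session to an MP4 and playing it back,
// via the Google ARCore XR Plugin's ARCoreSessionSubsystem.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARCore;

public class SessionRecorder : MonoBehaviour
{
    [SerializeField] ARSession m_Session;

    public void StartRecording(string mp4Path)
    {
        // The recording API is only available when ARCore is the active provider.
        if (m_Session.subsystem is ARCoreSessionSubsystem subsystem)
        {
            using (var config = new ArRecordingConfig(subsystem.session))
            {
                config.SetMp4DatasetFilePath(subsystem.session, mp4Path);
                subsystem.StartRecording(config);
            }
        }
    }

    public void StartPlayback(string mp4Path)
    {
        // Replays the recorded camera + sensor data as if it were live input.
        if (m_Session.subsystem is ARCoreSessionSubsystem subsystem)
            subsystem.StartPlayback(mp4Path);
    }
}
```

Note that playback only accepts datasets ARCore itself recorded — the IMU and camera tracks are written by ARCore in its own format, which is why this isn’t a general-purpose injection point for external feeds.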

It is not possible to inject a custom camera feed into ARCore at the application layer. However, it is possible — though not at all easy — to implement a custom provider for AR Foundation that uses your camera. In this scenario, you are on the hook to reimplement the ARCore features you need using your own camera and your own tracking algorithms. We document the steps in this process here: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@5.0/manual/arsubsystems/arsubsystems.html#implementing-a-provider
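For a rough sense of what a custom provider involves, here is a skeleton of a camera subsystem registration, assuming AR Foundation 5.0’s `XRCameraSubsystem` extension points. The `DroneCamera*` names are hypothetical; only the `UnityEngine.XR.ARSubsystems` types come from the package, and a real provider would also need to implement tracking, textures, and the rest of the subsystem surface:

```csharp
// Sketch only: the shape of a custom AR Foundation camera provider.
// DroneCameraSubsystem / DroneCameraProvider are hypothetical names.
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

public class DroneCameraSubsystem : XRCameraSubsystem
{
    class DroneCameraProvider : Provider
    {
        public override bool TryGetFrame(XRCameraParams cameraParams,
                                         out XRCameraFrame cameraFrame)
        {
            // Here you would populate cameraFrame from your Wi-Fi video stream.
            cameraFrame = default;
            return false;
        }
    }

    // Registers the subsystem with Unity before any subsystems start.
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.SubsystemRegistration)]
    static void Register()
    {
        XRCameraSubsystem.Register(new XRCameraSubsystemCinfo
        {
            id = "Drone-Camera",
            providerType = typeof(DroneCameraProvider),
            subsystemTypeOverride = typeof(DroneCameraSubsystem)
        });
    }
}
```

The camera subsystem is only one piece — pose tracking (the session subsystem) is the hard part, since you would be replacing ARCore’s visual-inertial odometry entirely.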
