I was wondering whether it's possible to use an external (synced) video feed and gyroscope (e.g. from a Wi-Fi-connected drone) with ARCore? In particular, could those be used as drop-in replacements for the Android phone's own camera and gyroscope?
From the basic amount I've read on this, it seems like AR Foundation depends entirely on the device's ARCore installation for most of the low-level tracking logic, but I'm not at all sure where the data hand-off points are in the API.
E.g. the ARCore docs say it can run from pre-recorded video by feeding in the raw recorded feeds, which I would assume means Unity reads them off disk before forwarding that data to ARCore. That would imply a fairly open bidirectional channel for inputs between the two systems, and potentially a nice input vector for different live feeds, but I could be entirely wrong if ARCore is itself loading that data from disk.
I'm also curious how the gyroscope data is encoded within an MP4, as is mentioned in the docs: Google ARCore XR Plug-in | Google ARCore XR Plugin | 5.0.7
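On the MP4 question, my understanding from the ARCore recording/playback documentation is that a recorded dataset is an ordinary ISO BMFF (MP4) container: the camera video is one track, and sensor samples (plus any app-defined data) are carried in additional tracks in the same file, all described under the `moov` box. As a rough illustration of that container layout, here is a minimal sketch (plain Java, no ARCore dependency) that walks the top-level boxes of an MP4 buffer; it is a simplified parser for illustration only (it ignores 64-bit `largesize` boxes and `uuid` box types):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Walks the top-level "boxes" of an ISO BMFF (MP4) buffer.
// Each box starts with a 4-byte big-endian size (including the 8-byte
// header) followed by a 4-byte ASCII type code. Track metadata, for the
// video track and any extra data tracks alike, lives under 'moov'.
public class Mp4Boxes {
    public static List<String> topLevelBoxTypes(byte[] data) {
        List<String> types = new ArrayList<>();
        ByteBuffer buf = ByteBuffer.wrap(data); // big-endian by default
        while (buf.remaining() >= 8) {
            long size = Integer.toUnsignedLong(buf.getInt());
            byte[] fourcc = new byte[4];
            buf.get(fourcc);
            types.add(new String(fourcc, StandardCharsets.US_ASCII));
            // Stop on malformed or truncated boxes rather than overrun.
            if (size < 8 || size - 8 > buf.remaining()) break;
            buf.position(buf.position() + (int) (size - 8)); // skip payload
        }
        return types;
    }
}
```

Running this over a real recording should list boxes like `ftyp`, `moov`, and `mdat`; descending into `moov`'s `trak` children would then show one entry per track. How ARCore serializes the IMU samples inside its own data track is not something the public docs spell out, as far as I can tell.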