Hello everyone!
I'm seeing some weird behavior with the ARFoundation libraries.
I build the same Unity project for both Android (ARCore / OpenGLES) and iOS (ARKit / Metal).
Apart from minor lighting differences between the two OSes (different lighting estimation), I would expect the same result from a cross-platform augmented reality framework.
On Android, everything is fine.
But when I enable AR mode on iOS (and only on iOS), the camera background tears and AR objects lag behind the device's movements (a boat-like / jelly wobble effect).
Here is a video illustrating my issue: ARfoundation iOS Issue - YouTube
I suspect the real-time video stream is not being processed properly, which temporarily breaks tracking… but that should be handled directly by the AR libraries, right?
My application's target frame rate is set to 60.
On a 5th-gen iPad the frame rate can drop as low as 12 fps… but more powerful devices (running above 30 fps) show the same issue! So performance doesn't seem to be the only factor here.
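For reference, here is roughly how I set the target frame rate (a minimal sketch; on iOS, Unity defaults to around 30 fps unless `Application.targetFrameRate` is raised explicitly, and `vSyncCount` must be 0 for it to take effect):

```csharp
using UnityEngine;

// Attached to a startup object; raises the iOS frame rate cap.
public class FrameRateConfig : MonoBehaviour
{
    void Start()
    {
        // targetFrameRate is ignored unless vSync is disabled.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 60;
    }
}
```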
I tried changing the update type on the TrackedPoseDriver (the script attached to the AR Camera), but nothing solved my problem.
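In case it helps anyone reproduce: this is the kind of change I tried on the pose driver (a sketch only; `PoseDriverConfig` is my own hypothetical wrapper, but `TrackedPoseDriver.UpdateType` and its `UpdateAndBeforeRender` value are the actual Unity API from `UnityEngine.SpatialTracking`):

```csharp
using UnityEngine;
using UnityEngine.SpatialTracking;

// Attached to the AR Camera; applies the tracked pose both during
// Update and again just before rendering, which is supposed to
// minimize camera/background lag.
public class PoseDriverConfig : MonoBehaviour
{
    void Start()
    {
        var driver = GetComponent<TrackedPoseDriver>();
        if (driver != null)
        {
            driver.updateType = TrackedPoseDriver.UpdateType.UpdateAndBeforeRender;
        }
    }
}
```

I also tried the `Update` and `BeforeRender` values on their own, with no visible difference.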
Has anyone encountered this issue, either with the ARFoundation libraries or with a direct ARKit integration?
Thanks in advance.