I am building a multiplayer AR application that is supposed to localize users in a real-world space. When we first run the localization method, the objects are anchored directly to each user's phone. As the experience continues, however, we see noticeable drift from the initial placement of the marker used to localize. Here are some things that seem to affect it:
-Movement speed in the real world (walking, jogging, sprinting).
-Looking at features other than the floor (ceiling, walls, cluttered areas like a desk).
-Keeping ARPlaneManager enabled after the marker is placed (see the sketch below this list).
-Forcing targetFrameRate to 30 vs. 60.
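For context, when we do disable plane detection after placement, it looks roughly like this (a minimal sketch; `planeManager` is just a serialized reference to our ARPlaneManager, and your cleanup may look different):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneDetectionToggle : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assigned in the Inspector

    // Called once the localization marker has been placed.
    public void StopPlaneDetection()
    {
        // Disabling the manager stops plane detection and plane updates.
        planeManager.enabled = false;

        // Optionally hide the planes that were already detected.
        foreach (var plane in planeManager.trackables)
            plane.gameObject.SetActive(false);
    }
}
```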
Are there any other known sources of plane drift that I haven't mentioned? Should some of these factors affect drift less than they do? In most cases the drift is only a few inches, but it has been more drastic in the past, with objects sometimes even flying off screen.
Any information is appreciated, thank you
Yep those are some of the big ones! A couple others would be:
- Lighting conditions in the environment (too dark or rapidly changing = poor tracking)
- Characteristics of the environment (not many feature points = poor tracking)
- Lens cleanliness (blurry lens due to dirt = poor tracking)
- Hardware specs (i.e., an ARKit device with LiDAR = better tracking than non-LiDAR)
- Platform (you should expect variance in drift across platforms)

A quick way to catch the "poor tracking" cases at runtime is sketched below.
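Since several of these boil down to poor tracking, it can help to log tracking-quality changes so you can correlate them with the drift you're seeing. Something roughly like this (a minimal sketch using AR Foundation's session state events; the class name is just for illustration):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TrackingQualityLogger : MonoBehaviour
{
    void OnEnable()  => ARSession.stateChanged += OnStateChanged;
    void OnDisable() => ARSession.stateChanged -= OnStateChanged;

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        // SessionTracking is full tracking; anything less usually coincides with drift.
        if (args.state != ARSessionState.SessionTracking)
            Debug.Log($"Tracking degraded: {args.state} (reason: {ARSession.notTrackingReason})");
    }
}
```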
How much of a difference did 60fps make? And was this just on iOS, or also on Android (since ARCore's recommended frame rate is 30fps and ARKit's is 60)?
60fps seems to make it more consistent, but this could be a placebo-type effect. The overall performance is obviously better, considering we are rendering a relatively basic scene that can handle 60fps consistently, but I am unsure how much this affects plane tracking.
If you want to do this yourself, be sure to disable the 'Match Frame Rate' option on your AR Session object for Android, as that setting forces Android platforms to 30fps. iOS runs at 60fps by default, so you don't need to change anything there.
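In code that looks roughly like this (a sketch; the property is matchFrameRateRequested in recent AR Foundation versions and matchFrameRate in older ones, so check your version):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class FrameRateSetup : MonoBehaviour
{
    [SerializeField] ARSession arSession; // assigned in the Inspector

    void Start()
    {
        // Stop the session from forcing the platform's preferred rate (30fps on Android).
        arSession.matchFrameRateRequested = false;

        // Ask for 60fps; the device may still cap lower.
        Application.targetFrameRate = 60;
    }
}
```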