I tested my app using the Geospatial API near a road covered by Google Street View and found that the horizontal accuracy was well below 10 m. To make more use of this, I then took photos with a 360-degree camera and registered more than 10 of them within about 100 m. When I checked again about two weeks later, the accuracy had degraded to around 30 degrees, and VPS did not seem to be working.
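For reference, this is roughly how I was reading those accuracy values in Unity (a minimal sketch assuming ARCore Extensions for AR Foundation; the inspector reference and log format are just illustrative):

```csharp
// Minimal sketch for logging Geospatial API accuracy (ARCore Extensions
// for AR Foundation; the scene setup here is illustrative).
using Google.XR.ARCoreExtensions;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;

public class GeospatialAccuracyLogger : MonoBehaviour
{
    // Assigned in the inspector (hypothetical scene setup).
    public AREarthManager EarthManager;

    void Update()
    {
        // Geospatial values are only meaningful while Earth is tracking.
        if (EarthManager.EarthTrackingState != TrackingState.Tracking)
            return;

        GeospatialPose pose = EarthManager.CameraGeospatialPose;

        // HorizontalAccuracy is in meters, OrientationYawAccuracy in
        // degrees; this is where I saw values go from <10 m to ~30 degrees.
        Debug.Log($"Horizontal accuracy: {pose.HorizontalAccuracy:F1} m, " +
                  $"yaw accuracy: {pose.OrientationYawAccuracy:F1} deg");
    }
}
```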
This question is closer to ARCore than Unity, but ARCore does not have a suitable community forum, so I’m asking it here. I would appreciate it if anyone could answer, or if you know a community or contact where I can ask these questions, please let me know.
From what I suspect, the VPS relies on image segmentation and 3D geodata. In my experience it is high quality if the location is close to a public street (so Street View is available) and if Google has a high-quality LiDAR-based model; you can see that in Google Maps if you can tilt the view and it is 3D. The geo source is also important, e.g. whether your city or district provides high-quality data about building shapes. Since it uses segmentation, it is important that the system can see distinct houses and the shape and direction of the street. I suspect the actual color/texture is less important; that way they don’t rely on seasonal changes, snow, or the ice-cream truck parked in front of your house.
The 360 images most likely have no influence, since Google can’t validate whether your geopositioning is correct. Otherwise you could move the street for everyone by uploading a photosphere 100 m from the correct position.
So the quality really depends on the place and the scenery around it.
But I am curious what kind of situation you are in. Maybe we can learn more about how the system works from that.
What we do in places where geopositioning is bad, but where we can manually set a position - like you did with the 360 spheres - is use persistent Cloud Anchors as a fallback, roughly as in the sketch below.
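A minimal sketch of that fallback, assuming ARCore Extensions for AR Foundation (the manager reference, the 365-day TTL, and where you store the returned ID are all illustrative, and the synchronous host/resolve calls are the older API):

```csharp
// Minimal sketch of a persistent Cloud Anchor fallback (ARCore
// Extensions for AR Foundation; scene references are illustrative).
using Google.XR.ARCoreExtensions;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class CloudAnchorFallback : MonoBehaviour
{
    public ARAnchorManager AnchorManager;   // assigned in the inspector

    private ARCloudAnchor _cloudAnchor;

    // Host an existing local anchor for up to a year so it survives
    // across sessions (requires an authorized Google Cloud project).
    public void Host(ARAnchor localAnchor)
    {
        _cloudAnchor = AnchorManager.HostCloudAnchor(localAnchor, 365);
    }

    // In a later session: re-localize against the stored anchor ID.
    public void Resolve(string cloudAnchorId)
    {
        _cloudAnchor = AnchorManager.ResolveCloudAnchorId(cloudAnchorId);
    }

    void Update()
    {
        if (_cloudAnchor == null)
            return;

        // Poll until hosting/resolving finishes; persist the ID
        // somewhere (server, PlayerPrefs, ...) so it can be resolved
        // next time regardless of geospatial accuracy.
        CloudAnchorState state = _cloudAnchor.cloudAnchorState;
        if (state == CloudAnchorState.Success)
        {
            Debug.Log($"Cloud Anchor ready, id: {_cloudAnchor.cloudAnchorId}");
            _cloudAnchor = null;   // stop polling once done
        }
        else if (state != CloudAnchorState.TaskInProgress)
        {
            Debug.LogWarning($"Cloud Anchor failed: {state}");
            _cloudAnchor = null;
        }
    }
}
```

The idea is that the anchor’s pose comes from ARCore’s own feature map rather than the geospatial fix, so it keeps working even when the reported geospatial accuracy is poor.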