Trying out the Object Tracking but need some guidance - new to iOS development.
I have my prefab being instantiated into the scene when the world object is detected, however the object stays in a fixed position. Is it possible to anchor it to the world object as well?
What I am trying to do is have a cup in the real world that is scanned by the phone and have a 3D model of the same cup appear in the scene. Ideally, I would like to be able to scale the virtual cup to match the real-world cup, and have the virtual cup constantly update its position to match that of the real-world cup.
Is this possible? Any tips, links or suggestions on how to achieve this? Also, if there is a third party asset that does that even better.
For multiple objects, you’ll want to subscribe to the ARTrackedObjectManager’s trackedObjectsChanged event, then choose which prefab you want to spawn based on which object was added. The event args contain the added ARTrackedObjects; Instantiate your textured mesh prefab as a child of the ARTrackedObject’s GameObject to ensure that the mesh position is updated via tracking.
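A minimal sketch of that subscription pattern, assuming an ARTrackedObjectManager on your XR Origin; the prefab field and the "Cup" reference object name are placeholders for whatever is in your reference object library:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TrackedObjectSpawner : MonoBehaviour
{
    [SerializeField] ARTrackedObjectManager trackedObjectManager;
    [SerializeField] GameObject cupPrefab; // hypothetical prefab for the cup

    void OnEnable()  => trackedObjectManager.trackedObjectsChanged += OnTrackedObjectsChanged;
    void OnDisable() => trackedObjectManager.trackedObjectsChanged -= OnTrackedObjectsChanged;

    void OnTrackedObjectsChanged(ARTrackedObjectsChangedEventArgs args)
    {
        foreach (var trackedObject in args.added)
        {
            // Choose a prefab based on which reference object was detected.
            if (trackedObject.referenceObject.name == "Cup")
            {
                // Parenting to the ARTrackedObject keeps the mesh
                // following the real object's pose as tracking updates.
                Instantiate(cupPrefab, trackedObject.transform);
            }
        }
    }
}
```

Instantiating with the tracked object's transform as parent is what keeps the virtual cup glued to the real one; a prefab spawned at a fixed world position will not follow tracking.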
What do you mean by “Xr world coordinate”? You cannot move Unity’s world space, but you can move the XR Origin’s GameObject. More info about XROrigin here: Device tracking | AR Foundation | 6.0.3
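To illustrate the point about moving the XR Origin rather than world space, a hedged sketch (assuming the XROrigin component from Unity.XR.CoreUtils; the offset parameter is just an example):

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;

public class OriginShifter : MonoBehaviour
{
    [SerializeField] XROrigin xrOrigin; // the XR Origin in your scene

    // You can't move Unity's world space, but translating the XR Origin's
    // GameObject shifts where the camera and all trackables appear in it.
    public void ShiftOrigin(Vector3 offset)
    {
        xrOrigin.transform.position += offset;
    }
}
```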