I have saved an “.arobject” point-data file from a scan of a real-world object using my iPhone, and would like some pointers on how to use it within Unity.
I have imported the “GenerateObjectAnchor.cs” script from a demo scene; however, it produces errors.
I have both AR Foundation and ARKit installed.
The aim is to attach a virtual object or plane to a real-world 3D object, rather than to a 2D target image, when viewed through the device camera.
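For reference, this is roughly what I was expecting to end up with in place of the broken script - a minimal sketch based on the AR Foundation 4.x / 5.x ARTrackedObjectManager API (I believe the event was renamed to trackablesChanged in AR Foundation 6), where contentPrefab is just a placeholder for whatever gets attached:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

[RequireComponent(typeof(ARTrackedObjectManager))]
public class AttachContentToScannedObject : MonoBehaviour
{
    [SerializeField] GameObject contentPrefab;   // the virtual object/plane to attach
    ARTrackedObjectManager manager;
    readonly Dictionary<string, GameObject> spawned = new Dictionary<string, GameObject>();

    void Awake() => manager = GetComponent<ARTrackedObjectManager>();

    // trackedObjectsChanged in AR Foundation 4.x/5.x (renamed in 6.x).
    void OnEnable()  => manager.trackedObjectsChanged += OnChanged;
    void OnDisable() => manager.trackedObjectsChanged -= OnChanged;

    void OnChanged(ARTrackedObjectsChangedEventArgs args)
    {
        foreach (var trackedObject in args.added)
        {
            // Parent the content to the tracked object so it follows the pose
            // ARKit reports for the scanned real-world object.
            var content = Instantiate(contentPrefab, trackedObject.transform);
            spawned[trackedObject.referenceObject.name] = content;
        }

        foreach (var trackedObject in args.updated)
        {
            // Hide the content while tracking is lost or limited.
            if (spawned.TryGetValue(trackedObject.referenceObject.name, out var content))
                content.SetActive(trackedObject.trackingState == TrackingState.Tracking);
        }
    }
}
```

The idea being that parenting the instantiated content to the ARTrackedObject should mean it inherits whatever pose ARKit reports for the scanned object - please correct me if that is the wrong approach.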
Thank you - additionally, unless I’m missing something, it would be handy to have a visual representation of the scanned point cloud in the viewport, in order to make object placement more accurate.
So far it seems to be a matter of trial and error to get objects/planes to line up.
Thank you - firstly, I have managed to locate an Object Tracking demo scene from AR Foundation that allows me to achieve the intended goal of having an AR object track to a scanned real-world 3D object.
However, is it possible to display the “.arobject” in the Unity Scene view, since currently it only exists as an asset within an “AR Tracked Object Manager”?
Being able to actually see the scanned object within the Scene view would make aligning virtual elements more accurate, whereas at the moment it is very much hit and miss.
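In the meantime, the best workaround I can think of (purely my own stop-gap, not anything built into AR Foundation) is an editor-only gizmo that draws the hand-measured bounds of the physical object at the tracked object’s origin, so there is at least something to line virtual elements up against in the Scene view:

```csharp
using UnityEngine;

// Editor-only alignment aid: draws a wire box matching the physical object's
// measured size so virtual content can be roughly lined up in the Scene view.
public class ScannedObjectProxyGizmo : MonoBehaviour
{
    // Physical size of the scanned object in metres (measured by hand).
    [SerializeField] Vector3 physicalSize = new Vector3(0.2f, 0.1f, 0.15f);
    // Offset of the box centre relative to the scan origin, if needed.
    [SerializeField] Vector3 centreOffset = Vector3.zero;

    void OnDrawGizmos()
    {
        Gizmos.color = Color.cyan;
        Gizmos.matrix = transform.localToWorldMatrix;
        Gizmos.DrawWireCube(centreOffset, physicalSize);
    }
}
```

Obviously that is only a box approximation, so being able to view the actual scanned geometry would still be far preferable.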