Hello,
How can I “3D scan” the interior of a room with ARCore?
As far as I understand, the Depth API example scenes only allow interaction in “screen space”, in real time. Alternatively there are point clouds, but there seems to be no way to make a mesh out of those points.
What I want is to let the user walk around a room and 3D scan the interior first; then the user would interact with the 3D-scanned mesh as a physics collider.
Best.
P.S.: TL;DR, I would like to recreate something like this, but only for physics collision. Any clues and help would be appreciated.
ARCore doesn’t provide Meshing. However, you can refer to the Meshing example for ARKit.
https://github.com/Unity-Technologies/arfoundation-samples#meshing
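For context, roughly what that sample's meshing path amounts to (a hedged sketch, not code from the repo; it assumes ARFoundation 4.x on a LiDAR-capable iOS device, and MeshingStatus is a made-up name):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hedged sketch of the iOS-only meshing path (ARFoundation 4.x assumed).
// ARMeshManager must live on a child of the AR Session Origin; its meshPrefab
// needs a MeshFilter, and if that prefab also carries a MeshCollider, every
// generated mesh chunk doubles as a physics collider automatically.
public class MeshingStatus : MonoBehaviour
{
    [SerializeField] ARMeshManager meshManager;

    void Update()
    {
        // 'meshes' lists the MeshFilters generated from the scanned environment so far.
        Debug.Log($"Generated mesh chunks: {meshManager.meshes.Count}");
    }
}
```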
Does that mean ARFoundation supports this for Android devices out of the box?
No. Only iOS does.
With regard to point clouds, you could potentially use each point as a vertex and build your own mesh from them. As far as I know, that would be the only way to create an actual ‘mesh’ of your environment on Android (but it takes a lot of work and processing power).
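A rough sketch of the accumulation half of that idea (assuming ARFoundation 4.x's pointCloudsChanged event; PointCloudAccumulator is a made-up name, and the surface reconstruction itself is exactly the part ARFoundation won't do for you):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hedged sketch: collect feature points while the user walks around the room.
public class PointCloudAccumulator : MonoBehaviour
{
    [SerializeField] ARPointCloudManager pointCloudManager;
    readonly List<Vector3> accumulated = new List<Vector3>();

    void OnEnable()  => pointCloudManager.pointCloudsChanged += OnChanged;
    void OnDisable() => pointCloudManager.pointCloudsChanged -= OnChanged;

    void OnChanged(ARPointCloudChangedEventArgs args)
    {
        foreach (var cloud in args.updated)
        {
            if (!cloud.positions.HasValue)
                continue;
            foreach (var point in cloud.positions.Value)
                accumulated.Add(point); // session-space positions
        }
    }

    // Missing piece: triangulating these unordered points into a surface.
    // ARCore/ARFoundation provide no reconstruction step, so you would need
    // your own algorithm (e.g. Poisson reconstruction) before a MeshCollider
    // could use the result.
}
```

Note the raw points are noisy and repeat across frames, so you'd also want deduplication and filtering before attempting any reconstruction.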
Using ARPlanes would probably be a better way to go here. They provide physics by default.
If the ‘single mesh’ is a requirement, you could potentially combine all of the colliders created by the ARPlanes into one (and disable the original ARPlane objects); see the sketch below.
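For illustration, a sketch of that combine step (assuming your plane prefab has a MeshFilter, as in the ARFoundation samples; PlaneColliderBaker and Bake are hypothetical names):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.ARFoundation;

// Hedged sketch: bake all detected ARPlanes into one combined MeshCollider.
public class PlaneColliderBaker : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;
    [SerializeField] MeshCollider combinedCollider; // on an object with identity transform

    // Call this once the user has finished scanning the room.
    public void Bake()
    {
        var combines = new List<CombineInstance>();
        foreach (var plane in planeManager.trackables)
        {
            var filter = plane.GetComponent<MeshFilter>();
            if (filter == null || filter.sharedMesh == null)
                continue;

            combines.Add(new CombineInstance
            {
                mesh = filter.sharedMesh,
                transform = filter.transform.localToWorldMatrix // bake into world space
            });
            plane.gameObject.SetActive(false); // disable the original ARPlane objects
        }

        // UInt32 indices so the combined mesh can exceed 65k vertices.
        var combined = new Mesh { indexFormat = IndexFormat.UInt32 };
        combined.CombineMeshes(combines.ToArray());
        combinedCollider.sharedMesh = combined;
    }
}
```

Since the plane meshes are baked in world space, keep the collider's GameObject at the origin with no rotation or scale.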