Is it possible to access the world mesh at runtime to create MR experiences with colliders over the real world?
I’m interested in building experiences where characters can walk around a sofa, for example.
I know we can generate one with ARMeshManager, but isn’t the Vision Pro always scanning the world and creating meshes anyway?
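
On the native side, I’m imagining something roughly like the sketch below (untested, based on my reading of ARKit’s SceneReconstructionProvider and RealityKit’s ShapeResource.generateStaticMesh(from:) — the entity bookkeeping and the root parameter are just placeholders I made up for illustration):

```swift
import ARKit
import RealityKit

// Rough sketch (untested): feed ARKit's scene-reconstruction mesh anchors into
// RealityKit as static collision entities, so characters collide with real furniture.
@MainActor
func runSceneReconstruction(root: Entity) async {
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()
    var meshEntities: [UUID: ModelEntity] = [:]   // hypothetical bookkeeping for anchor -> entity

    do {
        try await session.run([sceneReconstruction])
    } catch {
        print("Failed to start ARKit session: \(error)")
        return
    }

    for await update in sceneReconstruction.anchorUpdates {
        let anchor = update.anchor
        // Convert the mesh anchor's geometry into a static collision shape.
        guard let shape = try? await ShapeResource.generateStaticMesh(from: anchor) else { continue }

        switch update.event {
        case .added:
            let entity = ModelEntity()
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
            entity.collision = CollisionComponent(shapes: [shape], isStatic: true)
            entity.physicsBody = PhysicsBodyComponent(mode: .static)
            meshEntities[anchor.id] = entity
            root.addChild(entity)
        case .updated:
            guard let entity = meshEntities[anchor.id] else { continue }
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
            entity.collision = CollisionComponent(shapes: [shape], isStatic: true)
        case .removed:
            meshEntities[anchor.id]?.removeFromParent()
            meshEntities.removeValue(forKey: anchor.id)
        }
    }
}
```

Basically I’m trying to figure out whether ARMeshManager gives me the equivalent of this automatically, or whether I need to drop down to native ARKit for it.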