I’m trying to create a persistent point cloud that also stores the surface normal of each point. On ARCore, there is reportedly a way to access each point individually and get its Pose, which includes its orientation.
Is the same possible in ARFoundation? So far the only data I can see and access are the positions, identifiers, and confidence values of points within a point cloud, nothing more in-depth.
As far as I know, there is no way to get the normal of a point from the point cloud itself, neither in ARCore nor in ARKit.
The link you provided is not about point-cloud points; it is about a point (an Anchor) attached to a detected plane.
ARCore does provide “oriented” feature points, but you must raycast against them to get the orientation. ARFoundation’s ARRaycastHit exposes a full pose (position and rotation), which includes the feature point’s orientation when it is available.
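As a rough sketch of that approach: you can raycast against `TrackableType.FeaturePoint` with an `ARRaycastManager` and read the rotation out of the hit pose. This is a minimal example, not a drop-in solution; the component and field names are my own, and treating `pose.up` as the estimated surface normal only holds for oriented feature points (on unoriented points the rotation may just be the identity).

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical component: on each touch, raycasts against feature points
// and logs the hit pose, using its up vector as the estimated normal.
public class FeaturePointNormalSampler : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assign in the Inspector

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Vector2 screenPos = Input.GetTouch(0).position;

        // Raycast against feature points only. On ARCore this can hit
        // "oriented" feature points, whose pose carries a real rotation.
        if (raycastManager.Raycast(screenPos, hits, TrackableType.FeaturePoint))
        {
            Pose pose = hits[0].pose;

            // For an oriented point, the pose's up vector approximates
            // the estimated surface normal at that point.
            Vector3 normal = pose.up;
            Debug.Log($"Hit point at {pose.position}, estimated normal {normal}");
        }
    }
}
```

Note that this gives you orientation only at raycast time for the point you hit; if you want normals for a whole persistent cloud, you would have to raycast and record poses as you go, since the point-cloud API itself does not expose them.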