Hi guys,
I am curious whether the XR Interaction Toolkit can also work with depth on mobile devices using AR Foundation.
Basically, I am trying to figure out whether it is possible to drag a 3D model over the depth data using the AR Translation Interactable.
When I work with planes, i.e. with the AR Plane Manager, the Translation component works perfectly, but when I use depth, with the AR Occlusion Manager, I can't move the model.
Do you have any ideas? I suspect the AR Translation Interactable simply does not support translation over depth.
Share the code showing how you use depth to place the object. It is definitely possible, but I am unsure whether it can be done without coding it yourself.
The code I am using, for now, only places the object on the depth data.
I used AR Foundation's raycast (the AR Raycast Manager component) against the depth. In the Unity project, I added the AR Occlusion Manager component (which provides the depth data) to the AR Camera.
Basically, I assigned XR Interaction Toolkit components to my object so that the gestures of AR Rotation Interactable, AR Selection Interactable and AR Scale Interactable are directly available to rotate, select and scale the reference object.
I have therefore also tried AR Translation Interactable (also part of the XR Interaction Toolkit package), and it does not seem to work. I guess I will have to implement this functionality myself via a custom script.
Do you have any idea how I could implement object dragging on depth?
The code is this:
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DepthPlacement : MonoBehaviour
{
    public GameObject game;
    public GameObject indicator;

    GameObject spawnedObject;
    AROcclusionManager occlusionManager;
    ARRaycastManager raycastManager;
    List<ARRaycastHit> hits = new List<ARRaycastHit>();
    bool spawned = false;

    void Start()
    {
        // Reference to the AROcclusionManager that should be added to the AR Camera
        // game object (the one with the Camera and ARCameraBackground components).
        occlusionManager = GetComponentInChildren<AROcclusionManager>();
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        // Raycast from the centre of the screen against the depth data.
        var screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (raycastManager.Raycast(screenCenter, hits, TrackableType.Depth))
        {
            Pose hitPose = hits[0].pose;
            if (!indicator.activeInHierarchy && !spawned)
            {
                indicator.SetActive(true);
            }
            // Float the indicator slightly (3 cm) above the hit point.
            indicator.transform.position = hitPose.position + new Vector3(0f, 0.03f, 0f);

            if (Input.touchCount == 1)
            {
                Touch touch = Input.GetTouch(0);
                if (touch.phase == TouchPhase.Began && !spawned)
                {
                    spawnedObject = Instantiate(game, indicator.transform.position, indicator.transform.rotation);
                    spawned = true;
                    indicator.SetActive(false);
                }
            }
        }
    }
}
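For the dragging part, one idea is to extend the same depth raycast: while a finger is down and moving, re-raycast from the finger position each frame and move the spawned object to the new hit pose. A hypothetical sketch (the `DepthDrag` class name and `target` field are assumptions, not part of the project above):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch: drag an already-spawned object over the depth data
// by re-raycasting from the finger position every frame while it moves.
// Attach to the same GameObject as the ARRaycastManager.
public class DepthDrag : MonoBehaviour
{
    public Transform target;              // the spawned object to drag

    ARRaycastManager raycastManager;
    readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Start()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (target == null || Input.touchCount != 1) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Moved) return;

        // Same raycast the placement code uses, but from the finger position
        // instead of the screen centre.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.Depth))
        {
            target.position = hits[0].pose.position;
        }
    }
}
```

This sidesteps AR Translation Interactable entirely, which matches the suspicion above that it does not handle depth hits.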
TrackableType.Depth raycasts against a point cloud, and individual points in space are very hard to hit precisely. I do notice that on line 287 of ARPointCloudManager there is an angle check, so perhaps you can change that method, or implement your own, to use a more generous angle tolerance.
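A custom raycast along those lines could iterate the tracked point clouds and accept the nearest point within a configurable angular tolerance. A minimal sketch, assuming a `LoosePointCloudRaycast` component (the class name and `maxAngleDegrees` default are made up; the built-in check on line 287 may differ in detail):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: a point-cloud raycast with a configurable, more
// generous angular tolerance than ARPointCloudManager's built-in check.
public class LoosePointCloudRaycast : MonoBehaviour
{
    public ARPointCloudManager pointCloudManager;
    public float maxAngleDegrees = 5f;   // assumption: tune to taste

    // Returns the nearest tracked point within maxAngleDegrees of the ray.
    public bool Raycast(Ray ray, out Vector3 hit)
    {
        hit = default;
        float bestDistance = float.MaxValue;
        bool found = false;

        foreach (var cloud in pointCloudManager.trackables)
        {
            if (!cloud.positions.HasValue) continue;
            foreach (var point in cloud.positions.Value)
            {
                Vector3 toPoint = point - ray.origin;
                float angle = Vector3.Angle(ray.direction, toPoint);
                if (angle <= maxAngleDegrees && toPoint.magnitude < bestDistance)
                {
                    bestDistance = toPoint.magnitude;
                    hit = point;
                    found = true;
                }
            }
        }
        return found;
    }
}
```

Iterating every point each frame is brute force; it is fine for a prototype but you may want spatial partitioning if the clouds get large.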
Alternatively, you can look into sampling the depth texture coming from the camera, then take that pixel value and convert it into a distance you can use to place objects in your scene.
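To illustrate the depth-texture route, here is a rough sketch that reads the centre pixel of `AROcclusionManager.environmentDepthTexture` and projects it into world space. This is an assumption-heavy outline: the `DepthSampler` class is invented, the texture stores eye-space depth in metres on supported devices, and its format and CPU readability vary by platform, so the readback below is illustrative rather than guaranteed to work everywhere.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: sample the environment depth texture at the screen
// centre and turn the metric depth value into a world-space position.
public class DepthSampler : MonoBehaviour
{
    public AROcclusionManager occlusionManager;
    public Camera arCamera;

    public bool TrySampleCenter(out Vector3 worldPoint)
    {
        worldPoint = default;
        Texture2D depthTex = occlusionManager.environmentDepthTexture;
        if (depthTex == null) return false;

        // Copy the GPU texture into a readable CPU-side texture.
        var rt = RenderTexture.GetTemporary(depthTex.width, depthTex.height, 0,
                                            RenderTextureFormat.RFloat);
        Graphics.Blit(depthTex, rt);
        var readable = new Texture2D(depthTex.width, depthTex.height,
                                     TextureFormat.RFloat, false);
        RenderTexture.active = rt;
        readable.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        readable.Apply();
        RenderTexture.active = null;
        RenderTexture.ReleaseTemporary(rt);

        // Depth (in metres) at the centre pixel.
        float depthMetres = readable.GetPixel(depthTex.width / 2,
                                              depthTex.height / 2).r;
        Destroy(readable);
        if (depthMetres <= 0f) return false;

        // Project the screen centre out to that depth.
        worldPoint = arCamera.ScreenToWorldPoint(
            new Vector3(Screen.width / 2f, Screen.height / 2f, depthMetres));
        return true;
    }
}
```

Note that `ReadPixels` stalls the GPU, so in a real app you would sample only on demand (e.g. on a tap), not every frame.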
Yes, that's true! I will need to implement my own method, because it is impossible to change the method inside the AR Point Cloud Manager.
I will try, thanks for the tip.