Unity + Niantic ARDK Augmented Reality: Objects move opposite to finger direction when camera is in front. Need assistance with drag-and-drop functionality

I’m working with Niantic’s ARDK, and the company asked me to place three objects in the real world so that users can drag and drop them anywhere. I have successfully managed to place the objects, hold them, and drag them correctly when the camera is positioned behind the objects. However, there is a problem when the user positions their phone’s camera in front of the objects or to their left: the objects move in the opposite direction to the finger.

This is my code:

private void HeldObject(Touch touch)
{
    Vector2 _touchPosition = touch.position;
    Ray ray = Camera.main.ScreenPointToRay(_touchPosition);
    RaycastHit hit;

    if (Physics.Raycast(ray, out hit))
    {
        if (hit.collider.CompareTag("apple"))
        {
            dragObject = hit.transform;
            distanceToCamera = dragObject.position.z - Camera.main.transform.position.z;
            relativePositionTouch = new Vector3(_touchPosition.x, _touchPosition.y, distanceToCamera);
            relativePositionTouch = Camera.main.ScreenToWorldPoint(relativePositionTouch);
            cameraToObjectOffset = dragObject.position - relativePositionTouch;
            isHeld = true;
            Debug.Log($"This is the touch phase: {touch.phase}");
        }
    }
}

private void MovedObject(Touch touch)
{
    Vector2 _touchPosition = touch.position;
    relativePositionTouch = new Vector3(_touchPosition.x, _touchPosition.y, distanceToCamera);
    relativePositionTouch = Camera.main.ScreenToWorldPoint(relativePositionTouch);
    dragObject.position = relativePositionTouch + cameraToObjectOffset;
}

@sergio17121 - I apologize for the delay in responding; I seem to be the only active moderator these days, and while I am trying to keep up, I have been very busy lately.

The issue you’re facing may come from a mismatch in coordinate conventions between your touch input and Unity3D’s ScreenPointToRay and ScreenToWorldPoint methods. Those methods expect screen coordinates with the origin (0,0) at the bottom left of the screen. Some input sources report touch coordinates with the origin at the top left instead (note that Unity’s own Touch.position is already bottom-left-origin, so check which convention your input path actually delivers). If the conventions disagree, the object will move in the opposite direction along that axis.

Here’s a potential solution: transform the touch position to match Unity’s screen coordinate system before using it.

In your HeldObject function, you should modify the touch position as follows:

Vector2 _touchPosition = new Vector2(touch.position.x, Screen.height - touch.position.y);

And in your MovedObject function, do the same:

Vector2 _touchPosition = new Vector2(touch.position.x, Screen.height - touch.position.y);

This should adjust the touch input to match Unity’s coordinate system, which might resolve the issue you’re experiencing.
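If flipping the y-coordinate doesn’t help, one more thing worth checking (this is a guess from the posted code, not something verified in your project): `distanceToCamera` is computed as a raw world-space z difference, `dragObject.position.z - Camera.main.transform.position.z`. That only measures depth correctly while the camera looks down the world z-axis. Once you walk in front of or beside the objects, that difference shrinks or flips sign, so ScreenToWorldPoint projects the touch to the wrong depth and the drag can invert. ScreenToWorldPoint expects its z component to be the distance along the camera’s viewing direction, which you can read back from WorldToScreenPoint:

```csharp
// Hypothetical change inside HeldObject: replace the raw z difference
// distanceToCamera = dragObject.position.z - Camera.main.transform.position.z;
// with the depth Unity reports for the object. WorldToScreenPoint returns
// (screenX, screenY, depth), where depth is the distance in front of the
// camera along its view direction -- it stays positive regardless of which
// side of the object the camera is on.
distanceToCamera = Camera.main.WorldToScreenPoint(dragObject.position).z;

// Equivalent formulation: project the camera-to-object vector onto the
// camera's forward axis.
// distanceToCamera = Vector3.Dot(
//     dragObject.position - Camera.main.transform.position,
//     Camera.main.transform.forward);
```

With this, MovedObject can keep building its Vector3 from the touch position and `distanceToCamera` unchanged, since the depth value now matches what ScreenToWorldPoint expects from any camera angle.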

Also, double-check the tag name’s case: tag comparisons are case-sensitive, so "apple" must exactly match the tag defined in Unity. If the issue persists, please share more details about your implementation.