Unity AR Foundation TrackedImage Rotation Issue Help

I am developing an Android application with AR Foundation in Unity. I spawn objects from a tracked image, but the rotation of the spawned objects varies depending on the direction the device is facing when the app launches, even though the same rotation values are applied.

For example, if I launch the application while facing north, south, east, or west and then the image is recognized, the object's rotation appears as intended. However, if I launch the application while facing a diagonal direction, the object's rotation is slightly off.

The red rectangle shows the desired orientation: when I launch the application facing one of the cardinal directions (north, south, east, or west), the object appears like the red rectangle. When I launch it facing any other direction, the object appears like the blue rectangle instead.

Below is the code I wrote. objs[0] is the object aligned with the +z direction at initial execution.

currentForward stores the direction the camera is currently facing. Helper objects are created in the ±x, ±y, and ±z directions around the camera, and a raycast from the camera checks which of them (and therefore which direction) the user is currently facing. The raycast code itself isn't included below, but a rough sketch of it follows the Update() method.

When I assign trackedImage.transform.position to spawnedObject.transform.position, the result is not what I want, and it also changes depending on the direction the app was launched in. So for now I am hardcoding the position by adding fixed offsets to the camera's position.

Finally, I use snappedAngle + 90 for the rotation because the model was exported from Blender with a -90 degree rotation, and the extra 90 degrees compensates for that.

Is there a solution to this problem?

        if (trackedImage.trackingState == TrackingState.Tracking)
        {
            Vector3 rotationAngles = Camera.main.transform.rotation.eulerAngles;

            trackedPosition = trackedImage.transform.position;

            GameObject spawnedObject = Instantiate(arObjectPrefab[0]);

            // Angle between the reference forward (forwardA, set elsewhere, not shown
            // in this snippet) and the direction from objs[0] to the tracked image.
            Vector3 directionToB = trackedImage.transform.position - objs[0].transform.position;
            directionToB.Normalize();

            float angle = Vector3.Angle(forwardA, directionToB);

            // Vector3.Angle is unsigned, so use the cross product to recover the sign,
            // then wrap the result into the 0-360 range.
            Vector3 cross = Vector3.Cross(forwardA, directionToB);
            if (cross.y < 0)
            {
                angle = -angle;
            }

            if (angle < 0)
            {
                angle += 360;
            }

            // Snap the angle to the nearest multiple of 90 degrees.
            float snappedAngle = Mathf.Round(angle / 90) * 90;

            // Remainder of the camera's yaw relative to 90 degrees
            // (cameraAngle is not used later in this snippet).
            float cameraAngle;

            if (rotationAngles.y < 0)
            {
                cameraAngle = Mathf.Round(rotationAngles.y % -90);
            }
            else
            {
                cameraAngle = Mathf.Round(rotationAngles.y % 90);
            }

            // Hardcoded workaround: place the object at a fixed offset from the camera
            // instead of at trackedImage.transform.position.
            switch (currentForward)
            {
                case "+x":
                    spawnedObject.transform.position = Camera.main.transform.position + GlobalVariable.Instance.room_offset_x;
                    break;
                case "-x":
                    spawnedObject.transform.position = Camera.main.transform.position + GlobalVariable.Instance.room_offset_nx;
                    break;
                case "+z":
                    spawnedObject.transform.position = Camera.main.transform.position + GlobalVariable.Instance.room_offset_z;
                    break;
                case "-z":
                    spawnedObject.transform.position = Camera.main.transform.position + GlobalVariable.Instance.room_offset_nz;
                    break;
            }

            // +90 compensates for the -90 degree rotation from the Blender export.
            spawnedObject.transform.rotation = Quaternion.Euler(0f, snappedAngle + 90, 0f);
        }
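
For reference, GlobalVariable.Instance just holds those per-direction offsets as Vector3 values. A simplified sketch of its shape (the offset values here are placeholders, not the real ones):

public class GlobalVariable : MonoBehaviour
{
    public static GlobalVariable Instance { get; private set; }

    // Per-direction offsets added to the camera position (placeholder values).
    public Vector3 room_offset_x = new Vector3(1.5f, 0f, 0f);
    public Vector3 room_offset_nx = new Vector3(-1.5f, 0f, 0f);
    public Vector3 room_offset_z = new Vector3(0f, 0f, 1.5f);
    public Vector3 room_offset_nz = new Vector3(0f, 0f, -1.5f);

    private void Awake()
    {
        Instance = this;
    }
}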

The code related to objs[] is as follows.

// Create six invisible cubes, one per world-axis direction, each thin along its own
// axis. They are used as raycast targets for determining the facing direction.
private void Start()
{
    GameObject obj = GameObject.CreatePrimitive(PrimitiveType.Cube);
    obj.transform.localScale = new Vector3(0.1f, 0.5f, 0.5f);
    obj.name = "+x";
    obj.GetComponent<Renderer>().enabled = false;
    objs.Add(obj);

    obj = GameObject.CreatePrimitive(PrimitiveType.Cube);
    obj.transform.localScale = new Vector3(0.1f, 0.5f, 0.5f);
    obj.name = "-x";
    obj.GetComponent<Renderer>().enabled = false;
    objs.Add(obj);

    obj = GameObject.CreatePrimitive(PrimitiveType.Cube);
    obj.transform.localScale = new Vector3(0.5f, 0.1f, 0.5f);
    obj.name = "+y";
    obj.GetComponent<Renderer>().enabled = false;
    objs.Add(obj);

    obj = GameObject.CreatePrimitive(PrimitiveType.Cube);
    obj.transform.localScale = new Vector3(0.5f, 0.1f, 0.5f);
    obj.name = "-y";
    obj.GetComponent<Renderer>().enabled = false;
    objs.Add(obj);

    obj = GameObject.CreatePrimitive(PrimitiveType.Cube);
    obj.transform.localScale = new Vector3(0.5f, 0.5f, 0.1f);
    obj.name = "+z";
    obj.GetComponent<Renderer>().enabled = false;
    objs.Add(obj);

    obj = GameObject.CreatePrimitive(PrimitiveType.Cube);
    obj.transform.localScale = new Vector3(0.5f, 0.5f, 0.1f);
    obj.name = "-z";
    obj.GetComponent<Renderer>().enabled = false;
    objs.Add(obj);
}

// Keep the six cubes at fixed world-axis offsets (0.25 m) from the camera so a
// raycast from the camera can tell which world direction it is currently facing.
private void Update()
{
    foreach (GameObject i in objs)
    {
        switch (i.name)
        {
            case "+x":
                i.transform.position = Camera.main.transform.position + new Vector3(0.25f, 0, 0);
                break;
            case "-x":
                i.transform.position = Camera.main.transform.position + new Vector3(-0.25f, 0, 0);
                break;
            case "+y":
                i.transform.position = Camera.main.transform.position + new Vector3(0, 0.25f, 0);
                break;
            case "-y":
                i.transform.position = Camera.main.transform.position + new Vector3(0, -0.25f, 0);
                break;
            case "+z":
                i.transform.position = Camera.main.transform.position + new Vector3(0, 0, 0.25f);
                break;
            case "-z":
                i.transform.position = Camera.main.transform.position + new Vector3(0, 0, -0.25f);
                break;
        }
    }
}
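
The raycast that actually sets currentForward is not included above. A minimal sketch of how it could work against these direction cubes (assuming each cube keeps the BoxCollider that CreatePrimitive adds by default; UpdateCurrentForward is just a placeholder name):

private void UpdateCurrentForward()
{
    // Cast straight ahead from the camera; the cubes sit 0.25 m away, so a short ray is enough.
    Ray ray = new Ray(Camera.main.transform.position, Camera.main.transform.forward);

    if (Physics.Raycast(ray, out RaycastHit hit, 1f))
    {
        // The cube names ("+x", "-x", "+y", "-y", "+z", "-z") double as the direction label.
        currentForward = hit.collider.gameObject.name;
    }
}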

Instead of using Camera.main.transform.rotation.eulerAngles, try using trackedImage.transform.rotation.eulerAngles so the transformation is based on the tracked image's pose in world space, which should be more consistent.

I agree with this recommendation by @warners.
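
A rough sketch of what that could look like, assuming the spawn still happens where trackedImage is available (the +90 keeps the existing compensation for the Blender export; position handling is omitted here):

if (trackedImage.trackingState == TrackingState.Tracking)
{
    GameObject spawnedObject = Instantiate(arObjectPrefab[0]);

    // Take the yaw from the tracked image instead of from the camera, so the result
    // no longer depends on which way the device was facing when the session started.
    float imageYaw = trackedImage.transform.rotation.eulerAngles.y;

    // +90 compensates for the model's -90 degree Blender export rotation.
    spawnedObject.transform.rotation = Quaternion.Euler(0f, imageYaw + 90f, 0f);

    // Parenting is another option; with worldPositionStays set to true the object keeps
    // its current pose and then follows later updates to the image's tracked pose:
    // spawnedObject.transform.SetParent(trackedImage.transform, worldPositionStays: true);
}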