Touch Orbit Issue

I have already posted this question on Unity Answers, in the “Scripting” forum, and got no feedback at all. I don’t intend to spam, but I really need your assistance.

Hi, I’ve got a weird bug in my project.

A little background: I have a joystick for object movement on the left side of the screen, and I want the player to be able to rotate the camera (mouse/touch orbit) around the target using only the right side of the screen, so it doesn’t conflict with the left joystick and its position calculation.

I took the “Mouse Orbit” script, looked at touch versions on the internet, and rewrote the script to fit my needs.

The code works great when I test with Unity Remote, but when I build it, the y axis of the Touch Orbit script seems to be locked, and I’m unable to change the height of the camera.

To check this in the build on my Android device, I also displayed the exact deltaPosition.y values in a GUIText object; the values did change, but for some reason they had no effect on the camera.
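For reference, this is roughly how such an on-device overlay can be done (a minimal sketch, not the exact script I used; `TouchDebugOverlay` and `debugText` are names I’m making up here):

```csharp
using UnityEngine;

// Sketch: drop this on any object and assign a GUIText in the Inspector
// to see the raw touch delta on-device without a debugger attached.
public class TouchDebugOverlay : MonoBehaviour
{
    public GUIText debugText; // assign in the Inspector

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Vector2 d = Input.GetTouch(0).deltaPosition;
            debugText.text = "dx: " + d.x + "  dy: " + d.y;
        }
    }
}
```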

Picture the camera orbiting only on the x axis and not at all on the y.

Here is the code:

using UnityEngine;
using System.Collections;

[AddComponentMenu("Camera-Control/Mouse Orbit with zoom")]
public class MouseOrbit : MonoBehaviour
{
    public Transform target;
    public float distance = 5.0f;
    public float xSpeed = 120.0f;
    public float ySpeed = 120.0f;

    public float yMinLimit = -20f;
    public float yMaxLimit = 80f;

    public float distanceMin = .5f;
    public float distanceMax = 15f;

    float x = 0.0f;
    float y = 0.0f;

    // Use this for initialization
    void Start()
    {
        Vector3 angles = transform.eulerAngles;
        x = angles.y;
        y = angles.x;

        // Make the rigid body not change rotation
        if (rigidbody)
            rigidbody.freezeRotation = true;
    }

    void Update()
    {
        // Just orbit touch without movement
        if (target && Input.touchCount == 1 && Input.GetTouch(0).position.x > Screen.width / 2 && Input.GetTouch(0).phase == TouchPhase.Moved)
        {
            Debug.Log("Orbiting! 1 touch");
            Orbit(Input.GetTouch(0));
        }
        else if (Input.touchCount == 2)
        {
            if (Input.GetTouch(0).position.x > Screen.width / 2 && Input.GetTouch(0).phase == TouchPhase.Moved)
                Orbit(Input.GetTouch(0)); // Movement was touched second
            else if (Input.GetTouch(1).position.x > Screen.width / 2 && Input.GetTouch(1).phase == TouchPhase.Moved)
                Orbit(Input.GetTouch(1)); // Movement was touched first
        }
    }

    void Orbit(Touch touch)
    {
        x += touch.deltaPosition.x * xSpeed * 0.02f /* * distance*/;
        y -= touch.deltaPosition.y * ySpeed * 0.02f /* * distance*/;

        y = ClampAngle(y, yMinLimit, yMaxLimit);

        Quaternion rotation = Quaternion.Euler(y, x, 0);

        //distance = Mathf.Clamp(distance - Input.GetAxis("Mouse ScrollWheel") * 5, distanceMin, distanceMax);

        RaycastHit hit;
        if (Physics.Linecast(target.position, transform.position, out hit))
        {
            distance -= hit.distance;
        }
        Vector3 negDistance = new Vector3(0.0f, 0.0f, -distance);
        Vector3 position = rotation * negDistance + target.position;

        transform.rotation = rotation;
        transform.position = position;
    }

    public static float ClampAngle(float angle, float min, float max)
    {
        if (angle < -360F)
            angle += 360F;
        if (angle > 360F)
            angle -= 360F;
        return Mathf.Clamp(angle, min, max);
    }
}

I’ll explain the “if” lines in the Update function. The first one checks that if there is only one touch, on the right side of the screen, it is used to orbit the camera.
The other ifs handle the case of two touches, where one touch starts on the left side controlling the object and the second one on the right, or the other way around; either way, I pass along whichever touch is on the right side of the screen.
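The same logic could also be written as a loop over all touches, which avoids hardcoding indices 0 and 1 (just a sketch of an alternative, not the script as posted):

```csharp
// Sketch: scan every active touch and orbit with the first one that is
// on the right half of the screen and currently moving.
for (int i = 0; i < Input.touchCount; i++)
{
    Touch t = Input.GetTouch(i);
    if (t.position.x > Screen.width / 2 && t.phase == TouchPhase.Moved)
    {
        Orbit(t);
        break; // only one orbit touch per frame
    }
}
```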

I would suggest setting up the scene camera so it has an invisible/empty parent, and is aimed to be looking at that parent. Then the orbit relationship is driven by the orientation of the parent. You might want to look at the dual joystick setup, instead of the mouse orbit setup. Then on each Update(), add the right joystick’s position to the euler angles of the camera’s parent’s orientation, not the orientation of the camera itself. I just suggest this because the dual joystick example already handles all the multi-touch issues and is pretty easy to use.
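The pivot idea looks roughly like this (a rough sketch under my own naming; `PivotOrbit` is hypothetical, and it assumes the camera is a child of the pivot, offset on -Z and looking at it):

```csharp
using UnityEngine;

// Sketch: attach to an empty "pivot" placed at the target's position.
// Rotating the pivot orbits the child camera for free.
public class PivotOrbit : MonoBehaviour
{
    public float xSpeed = 120f;
    public float ySpeed = 120f;
    public float yMinLimit = -20f;
    public float yMaxLimit = 80f;

    float yaw;
    float pitch;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Moved)
        {
            Vector2 d = Input.GetTouch(0).deltaPosition;
            yaw += d.x * xSpeed * 0.02f;
            pitch = Mathf.Clamp(pitch - d.y * ySpeed * 0.02f, yMinLimit, yMaxLimit);
            transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
        }
    }
}
```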

To support debugging, I’ve also modified the stock scripts to support mouse or keyboard controls so that you don’t need to rely on the touch-screen joysticks.
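One way to do something along those lines (my own hypothetical fragment, not the commenter’s actual modification) is an editor-only branch that maps a right-mouse drag onto the same x/y angles the touch path uses:

```csharp
// Sketch: editor-only fallback so the orbit can be tested without
// Unity Remote; right-mouse drag stands in for a moving touch.
#if UNITY_EDITOR
void Update()
{
    if (Input.GetMouseButton(1)) // right-drag to orbit
    {
        x += Input.GetAxis("Mouse X") * xSpeed * 0.02f;
        y -= Input.GetAxis("Mouse Y") * ySpeed * 0.02f;
        // then apply the same clamp/rotation/position math as Orbit()
    }
}
#endif
```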

First, thank you for commenting. I also have a dual-joystick mode that works great; I made this option for players who don’t want two joysticks, but only one, plus free camera movement from their finger.

My player is actually derived from the Dual Joysticks prefab, and I added the touch orbit script to the “pivot camera” object; I just enable/disable the script according to the player’s preference (whether to use a joystick or not). Also, this works fine in Unity Remote. Do you have any idea why it doesn’t work on my device? Could it be a bug?