I have a sphere in the centre of my application. When a user touches the screen (iPhone), I want to know how many degrees away from the centre sphere the touch is. So, for example, if the user touched directly to the right of the sphere it would be "90", and if the user touched directly to the left of the sphere it would be "270". I've searched online but I have no idea how to go about doing this. Any ideas?
Edit: Yay! Problem solved. I used the code from the answer below combined with this basic touch code I just wrote:
function Update () {
    if (iPhoneInput.touchCount > 0) {
        // Log the screen position of each active touch
        for (var touch : iPhoneTouch in iPhoneInput.touches) {
            Debug.Log(touch.position);
        }
    }
}
There are two ways you could approach this: one is to use Vector3.Angle() to get the angle between one vector and another, or you could use Atan2, which essentially gives the same result.
Basically you need to find the vector from the screen origin (the centre of your sphere in screen coordinates) to the current touch (or mouse) position. With this you can find the angle between it and, say, a vector pointing straight up (in screen space).

This will give you a value of 0 straight up, 90 directly left/right, and 180 straight down. To get it into 0-360 instead of 0-180 on both sides, you'll have to work out which side of the centre the touch is on and adjust accordingly. However, I'd just use the second method below.
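Here's a rough, untested sketch of the first (Vector3.Angle) method, assuming the sphere sits at the exact centre of the screen; the function name AngleFromCentre is just for illustration:

function AngleFromCentre (touchPos : Vector2) : float {
    // Vector from the screen centre (where the sphere is assumed to be) to the touch
    var centre = Vector2(Screen.width * 0.5, Screen.height * 0.5);
    var dir = touchPos - centre;
    // Unsigned angle between straight up and the touch direction: 0-180 on either side
    var angle = Vector3.Angle(Vector3.up, Vector3(dir.x, dir.y, 0));
    // Mirror touches on the left of the centre into the 180-360 range
    if (dir.x < 0) {
        angle = 360 - angle;
    }
    return angle;
}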
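And a sketch of the second (Atan2) method, under the same assumptions, which handles the 0-360 wrap-around without the side check:

function AngleFromCentre (touchPos : Vector2) : float {
    var centre = Vector2(Screen.width * 0.5, Screen.height * 0.5);
    var dir = touchPos - centre;
    // Atan2 normally measures anti-clockwise from the positive x axis;
    // passing (x, y) instead of (y, x) measures clockwise from straight up,
    // matching the convention in the question (right = 90, left = 270)
    var angle = Mathf.Atan2(dir.x, dir.y) * Mathf.Rad2Deg;
    if (angle < 0) {
        angle += 360; // map -180..0 onto 180..360
    }
    return angle;
}

You could then call AngleFromCentre(touch.position) from the touch loop in the question's edit. Note that touch.position has its origin in the bottom-left of the screen with y increasing upwards, which is what both sketches assume.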