Polar (spherical) coordinates to xyz, and vice versa.

Hello everybody!

I’ve been working on this math for two days now, with no luck. I’m writing a program that puts markers onto a sphere gameObject according to latitude and longitude positions, and somewhere I’ve gone wrong. Converting XYZ to latitude and longitude works only very approximately (it gets the longitude correct, but the latitude is off by about 20 degrees), and converting latitude and longitude to XYZ doesn’t work at all. There are really three functions I’m using here. I’ve had a look at the Wikipedia and Stack Overflow pages on this topic, and I’m not sure what I’ve done wrong. Perhaps someone could point me in the right direction? I really feel lost with this stuff; I’ve never had a class on it. Thanks so much for your time!

Simon

XYZ TO LAT/LONGITUDE:
This function converts raycast hits to latitude and longitude data. (When the player clicks on the globe in view, my program must generate this information, irrespective of the globe’s rotation.)

function determineGISCoordinates()
{
	//determines attributes of the earth-sphere.
	var earthPosition = earth.position;
	//THIS IS BASED ON THE EARTH TEXTURE WE'RE USING. When the earth stands at this rotation, the 0,0 point of latitude and longitude sits directly in front of the camera.
	var originRotation = Vector3(0.00,105.00,0.00);
	var currentRotation = earth.transform.localEulerAngles;
	var rotationOffset = currentRotation;
    rotationOffset.x -= originRotation.x;
	rotationOffset.y -= originRotation.y;
	rotationOffset.z -= originRotation.z;
	//get the point of the raycasthit as though the circle were at the origin, and determine the radial distance.
	var localSurfacePointVector = myNewPointer.transform.position-earthPosition;
	var radialDistance = Mathf.Sqrt(Mathf.Pow(localSurfacePointVector.x, 2) + Mathf.Pow(localSurfacePointVector.y,2) + Mathf.Pow(localSurfacePointVector.z, 2));
    //get degree positions on sphere, irrespective of rotation or the lat/long system.
    var lat = Mathf.Rad2Deg * (Mathf.Acos(localSurfacePointVector.y / radialDistance)); //theta
    var lon = Mathf.Rad2Deg * (Mathf.Atan(localSurfacePointVector.x / localSurfacePointVector.z)); //phi
    //apply lat/long as though the sphere were standing still at the origin rotation. Negative numbers mean south or west.
    //I have to do this because, without it, the raycast only detects the north pole as 65 degrees, so I've got to re-scale
    //the numbers to fit latitude and longitude, which go up to 90 and 180. I don't know why this happens. Perhaps because
    //the sphere the earth is on has a scale of approximately 2?
    lat = ((lat-90)/65) * 90 * -1;
    lon = ((lon)/65) * 90 * -1;
    //adjust for current rotation (only works for longitude right now).
    lon+= rotationOffset.y;
    if(lon > 180) lon= -1* (180-(lon%180));
    if(lat > 90) lat= -1* (90-(lat%90));
}

LAT/LONGITUDE TO XYZ: This function takes minute and second values and converts them to XYZ coordinates on the same sphere as above, then sends them to a function that places them. It takes the array infoArray and turns it into longitude/latitude numbers.

function determinePosition(infoArray : Array)
{
	var positionToPlaceMarker : Vector3;
        //convert infoArray to lat and longitude data.
        var latitude = infoArray[0] + infoArray[1]/60;
	if(infoArray[2] == "S") latitude *= -1;
	var longitude = infoArray[3] + infoArray[4]/60;
	if(infoArray[5] == "W") latitude *= -1;
        var earthRadius = earth.collider.radius * earth.transform.localScale.x;
	//because the earth isn't at the origin, we have to add the offset position data.
    //I SUSPECT THESE THREE EQUATIONS ARE WHERE THE PROBLEM IS.
    positionToPlaceMarker.z = earthRadius * Mathf.Cos(latitude) * Mathf.Cos(longitude);
	positionToPlaceMarker.y = earthRadius * -1 * Mathf.Sin(latitude);
	positionToPlaceMarker.x = earthRadius * Mathf.Cos(latitude) * Mathf.Sin(longitude);
	positionToPlaceMarker=positionToPlaceMarker + GameObject.Find("earth").transform.position;
	//sends the data to the function that comes next.
	GameObject.Find("Main Camera").GetComponent("earthGISRaycaster").placeMarker(positionToPlaceMarker);
}

THE PLACING FUNCTION:

function placeMarker(location : Vector3)
{
	//print("Placed a marker!");
	myNewPointer = Instantiate(pointerIcon, location, Quaternion.identity);
	generateTextureColor(myNewPointer); //assigns a random color to the object. assigns the shader to specular.
	myNewPointer.transform.LookAt(earth);
	myNewPointer.transform.Rotate(0,90,0);
	myNewPointer.parent = earth;
	determineGISCoordinates(); // calls the function above, which translates these XYZ coordinates into lat and longitude data. As I've said, this works only very approximately. For example, the marker for Melbourne, Australia ends up approximately 30 degrees north of where it should be, somewhere in the Australian desert :).
	myNewPointer.gameObject.layer = 2;
}

I should note this is not, strictly speaking, a Unity question at all, but a general math programming question! I got a little carried away working on an answer for this one, though, because I found it interesting, so here’s your answer anyway. :)

There are clearly some errors in your math there, more than I care to go through and try to straighten out, so I’m just going to give you, for reference, a pair of working functions that convert between the two.

A note on trig functions: you usually want to use Atan2 for this kind of thing, as it accepts x and y as separate values rather than a single x/y ratio. This allows it to return a full and correct 360-degree range; doing the division before passing to Atan loses the sign of each component and limits you to a 180-degree range. For example, Atan can’t differentiate between (-2,2) and (2,-2), because both divide to just -1 even though they’re definitely different angles; Atan2 handles these cases correctly, saving you the trouble of sorting it out yourself!
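
To make that concrete, here’s a tiny sketch (the numbers are purely illustrative) showing how Atan collapses the two cases while Atan2 tells them apart:

function CompareAtanVariants()
{
    //Atan only sees the ratio, so these two come out identical
    var ratioA = Mathf.Rad2Deg * Mathf.Atan(-2.0 / 2.0);  // -45
    var ratioB = Mathf.Rad2Deg * Mathf.Atan(2.0 / -2.0);  // -45 again
    //Atan2 sees both components, so it can tell the quadrants apart
    var fullA = Mathf.Rad2Deg * Mathf.Atan2(-2.0, 2.0);   // -45
    var fullB = Mathf.Rad2Deg * Mathf.Atan2(2.0, -2.0);   // 135
    print(ratioA + ", " + ratioB + ", " + fullA + ", " + fullB);
}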

I threw this example together; it doesn’t do everything you need, but it should give you a working reference for basic conversion between polar and Cartesian coordinates. I’m using Vector2s to represent the polar coordinates, with x being latitude and y being longitude. For the conversion back from polar, the easiest (and likely most efficient) approach is to use Unity’s built-in vector rotation methods, which is what I did.

function CartesianToPolar(point:Vector3):Vector2
{
    var polar:Vector2;

    //calc longitude
    polar.y = Mathf.Atan2(point.x,point.z);

    //this is easier to write and read than sqrt(pow(x,2) + pow(z,2))!
    var xzLen = Vector2(point.x,point.z).magnitude; 
    //atan2 does the magic
    polar.x = Mathf.Atan2(-point.y,xzLen);

    //convert to deg
    polar *= Mathf.Rad2Deg;

    return polar;
}


function PolarToCartesian(polar:Vector2):Vector3
{

    //an origin vector, representing lat,lon of 0,0
    var origin = Vector3(0,0,1);
    //build a quaternion using euler angles for lat,lon
    var rotation = Quaternion.Euler(polar.x,polar.y,0);
    //transform our reference vector by the rotation. Easy-peasy!
    var point:Vector3=rotation*origin;

    return point;
}
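
For completeness, here’s roughly how you’d wire these into your marker setup. The earth and earthRadius variables below are just stand-ins for however you actually get the sphere’s transform and radius in your scene, so treat this as a sketch rather than drop-in code:

var earth : Transform;
var earthRadius : float = 1.0;

function PlaceMarkerAtLatLon(lat : float, lon : float) : Vector3
{
    //unit direction for this lat/lon, scaled and offset into world space
    var dir : Vector3 = PolarToCartesian(Vector2(lat, lon));
    return earth.position + dir * earthRadius;
}

function LatLonFromWorldPoint(worldPoint : Vector3) : Vector2
{
    //remove the sphere's offset before converting back to lat/lon
    return CartesianToPolar(worldPoint - earth.position);
}

If the sphere itself is rotated (as in your original code), one option is to bring the world point into the sphere’s local space first, for example with earth.InverseTransformPoint, rather than tracking rotation offsets by hand.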

These are written to assume (0,0,1) is latitude and longitude of 0; if you need to change this, you’ll need to change the origin vector in PolarToCartesian, and then rekajigger the order and sign of the x and z terms when calling Atan2 to calculate longitude to correspond.
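
For instance (a purely hypothetical convention, not something taken from your project), if you wanted (0,0,-1) to be latitude/longitude (0,0) instead, one consistent variant looks like this; note that with the same Euler rotation the sign of the latitude term ends up flipped as well, so check both terms against your texture:

function CartesianToPolarBackOrigin(point:Vector3):Vector2
{
    var polar:Vector2;

    //longitude measured away from -z, so both Atan2 arguments flip sign
    polar.y = Mathf.Atan2(-point.x,-point.z);

    var xzLen = Vector2(point.x,point.z).magnitude;
    //with this reference vector, positive latitude corresponds to positive y
    polar.x = Mathf.Atan2(point.y,xzLen);

    return polar * Mathf.Rad2Deg;
}

function PolarToCartesianBackOrigin(polar:Vector2):Vector3
{
    //same rotation trick as above, just starting from the -z reference vector
    return Quaternion.Euler(polar.x,polar.y,0) * Vector3(0,0,-1);
}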

Hope this is helpful!