Hi all,
Why does Unity use float instead of double for location data? For a longitude around 112.123° I only get about 3-4 reliable decimal places, which is an error on the order of tens of metres on the ground. I'm losing real GPS accuracy purely because of the data type Unity chose.
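That matches single-precision limits: a float carries roughly seven significant decimal digits in total, and three of them are already spent before the decimal point. A minimal C sketch of the narrowing (112.12345678 is just an illustrative coordinate):

#include <stdio.h>

int main(void)
{
    double longitude = 112.12345678;     /* full GPS precision */
    float  narrowed  = (float)longitude; /* what ends up stored */

    /* With ~7 significant decimal digits in a float, only ~4-5
       digits survive after the decimal point for this value. */
    printf("double: %.8f\n", longitude); /* 112.12345678 */
    printf("float : %.8f\n", narrowed);  /* ~112.12345886 */
    return 0;
}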
On iOS, in iPhone_Sensors.mm, I found the lines below. CoreLocation reports the coordinates as doubles, but Unity's setter takes floats, so each value is silently narrowed at the call site:
void UnitySetLastLocation(double timestamp,
                          float latitude,
                          float longitude,
                          float altitude,
                          float horizontalAccuracy,
                          float verticalAccuracy);

- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation
{
    gLocationServiceStatus.locationStatus = kLocationServiceRunning;
    UnitySetLastLocation([newLocation.timestamp timeIntervalSince1970],
                         newLocation.coordinate.latitude,
                         newLocation.coordinate.longitude,
                         newLocation.altitude,
                         newLocation.horizontalAccuracy,
                         newLocation.verticalAccuracy);
}
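On the native side the fix might be as small as widening the parameters; a hypothetical sketch (the managed-side struct that receives these values would presumably need a matching change, which I haven't checked):

/* Hypothetical widened signature -- not a drop-in patch on its own. */
void UnitySetLastLocation(double timestamp,
                          double latitude,
                          double longitude,
                          double altitude,
                          double horizontalAccuracy,
                          double verticalAccuracy);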
I believe this would be very easy to fix, right?
Could anyone, or someone on the Unity team, have a look at the issue?
Thanks!