Precision decimal data type

There have been a couple of links pointing to how Unity supports all the .NET / Mono data types, but it seems that higher-precision types, such as double and decimal, are implicitly converted to float.

I found this out the hard way after spending hours debugging something that turned out to be a decimal point precision issue.

As a sanity check, I tried:

var p:System.Decimal = 12.3456789;
Debug.Log(p.ToString());
Debug.Log(p);

Believe it or not, both will output 12.34568, lopping off the digits after rounding,
and NOT 12.3456789.
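For context, the same rounding is easy to reproduce outside Unity. A minimal Java sketch (Java's float is the same IEEE 754 32-bit type), showing that a float simply cannot store all nine digits while a double can:

```java
public class FloatDigits {
    public static void main(String[] args) {
        // An IEEE 754 float carries only ~7 significant decimal digits.
        float f = 12.3456789f;   // rounds to the nearest representable float
        double d = 12.3456789;   // a double keeps all nine digits

        System.out.println(f);   // prints 12.345679
        System.out.println(d);   // prints 12.3456789
    }
}
```

So the digits are already gone the moment the value is stored as a float; no amount of formatting can bring them back.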

I guess my question is: how do you properly get more decimal-point precision than the float type offers, in Unity?

You need a suffix. Upper- or lowercase should be fine, but UnityScript apparently requires the D to be capitalized, and won’t accept either form of M. No idea why.

var p:System.Decimal = 12.3456789; // float crammed into a decimal
var p = 12.3456789m; // decimal
var p = 12.3456789D; // double
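The same rule, that the literal's suffix rather than the variable's declared type decides the precision, holds across C-family languages. A sketch in Java, using BigDecimal as a stand-in for .NET's decimal:

```java
import java.math.BigDecimal;

public class LiteralSuffixes {
    public static void main(String[] args) {
        double fromFloat  = 12.3456789f; // float literal: digits lost before the assignment
        double fromDouble = 12.3456789d; // double literal: full precision kept

        // Constructing BigDecimal from a String avoids binary rounding entirely.
        BigDecimal exact = new BigDecimal("12.3456789");

        System.out.println(fromFloat);   // already rounded by the float literal
        System.out.println(fromDouble);  // prints 12.3456789
        System.out.println(exact);       // prints 12.3456789
    }
}
```

Note how assigning a float literal to a double doesn't help: the rounding happened when the literal was parsed, before the widening conversion.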

Stop using UnityScript. The C# compiler would have told you what the problem was, and you wouldn’t have had to wait for an answer.

I don’t think this is an example of a decimal-point precision problem. I think this is just string truncation behaving as expected. If your number has too many characters to be printed, it’s shown shortened. Passing a variable to the Log() method implies that it will be converted to a string, which is why you get the same result with .ToString() as you do without it.
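To illustrate the distinction between what is stored and what is printed, here is a small Java sketch: formatting to fewer digits rounds only the string, while the underlying value keeps its full precision.

```java
import java.util.Locale;

public class PrintedVsStored {
    public static void main(String[] args) {
        double d = 12.3456789;

        // Formatting with fewer digits rounds the *string* representation only.
        String shown = String.format(Locale.ROOT, "%.5f", d);
        System.out.println(shown); // prints 12.34568, like the truncated log output

        // The stored value is untouched.
        System.out.println(d);     // prints 12.3456789
    }
}
```

In the original question both effects are in play: the unsuffixed literal is a float (so precision really was lost), and the default ToString then rounds the printed string further.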