RaycastHit.point.y is always wrong... this is bizarre!

First, a preamble about what I’m doing: I’m firing a ray downward from my player’s position + Vector3.up so I can determine the height of the ground:

RaycastHit hitInfo;
if (Physics.Raycast(transform.position + Vector3.up, Vector3.down, out hitInfo)
    && hitInfo.collider.tag != "Player")
{
	Debug.Log(hitInfo.point.ToString() + " vs ("
		+ hitInfo.point.x + ", " + hitInfo.point.y + ", " + hitInfo.point.z + ")");
}

My debug output is as follows:
(-18.9, 0.0, 1.3) vs (-18.91656, 4.034199E-07, 1.331655)

Obviously, Vector3.ToString() rounds the floats to one decimal place, but look at the Y value! The ground is definitely at 0.0, so the ToString() output is correct. Why would the float component be wrong? I’ve actually tried parsing the ToString() output and converting it back to a float, and it technically works, but it’s nowhere near accurate enough.

Raycasting is so integral to making a game that I can’t be the only one who has run into this problem, yet my searches turn up nothing.

Additional information: my player is correctly tagged as “Player”, and the ground is an imported FBX with a matching mesh collider.

Please help!

The reason your search didn’t find anything is that there is no problem. Scientific notation - Wikipedia
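To spell it out: 4.034199E-07 means 4.034199 × 10⁻⁷, i.e. about 0.0000004. A quick sketch (in Python, since the idea is language-agnostic) showing how that value prints at different precisions:

```python
y = 4.034199e-07   # the "wrong" Y value from the raycast hit

# At high precision the value is visibly tiny, not 4.03-something:
print(f"{y:.10f}")  # prints 0.0000004034

# At one decimal place it rounds to 0.0 — exactly what
# Vector3.ToString() showed in the debug output:
print(f"{y:.1f}")   # prints 0.0
```

So both halves of the debug line describe the same number; one is just printed in scientific notation.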

0.0000004034… isn’t close enough to zero for you? Are you nuts?
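If you genuinely need to treat near-zero hit heights as zero, compare against a small tolerance instead of expecting exact equality — a minimal sketch in Python of the idea behind Unity’s Mathf.Approximately (the epsilon value here is an illustrative choice, not Unity’s):

```python
def approximately(a: float, b: float, eps: float = 1e-4) -> bool:
    # Rough stand-in for Unity's Mathf.Approximately:
    # two floats are "equal" if they differ by less than eps.
    return abs(a - b) < eps

print(approximately(4.034199e-07, 0.0))  # True  — the hit is on the ground
print(approximately(1.0, 0.0))           # False — genuinely off the ground
```

Never compare raycast hit coordinates with `==` against an exact height; floating-point results are always a hair off.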