Why do "perfect down" raycasts on a quad rotated 90° on the x axis return hit distances that differ by 0.000002?

I'm raycasting against a plane to build some height-map data, based on the height of the mesh at the current point.

I'm using for loops to raycast from positions above this mesh, which is currently a Unity quad mesh. It was rotated 90 degrees on the x axis, typed directly into the Inspector.

This is the relevant part of my routine, inside the for loop:
Vector3 origin = new Vector3(x - Mathf.Ceil(mapWidth / 2), 10, y - Mathf.Ceil(mapHeight / 2));
if (Physics.Raycast(origin, Vector3.down, out hit, 200, 1 << MainGame.layerTerrain))

Then I store hit.distance in an array/list.
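For context, the surrounding loop looks roughly like this (mapWidth, mapHeight and MainGame.layerTerrain are from my code above; "heights" is just an example name for the list I store into):

RaycastHit hit;
List<float> heights = new List<float>();
for (int y = 0; y < mapHeight; y++)
{
    for (int x = 0; x < mapWidth; x++)
    {
        // cast straight down from 10 units above the quad, terrain layer only
        Vector3 origin = new Vector3(x - Mathf.Ceil(mapWidth / 2), 10, y - Mathf.Ceil(mapHeight / 2));
        if (Physics.Raycast(origin, Vector3.down, out hit, 200, 1 << MainGame.layerTerrain))
        {
            heights.Add(hit.distance);
            Debug.Log(hit.distance);
        }
    }
}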

But when I use Debug.Log to inspect the results on this supposedly perfect plane, I get values like the following on all rows:

Does anyone know why there are different hit distances for this somewhat "perfect" plane?

I don't see any differences of 0.002. Are you actually getting offsets that large? What you have here (a 0.000002 difference) looks like ordinary floating-point error to me, and there's nothing you can do about that.

Even if the number printed out were exactly 10, a comparison such as "if (val == 10.0f)" would almost certainly never be true. In the hardware the value could be off by only a single bit, and the print routine rounds it, so the printed value is not always exactly the value represented in hardware.

That's why you never do floating-point == tests unless you're testing against 0, or against a value you explicitly assigned to the variable beforehand. It's even worse with double as far as the printing case goes.
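To make that concrete, here is a sketch of tolerance-based comparison in Unity C#. Mathf.Approximately is Unity's built-in near-equality check; the 0.0001f epsilon below is just an example value you would tune for your own data:

// Never: if (hit.distance == 10.0f) ... a single-bit error makes this unreliable.

// Unity's built-in approximate comparison (epsilon near float precision):
if (Mathf.Approximately(hit.distance, 10.0f))
{
    // treat as "exactly" 10
}

// Or an explicit tolerance chosen for your data:
const float epsilon = 0.0001f;  // example value; tune for your height map
if (Mathf.Abs(hit.distance - 10.0f) < epsilon)
{
    // close enough to 10 for map purposes
}

For your height map, the second form is usually the practical one: pick an epsilon larger than the noise you're seeing (0.000002) but smaller than any height difference you actually care about.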