Why are my Vector3s returning inaccurate floats? I don't think it's the debug .ToString() issue.

I am trying to simulate a free-fall in a vacuum with an acceleration of -9.81m/s^2 along the y axis.

This is my code:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class behavBall : MonoBehaviour
{
    //!!! Declare Variables:
    [SerializeField] private GameObject goWorldManager;
    private static Vector3 v3InitialRestingVelocity = new Vector3(0f, 0f, 0f);
    private Vector3 v3MyGravityAcceleration;
    private Vector3 v3CumulativeAcceleration;
    private Vector3 v3PreviousPosition;
    private Vector3 v3PreviousVelocity;
    float fTimeStepScaled;

    void PerSecond()
    {
        // : Print out the displacement and previous velocity every second.
        Debug.Log("TIME: " + Time.time + ", DISPLACEMENT: " + transform.position.y + ", PRE-VELOCITY: " + v3PreviousVelocity.y);
    }

    void Start()
    {
        //!!! Initialize Variables:
        v3MyGravityAcceleration = new Vector3(0f, -9.81f, 0f);
        v3PreviousVelocity = v3InitialRestingVelocity;
        v3PreviousPosition = transform.position;

        InvokeRepeating("PerSecond", 1f, 1f);
    }

    void FixedUpdate()
    {
        fTimeStepScaled = Time.fixedDeltaTime;

        v3CumulativeAcceleration = v3MyGravityAcceleration;

        // Trapezoidal integration: average the old and new velocity over the step.
        Vector3 v3CurrentVelocity = v3PreviousVelocity + (v3CumulativeAcceleration * fTimeStepScaled);
        Vector3 v3Displacement = ((v3CurrentVelocity + v3PreviousVelocity) / 2f) * fTimeStepScaled;
        transform.position = v3PreviousPosition + v3Displacement;

        v3PreviousVelocity = v3CurrentVelocity;
        v3PreviousPosition = transform.position;
    }
}
By my calculations my gameobject should have travelled from 0 units to -4.905 units in the first second, and its previous velocity should be -9.81 m/s. But when I print out the displacement after 1 second, it has travelled -5.1 units and has a previous velocity of -10.
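The hand calculation above can be checked with a short sketch (Python for illustration), assuming Unity's default fixed timestep of 0.02 s, i.e. exactly 50 FixedUpdate calls per simulated second:

```python
a = -9.81   # gravitational acceleration, m/s^2
dt = 0.02   # Unity's default Time.fixedDeltaTime

v_prev = 0.0
y = 0.0
for _ in range(50):  # one simulated second = 50 fixed steps
    v = v_prev + a * dt
    y += (v + v_prev) / 2 * dt  # trapezoidal displacement, as in FixedUpdate
    v_prev = v

print(round(y, 3), round(v_prev, 3))  # y ≈ -4.905, v ≈ -9.81
```

So the trapezoidal integration itself does reproduce the expected -4.905 units and -9.81 m/s when exactly 50 steps run.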

It looks like the floats are being rounded inside the calculations I’m doing with the Vectors, but if I do the same calculations using just floats the values are accurate.

What am I missing here?

Call the ToString method of a float explicitly using the number of decimals parameter. For example:
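A minimal sketch of that suggestion, using .NET's standard "F" (fixed-point) numeric format string to ask for 7 decimal places:

```csharp
Debug.Log("PRE-VELOCITY: " + v3PreviousVelocity.y.ToString("F7"));
```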


To my knowledge, about 7 significant digits is the maximum precision a float can represent.

EDIT: It turned out that I wasn’t accounting for the initial execution of FixedUpdate at the start of run-time. Once I did that the values were accurate.

by Streeetwalker:

“OK, it must be that the fixedDeltTime you are using is not in sync with your repeating function - it’s producing larger values than you expect. I bet if you use Time.deltaTime in your fixed update instead you will get more accurate values in accord with your manual calculations.”

  • Using Time.deltaTime did not make a difference but it did turn out to be a syncing issue between game-time and real time.

This article, shared by Streeetwalker, helps: Fix your (Unity) Timestep! — John Austin