I am making a simulated Inertial Measurement Unit (IMU), i.e. a device that detects acceleration (linear and angular, but let's focus on linear first).
I found this script, based mine on it, and converted it to C#: http://wiki.unity3d.com/index.php/IMU_Inertial_Measurement_Unit
Basically it just records the position each step and divides the difference from the previous position by deltaTime to get speed, then records the speed and divides its difference from the previous speed by deltaTime to get acceleration.
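To make the approach concrete, here's a minimal sketch of that two-stage differencing in plain Python (the function and variable names are mine for illustration, not from the wiki script or my Unity code):

```python
# Two-stage backward differencing: position -> velocity -> acceleration.
# prev_pos / prev_vel play the role of the values recorded last step.

def imu_step(pos, prev_pos, prev_vel, dt):
    """Return (velocity, acceleration) estimated by backward differences."""
    vel = (pos - prev_pos) / dt   # first difference: speed
    acc = (vel - prev_vel) / dt   # second difference: acceleration
    return vel, acc
```

With perfectly clean positions this works: for an exact free-fall parabola the second difference recovers the constant acceleration exactly. The trouble only starts once the sampled positions carry noise.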
Problem is that some frames seem to produce inaccurate numbers. For example, if I let an object fall freely, the script doesn't report a smoothly increasing Y value, which would give a smoothly increasing speed and a constant acceleration. Instead, the Y value is slightly off every time it records, so sometimes the velocity increases twice as much as it did the last step, and sometimes it even goes backwards for a step. This of course completely messes up the acceleration calculation. So basically, the position is slightly inaccurate, which causes a much bigger inaccuracy in its derivative, which causes an even bigger inaccuracy in the derivative of that.
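Here's a quick standalone Python check I did of that error amplification (just the math, not my Unity code): a position error of ε in a single sample adds ε/dt to the velocity estimate and up to 2ε/dt² to the acceleration estimate, so at dt = 0.02 s even a 1 mm position error throws the acceleration off by 5 m/s², i.e. half of g:

```python
# Show how a tiny position error blows up in the second difference.
g, dt = 9.81, 0.02

def second_difference(y0, y1, y2, dt):
    # Equals the true acceleration exactly for noiseless quadratic motion.
    return (y2 - 2 * y1 + y0) / dt**2

t = [0.0, dt, 2 * dt]
y = [-0.5 * g * ti * ti for ti in t]   # exact free-fall positions

clean = second_difference(*y, dt)                         # -9.81 exactly
noisy = second_difference(y[0], y[1] + 0.001, y[2], dt)   # 1 mm bump in one sample
print(clean, noisy)   # noisy is off by 2 * 0.001 / dt**2 = 5 m/s^2
```

So even sub-millimetre jitter in the recorded positions is enough to produce the wild acceleration swings I'm seeing.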
So what's a way to get position values that are accurate for the exact sample time? What am I doing wrong?