So I have an object with no drag that is launched a long distance from a completely flat surface and lands again on that same flat surface. While in the air, the projectile should be under no force except that of gravity. Before launching it, I want to calculate the distance between the point it starts from and the point where it will land.
As Wikipedia says, the range equation should be: d = (v^2 * sin(2 * angle)) / g.
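Just to illustrate with made-up numbers (these are not from my actual launch): at v = 20 m/s with a 45° launch angle and g = 9.81 m/s^2, that would give d = (20^2 * sin(90°)) / 9.81 ≈ 40.8 m.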
What I have is:
Mathf.Abs(((velocity * velocity) * Mathf.Sin(2 * angle * Mathf.Deg2Rad)) / Physics.gravity.magnitude)
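Wrapped up in a helper (the name PredictRange and the parameters speed and angleDeg are just placeholders of mine, not names from my actual project), what I am computing is essentially this:

function PredictRange(speed : float, angleDeg : float) : float {
    // ideal flat-ground range: d = v^2 * sin(2 * angle) / g
    var angleRad : float = angleDeg * Mathf.Deg2Rad;
    return Mathf.Abs(speed * speed * Mathf.Sin(2 * angleRad) / Physics.gravity.magnitude);
}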
But it’s not working. The actual distance travelled is always a little less than what I calculate; for example, on my last launch I calculated a distance of 2744.497, but the projectile actually landed 2707.329 away. Please help me, this is driving me mad. I don’t know if something is preventing the projectile from acting like a real-life projectile, or if my calculation is incorrect. I am even using doubles instead of floats to avoid floating-point inaccuracy.
If it helps, I calculate the angle with:
var angle : double = Vector3.Angle(direction, Vector3(direction.x, 0, direction.z).normalized);
Where direction is the direction in which the projectile is launched.
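As a rough cross-check on that angle (assuming the launch direction points upward, i.e. direction.y >= 0; the names flat and angleCheck below are just ones I made up for this example), I believe the same elevation angle should also come out of Atan2:

// horizontal component of the launch direction
var flat : Vector3 = Vector3(direction.x, 0, direction.z);
// elevation angle above the horizontal, in degrees
var angleCheck : float = Mathf.Atan2(direction.y, flat.magnitude) * Mathf.Rad2Deg;
// for an upward launch this should match the Vector3.Angle result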