I am trying to implement the formula below for projectile range.
In the equations on this page, the following variables will be used:
g: the gravitational acceleration—usually taken to be 9.81 m/s² near the Earth's surface
θ: the angle at which the projectile is launched
v: the velocity at which the projectile is launched
y0: the initial height of the projectile
d: the total horizontal distance traveled by the projectile
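Written out with these variables, the formula (the standard expression for the range of a projectile launched from height y0) is:

d = (v·cos θ / g) · (v·sin θ + √((v·sin θ)² + 2·g·y0))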
I scripted that formula as follows (broken onto several lines here for readability; the grouping is exactly what my one-liner computes):

    float vCos = cannonScript.velocity * Mathf.Cos(cannonScript.angle);
    float vSin = cannonScript.velocity * Mathf.Sin(cannonScript.angle);
    distance = (vCos / gravity) * vSin
             + Mathf.Sqrt(vSin * vSin + 2 * gravity * ini_height);
I print the resulting distance to the console window. The angle is in degrees; I receive it as a float and pass it straight in, and I hope degrees is the right unit here. The initial height is 0 and gravity is 9.81.
I have placed this script on a turret model from which the cannon prefab is launched every time.
I even get negative values in the console window. For example, with an angle of 31 and a velocity of 11, the distance comes out as -0.1142434, which makes no sense for a horizontal range.
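One thing I suspect (this is an assumption on my part, not something I have confirmed): Mathf.Cos and Mathf.Sin expect radians, so passing 31 directly would be read as 31 radians, and sin(31 rad) is negative. I also notice that my one-liner adds the square root term after the multiplication, whereas the formula groups the whole sum (v·sin θ + √(...)) before multiplying by (v·cos θ / g). A sketch of what I could try instead, converting with Unity's Mathf.Deg2Rad and grouping the sum explicitly:

    // Sketch of a possible fix (unverified): convert degrees to radians
    // first, then multiply (vCos / g) by the whole bracketed sum.
    float angleRad = cannonScript.angle * Mathf.Deg2Rad;
    float vCos = cannonScript.velocity * Mathf.Cos(angleRad);
    float vSin = cannonScript.velocity * Mathf.Sin(angleRad);
    distance = (vCos / gravity)
             * (vSin + Mathf.Sqrt(vSin * vSin + 2f * gravity * ini_height));

Is this the right reading of the formula, or am I missing something else?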
The angle and initial velocity are set by the following script; I am using the same angle and velocity values in both places.
Could someone please help me with this? I have been working on it and am not sure where I am wrong. I also believe that all the values I have given as input correspond to their real-world counterparts (metric units).
Thank you in advance.