Okay, I’m trying to find the impact force when I have a collision. A lot of people just use the magnitude of the relative velocity, but that isn’t very realistic. Case in point: if a fast-moving object just barely grazes a stationary object, the relative velocity is still the full velocity of the moving object, and a force based on that will be WAY larger than it should be for what was only a grazing contact.

(BTW, I noticed there’s an undocumented “impactForceSum” member of “Collision”, but it appears to just be the same thing as relativeVelocity.)

So I came up with this solution and I’m not getting values that “seem” right to me, so I just want a quick sanity check:

Take the normal of the collision and project the relativeVelocity onto it (a dot product), so only the component of the velocity along the normal counts. (BTW, I suspected the collision normals weren’t properly normalized because I’d get components like 0.3, 0.1, 0.9 that don’t add up to 1 — but summing the components isn’t the right test; it’s the magnitude, sqrt(x^2 + y^2 + z^2), that should be 1, and for those values it’s about 0.95, so they may be fine.)
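Here’s a quick standalone check of that normalization worry (plain Python, not engine code; the sample normal is just the numbers from above):

```python
import math

# Hypothetical normal components as reported by the physics engine.
n = (0.3, 0.1, 0.9)

# The components of a unit vector don't need to sum to 1;
# it's the Euclidean magnitude that should be 1.
component_sum = sum(n)                          # 1.3 -- not the right test
magnitude = math.sqrt(sum(c * c for c in n))    # ~0.954 -- close to unit length

print(component_sum, magnitude)
```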

So for instance, if the collision normal is (1,0,0) and your velocity is (-0.1,0,10) (moving forward and slightly to the left), then the normal component of the velocity is (-0.1,0,0), with a magnitude of 0.1. Compare that with what you get if you just use the magnitude of the full relative velocity (which would be about 10).
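That worked example can be sketched as follows (a minimal plain-Python sketch, not Unity code; `impact_speed` is a name I made up):

```python
def impact_speed(rel_velocity, normal):
    """Magnitude of the relative velocity's component along the (unit) contact normal."""
    # The dot product projects the velocity onto the normal.
    dot = sum(v * n for v, n in zip(rel_velocity, normal))
    return abs(dot)

normal = (1.0, 0.0, 0.0)
velocity = (-0.1, 0.0, 10.0)   # moving forward, slightly to the left

print(impact_speed(velocity, normal))   # 0.1 -- a grazing contact, as expected
```

The full relative-velocity magnitude for the same contact is sqrt(0.1^2 + 10^2), about 10, which is the overestimate the projection avoids.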

Anyway, I have a vehicle that, when it goes up a 30-degree slope, gives me an impact force magnitude rather close to what I get running straight into a vertical wall.

I guess that sort of makes sense, except that I’m tallying damage based on this, and it wouldn’t make sense for a craft (like a hovercraft) to take almost as much damage climbing a hill as it would running into a brick wall.
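For what it’s worth, the projection math predicts only half the head-on value for a 30-degree slope (sin 30° = 0.5), so “almost as much” suggests the velocity or normal being fed in isn’t what we’re assuming. A quick check of the geometry (plain Python, made-up speed, horizontal motion assumed):

```python
import math

def normal_component(velocity, normal):
    # Project velocity onto the unit contact normal (dot product).
    return abs(sum(v * n for v, n in zip(velocity, normal)))

speed = 10.0
velocity = (0.0, 0.0, speed)          # moving horizontally at 10 units/s

# Head-on into a vertical wall: the normal points straight back at us.
wall_normal = (0.0, 0.0, -1.0)

# A 30-degree slope: the normal is tilted 30 degrees from vertical.
theta = math.radians(30)
slope_normal = (0.0, math.cos(theta), -math.sin(theta))

print(normal_component(velocity, wall_normal))   # 10.0
print(normal_component(velocity, slope_normal))  # ~5.0 -- half the wall value
```

If the measured values are much closer together than this, the reported relativeVelocity may already include the velocity change the solver applied, rather than the pre-impact approach velocity.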

Anyway, tell me if I’m missing something. The math seems to work out to me though.