Finding RayCastHit's Origin Position

I’m using Rigidbody.SweepTest to let my AI know when to jump. The problem is that when going uphill it never stops jumping: no matter what distance I pass in, the sweep hits the ground right next to the AI, because SweepTest effectively casts from every point on my AI’s body (so the lowest point always hits the ground on an uphill slope). I only want my AI to jump when the slope ahead is 45° or steeper (including 90°, of course). So I tried to use the RaycastHit I get back from Rigidbody.SweepTest to do that:

```csharp
RaycastHit hit;
MyRigidbody.SweepTest(transform.forward, out hit, float.MaxValue);
if (Vector3.Distance(transform.position, ORIGIN) / hit.distance >= 1)
    // Jump
```

The problem is that I don’t know how to get the origin point of the ray (it’s always different, since it can start from any point on my AI’s body). Also, if there’s anything wrong with the rest of my code, please let me know. I’m essentially estimating the slope by dividing the distance from my AI’s lowest point (transform.position) to the ray origin by the distance from the ray origin to the hit point; that ratio is the tangent of the slope angle, and tan(45°) = 1, so a ratio of 1 or more means the slope is at least 45°.

For this situation, do not use transform.position as your ray origin. Use a Ray instead: it lets you define both the origin and the direction of the cast yourself.

References: Unity - Scripting API: Ray
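
A minimal sketch of that idea, assuming the probe is cast from a point a fixed height above the AI's feet (the field names and values here are illustrative, not from the original post):

```csharp
using UnityEngine;

public class SlopeJumpCheck : MonoBehaviour
{
    // Illustrative values; tune these for your AI.
    public float probeHeight = 0.5f;   // cast origin height above the feet
    public float probeDistance = 1f;   // how far ahead to look

    bool ShouldJump()
    {
        // Explicit origin: a point probeHeight above the AI's lowest point,
        // so we always know exactly where the ray starts.
        Vector3 origin = transform.position + Vector3.up * probeHeight;
        Ray ray = new Ray(origin, transform.forward);

        RaycastHit hit;
        if (Physics.Raycast(ray, out hit, probeDistance))
        {
            // Same 45° test as in the question: if a slope starting at the
            // feet reaches probeHeight within hit.distance horizontally,
            // then tan(angle) = probeHeight / hit.distance >= 1 means >= 45°.
            return probeHeight / hit.distance >= 1f;
        }
        return false;
    }
}
```

With a single ray from a known origin, the "hits the ground next to the AI" problem goes away, because the cast no longer starts from the lowest point of the body.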

Okay, I have found the solution. Given two points A and B, the distance between them, and the (normalized) direction from A to B, the following holds:

direction = (B - A) / distance

Rearranging for A and substituting our values (B = hit.point, distance = hit.distance, direction = the normalized direction passed to SweepTest), we get:

origin = hit.point - hit.distance * direction
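
Putting that together with the code from the question (MyRigidbody and the jump check come from there; this is a sketch, and it assumes the sweep direction is normalized):

```csharp
RaycastHit hit;
Vector3 direction = transform.forward.normalized;
if (MyRigidbody.SweepTest(direction, out hit, float.MaxValue))
{
    // Rearranged from: direction = (hit.point - origin) / hit.distance
    Vector3 origin = hit.point - hit.distance * direction;

    // The 45° test from the question, now with a known origin:
    // vertical offset / horizontal distance >= 1 means slope >= 45°.
    if (Vector3.Distance(transform.position, origin) / hit.distance >= 1f)
    {
        // Jump
    }
}
```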