Would like some help with some vector math

Hi, I'm stupid and I have a math problem. I've been working on a goalkeeper that reads the position of the ball along his axis and plays the corresponding animation through a blend tree, i.e. if the value is -1 the keeper dives to his right, and so on.

It all works great if I'm shooting from the centre of the pitch, where 0 = play the middle animation, but if I shoot from the side, or if the goalkeeper is rotated to face the ball, it plays the wrong animations, since I'm using values from the default world axes.

So my question is: how do I normalise the ball value that gets sent to the blend tree so that it takes into account the directions the forward vectors of the two reference points are facing, so that the value is always 0 when they face each other directly? Or, in English, how do I zero in on the ball?

The dot product of two normalised vectors tells you whether they point the same way: 1.0f means exactly the same direction, -1.0f the complete opposite.
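A quick sketch of that property in plain Python (the math is the same as Unity's Vector3.Dot; the vectors here are just made-up unit vectors):

```python
def dot(a, b):
    """Dot product of two 3D vectors."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

# Two unit vectors pointing the same way -> 1.0
print(dot((0, 0, 1), (0, 0, 1)))   # 1.0
# Opposite ways -> -1.0
print(dot((0, 0, 1), (0, 0, -1)))  # -1.0
# Perpendicular -> 0.0
print(dot((0, 0, 1), (1, 0, 0)))   # 0.0
```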

Thanks for the reply. I just tried this (Unity - Scripting API: Vector3.Dot) and it seems to return a value scaled by the magnitude of the vector between the two points instead of what's being described.

Edit: Just noticed that the vector needs to be normalised first, brb checking it out.
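That's exactly the issue: dotting the forward vector against the raw keeper-to-ball vector scales the result by distance. Normalising first leaves only the alignment. A small sketch in Python (the positions here are made-up example values):

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

keeper_pos = (0.0, 0.0, 0.0)
keeper_forward = (0.0, 0.0, 1.0)   # already unit length
ball_pos = (0.0, 0.0, 5.0)         # 5 units straight ahead

to_ball = tuple(b - k for b, k in zip(ball_pos, keeper_pos))

# Raw dot: grows with distance, not just direction
print(dot(keeper_forward, to_ball))              # 5.0
# Normalised dot: pure alignment value in [-1, 1]
print(dot(keeper_forward, normalize(to_ball)))   # 1.0
```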

I just found Vector3.Cross, which gives me exactly what I wanted; the output value even seems to match my blend tree perfectly. Thanks again for pointing me in the right direction.
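For anyone finding this later: the y component of the cross product of the keeper's forward vector and the normalised direction to the ball gives a signed left/right value in [-1, 1], with 0 when the ball is dead ahead. Which sign maps to which dive depends on your setup. A sketch of the math in Python (forward is assumed to be +z here):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def cross(a, b):
    """Cross product of two 3D vectors (component formula, as in Vector3.Cross)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

keeper_forward = (0.0, 0.0, 1.0)

# Ball dead ahead -> y component 0 (middle animation)
print(cross(keeper_forward, (0.0, 0.0, 1.0))[1])             # 0.0
# Ball fully to one side (+x) -> y component 1
print(cross(keeper_forward, normalize((1.0, 0.0, 0.0)))[1])  # 1.0
# Ball fully to the other side (-x) -> y component -1
print(cross(keeper_forward, normalize((-1.0, 0.0, 0.0)))[1]) # -1.0
```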