Hey Guys!

So lately I have been wrestling with a problem. I have been searching for an answer for a while now and sadly nothing I found really helped. What I want to achieve is to find a point (U) on a circle, given the circle's center point (M) ((2, 4) in this picture, though it can be anything), a point P ((3, 6) here), which may also vary, and the circle's radius r (here r = 1). I figured that if I just subtract M from P I get the direction vector MP, and then multiply that by the radius to get U, but that's obviously wrong *teardrop*. Does anyone have an idea how to fix my problem?

Much love and thanks in advance ~ Julian

You are generally on the right path. What your calculation is missing is that you need to normalize the direction vector (so its length is 1) before scaling it by the radius, and then add the center M back, otherwise you get an offset from the origin instead of a point on the circle.

```
PointOnCircle = M + (P - M).normalized * Radius;
```
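If it helps to see the math spelled out outside of Unity, here is a minimal sketch in plain Python (the function name and the use of 2D tuples are my own choices, not from the engine): normalizing just means dividing the vector by its own length.

```python
import math

def point_on_circle(m, p, r):
    """Return the point on the circle (center m, radius r) in the direction of p."""
    dx, dy = p[0] - m[0], p[1] - m[1]
    length = math.hypot(dx, dy)  # |P - M|
    # Divide by the length to normalize, scale by r, then offset by the center M.
    return (m[0] + dx / length * r, m[1] + dy / length * r)

# Using the values from the question: M = (2, 4), P = (3, 6), r = 1
u = point_on_circle((2, 4), (3, 6), 1)
```

The resulting point U always sits exactly r away from M, no matter how far away P is.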