I’m trying to re-create this simple game (http://balldroppings.com/js/) in Unity.

I’ve managed to successfully create flat surfaces (barriers) of the correct width and in the correct position between two vector points (mouse-down and mouse-up), and now I’m trying to get the correct angle.

However I try to get the angle, the result always seems to scale with the distance between the two points rather than reflect the angle between them, and I cannot understand why.

Picture to explain: http://i.imgur.com/pbJDp.png Notice how the ‘longer’ barriers have a steeper angle? That’s not the angle I was trying to create — the code seems to ignore the angle between the vectors and instead derive the rotation from the distance between the two points. How on earth is that happening?

Can anyone explain how to correctly work out the angle between two vector points and get a value between -180 and 180?

```
public var mousePosDown : Vector3;
public var mousePosUp : Vector3;
var barrier : Rigidbody;

function OnMouseDown ()
{
    var mousePosDown2 : Vector3 = Input.mousePosition;
    mousePosDown = Camera.main.ScreenToWorldPoint(mousePosDown2);
}

function OnMouseUp ()
{
    var mousePosUp2 : Vector3 = Input.mousePosition;
    mousePosUp = Camera.main.ScreenToWorldPoint(mousePosUp2);
    var between : Vector3 = mousePosUp - mousePosDown;
    var distance : float = between.magnitude;
    var position : Vector3 = (mousePosUp + mousePosDown) / 2;
    // The angle calculation in question:
    var angle = Vector3.Angle(mousePosDown, mousePosUp);
    var rotation = Quaternion.Euler(0, 0, angle);
    var barrierClone : Rigidbody = Instantiate(barrier, position, rotation);
    barrierClone.transform.localScale = Vector3(distance + 0.01, 0.05, 1);
    barrierClone.transform.position += Vector3(0, 0, 10);
}
```
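For anyone after the same thing: the signed angle of the segment can apparently be computed from the difference vector with atan2 — in Unity presumably `Mathf.Atan2(between.y, between.x) * Mathf.Rad2Deg` — which yields a value in the -180 to 180 range. A minimal sketch of the math in plain Python (function name is mine):

```
import math

def barrier_angle(mouse_pos_down, mouse_pos_up):
    # Signed angle of the segment between the two points, in degrees.
    # atan2 of the difference vector gives a value in (-180, 180],
    # measured from the positive x-axis, independent of where the
    # points sit in the world.
    dx = mouse_pos_up[0] - mouse_pos_down[0]
    dy = mouse_pos_up[1] - mouse_pos_down[1]
    return math.degrees(math.atan2(dy, dx))

print(barrier_angle((0, 0), (1, 1)))    # 45.0
print(barrier_angle((0, 0), (-1, -1)))  # -135.0
print(barrier_angle((5, 5), (6, 6)))    # 45.0 -- same slope, same angle
```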