Why would Vector2.Angle return wrong values?

I run this code

float angle = Vector2.Angle(d, d2);


The debug log then reads:

(1.0, 0.0)
(0.0, 1.0)

When the Debug.Log(angle) line should obviously output 90.0. The same thing happens if I use Vector3.Angle by constructing Vector3s out of d and d2 and leaving their z values at 0.0f.

This only happens if the Vector2s I use are calculated, though. If I manually input

d[0] = 1.0f;
d[1] = .0f;
d2[0] = .0f;
d2[1] = 1.0f;

before running Vector2.Angle(), it returns the expected 90.0f. But that is, of course, of no use. I'm trying to find out whether the directions of two line segments are at, or very close to, 90°. Does anyone know why Vector2.Angle() returns such odd values, or of a better way to determine whether two direction vectors are 90° apart?
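One way to do the perpendicularity test directly (a sketch, not from the original post) is to check whether the dot product of the two normalized directions is close to zero, since for unit vectors the dot product equals the cosine of the angle between them. The helper name and the tolerance value are assumptions; tune the tolerance to how close to 90° you need to be:

```csharp
// Sketch: for unit vectors, Vector2.Dot(d, d2) == cos(angle),
// so a value near 0 means the directions are near 90 degrees apart.
// A tolerance of 0.01f accepts angles within roughly 0.6 degrees of 90.
bool IsRoughlyPerpendicular(Vector2 d, Vector2 d2, float tolerance = 0.01f)
{
    return Mathf.Abs(Vector2.Dot(d.normalized, d2.normalized)) < tolerance;
}
```

This avoids computing an angle at all, and sidesteps any confusion caused by how the vectors happen to be printed.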

If it makes any difference the vectors d and d2 are computed as simply

Vector2 d = (b - a).normalized;
Vector2 d2 = (c - b).normalized;

With a, b, and c being vectors that are input in the inspector.

I guess Vector2.ToString() is rounding the values for nicer output, and your angle is actually correct. Implying that the function returns two different values is a bit off (unless you get different values every time; that would mean Vector2.Angle is nondeterministic and has a bug, which is unlikely).
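To confirm the rounding theory, Unity's Vector2.ToString accepts a numeric format string, so you can print the components at higher precision (a sketch; "F6" is an arbitrary choice of six decimal places):

```csharp
// Print the vectors with six decimal places instead of the default rounded
// output, to see whether d and d2 are really exactly (1, 0) and (0, 1).
Debug.Log(d.ToString("F6"));
Debug.Log(d2.ToString("F6"));
Debug.Log(angle);
```

If the normalized directions come from inspector-entered points, they will usually be off from the exact axes by a small amount, and the angle Vector2.Angle reports will reflect that.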