So I have this script:
If I use inputDirection = inputDirection.normalized, the vector gets normalized, but why doesn't inputDirection.Normalize(); do the same?
public virtual void OnDrag(PointerEventData ped)
{
    Vector2 pos = Vector2.zero;
    if (RectTransformUtility.ScreenPointToLocalPointInRectangle(
        bgImage.rectTransform, ped.position, ped.pressEventCamera, out pos))
    {
        pos.x = pos.x / bgImage.rectTransform.sizeDelta.x;
        pos.y = pos.y / bgImage.rectTransform.sizeDelta.y;

        // Map the local position into the -1..1 range, depending on the pivot.
        float x = (bgImage.rectTransform.pivot.x == 1) ? pos.x * 2 + 1 : pos.x * 2 - 1;
        float y = (bgImage.rectTransform.pivot.y == 1) ? pos.y * 2 + 1 : pos.y * 2 - 1;

        inputDirection = new Vector3(x, 0, y);

        //inputDirection = (inputDirection.magnitude > 1) ? inputDirection.normalized : inputDirection;
        if (inputDirection.magnitude > 1)
        {
            inputDirection.Normalize();
            Debug.Log(inputDirection);
        }

        joystickImage.rectTransform.anchoredPosition = new Vector3(
            inputDirection.x * (bgImage.rectTransform.sizeDelta.x / 3),
            inputDirection.z * (bgImage.rectTransform.sizeDelta.y / 3));
    }
}
Y’know, I was going to post that Vector3.Normalize() is actually a static function that returns a Vector3, but then I realized there’s also a non-static Normalize() function that’s undocumented, at least for Vector3. However, it is documented for Vector2. Oh Unity…
I just tried a test, and it seemed to work fine:
[MenuItem("Window/Test")]
public static void Test()
{
    Vector3 test = new Vector3(0.1f, 0.1f, 0.1f);
    Debug.Log(test.ToString("0.000"));

    var test2 = Vector3.Normalize(test);
    Debug.Log(test2.ToString("0.000"));

    var test3 = test.normalized;
    Debug.Log(test3.ToString("0.000"));

    test.Normalize();
    Debug.Log(test.ToString("0.000"));
}
Output:
(0.100, 0.100, 0.100)
(0.577, 0.577, 0.577)
(0.577, 0.577, 0.577)
(0.577, 0.577, 0.577)
I’m wondering if Debug.Log is actually tripping you up. By default, Vector3.ToString() shows only one decimal place, rounded, hence my “0.000” formatting.
BlackPete:
I just tried a test, and it seemed to work fine: […]
I figured it out. The problem was that I had declared inputDirection as an auto-property:
public Vector3 inputDirection { set; get; }
Since Vector3 is a struct, the getter returns a copy, so inputDirection.Normalize() normalizes that temporary copy instead of the stored value.
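A minimal sketch of why the property version silently fails (the class name, Start() context, and sample values are mine, not from the thread):

```csharp
using UnityEngine;

public class NormalizeRepro : MonoBehaviour
{
    // Same declaration as above: a Vector3 auto-property.
    public Vector3 inputDirection { set; get; }

    void Start()
    {
        inputDirection = new Vector3(3f, 0f, 4f); // magnitude 5

        // The getter hands back a temporary copy of the struct;
        // Normalize() mutates that copy, which is then discarded.
        inputDirection.Normalize();
        Debug.Log(inputDirection.magnitude); // still 5

        // Assigning through the setter stores the normalized value.
        inputDirection = inputDirection.normalized;
        Debug.Log(inputDirection.magnitude); // 1
    }
}
```

Had inputDirection been a plain field instead of a property, inputDirection.Normalize() would have modified it in place.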
It works fine:
“When normalized, a vector keeps the same direction but its length is 1.0.”
from Unity - Scripting API: Vector3.Normalize
length of vector = √((0.5777)² + (0.5777)² + (0.5777)²) = √1.00121187 ≈ 1.0006
For example, a Vector2 would be normalized like this:
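Taking a hypothetical Vector2 (3, 4) as a worked example (numbers mine, not from the thread):

```text
v            = (3, 4)
length       = √(3² + 4²) = √25 = 5
v.normalized = (3/5, 4/5) = (0.6, 0.8)
check:         √(0.6² + 0.8²) = √(0.36 + 0.64) = √1 = 1
```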