2D Touch Input Problem with ScreenToWorldPoint

I’m trying to get my virtual joystick working in my game. I set everything up using Input.GetMouseButton and the mouse position, and it worked perfectly: the positions were relative to what was on screen, e.g. if I touched 3 units away from the joystick, the game reacted accordingly.

Picture Reference: http://puu.sh/943hC.png

(Random grass and the virtual joystick in the corner).

But since this is a multitouch game, I switched over to this code:

for (var touch : Touch in Input.touches) {
	hit2d = Physics2D.Raycast(Camera.main.ScreenToWorldPoint(Vector2(touch.position.x / 2, touch.position.y / 2)), Vector2.zero);
	if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled) {
		leftFinger = false;
	}
	if (touch.phase == TouchPhase.Began) {
		if (hit2d.transform.tag == "Joystick") {
			leftFinger = true;
		}
	}
}
However, when I check the position of the touch with Input.GetTouch(0).position, the game gives me a point off the screen, and the joystick goes wild. I’m assuming touch input uses a different unit of measurement from mouse input? Can someone please help me?

Don’t use Physics2D for GUITextures.

Use Physics.

ScreenToWorldPoint has a Vector3 input. The z represents the distance from the camera that you wish to shoot your ray.
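A minimal sketch of what that looks like (the distance of 10 units and the `touch` variable are assumptions for illustration; pick a z that matches where your joystick sits relative to the camera):

	// z is the distance in front of the camera at which the
	// screen point is projected into world space
	var worldPoint : Vector3 = Camera.main.ScreenToWorldPoint(
		Vector3(touch.position.x, touch.position.y, 10.0));

Note there is no division by 2 here: touch.position is already in screen pixels, the same space the mouse position uses, so halving it (as in the question's code) shifts every hit toward the bottom-left corner.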

There is already a Joystick prefab in Unity that supports touch. If you can't find it in the Standard Assets import, there's a version in the AngryBots tutorial.

On that note, I don’t think I’ve ever used Raycast for GUI touches.

I use something like

gui.HitTest( touch.position )

where gui is GetComponent(GUITexture) for the Joystick.
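Put together, a sketch of the approach (the joystickGui field and leftFinger flag are hypothetical names; assign the joystick's GUITexture in the Inspector):

	var joystickGui : GUITexture;   // drag the joystick's GUITexture here
	var leftFinger : boolean = false;

	function Update () {
		for (var touch : Touch in Input.touches) {
			if (touch.phase == TouchPhase.Ended || touch.phase == TouchPhase.Canceled) {
				leftFinger = false;
			}
			// HitTest takes a screen-space position directly, so no
			// raycast and no ScreenToWorldPoint conversion is needed
			else if (touch.phase == TouchPhase.Began && joystickGui.HitTest(touch.position)) {
				leftFinger = true;
			}
		}
	}

Since HitTest works in screen space, it sidesteps the whole world-space conversion problem from the question.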