ScreenToWorldPoint Problem

For some reason, and I can’t figure out why, the first vector printed from this statement, the one using ScreenToWorldPoint, is always (-92, 46.8, 69.8). Shouldn’t it change based on mouse position?

Thanks!

		Debug.Log(Camera.main.ScreenToWorldPoint(Input.mousePosition).ToString() + " --- " + Input.mousePosition.ToString());

I’ve never tried what I’m about to suggest, but take a look at this:

You have to pass a 3D vector. When you pass only the mouse position, the .z (depth) component is always zero, which corresponds to a point at the camera itself. So the world position returned will always be your camera’s position.

You have to pass a .z value, or alternatively use raycasting.
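
For example, an untested sketch along these lines (distanceFromCamera is just an illustrative name for however far in front of the camera you want the point to land):

var distanceFromCamera : float = 10.0;

function Update () {
    var screenPos : Vector3 = Input.mousePosition;
    // Without a depth, z stays 0 and you just get the camera position back
    screenPos.z = distanceFromCamera;
    var worldPos : Vector3 = Camera.main.ScreenToWorldPoint(screenPos);
    Debug.Log(worldPos + " --- " + Input.mousePosition);
}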


As the above poster mentioned, it wants a Vector3 as an argument, with the z component being the distance from the camera in world units. Something like the following is what you’re after.

// Forum code warning -- written in the reply box, not tested

var zdepth : float;
var newVector : Vector3;

function myfunction () {
    // Retrieve the mouse position on screen
    var x = Input.mousePosition.x;
    var y = Input.mousePosition.y;

    // Convert to a world point zdepth units in front of the camera
    newVector = Camera.main.ScreenToWorldPoint(Vector3(x, y, zdepth));
}

I am having the same problem. I am using Unity iPhone.

Debug shows that the mouse input is changing, and the viewport point is changing, but not the world point.

var input : Vector3 = Vector3(Input.mousePosition.x, Screen.height - Input.mousePosition.y, 0);
var testViewportPoint : Vector3 = Camera.current.ScreenToViewportPoint(input);
var testWorldPoint : Vector3 = Camera.current.ViewportToWorldPoint(testViewportPoint);
testWorldPoint.z = 0.0;

	Debug.DrawLine(objectToMove.transform.position, testWorldPoint);

var layerMask : int = ~(1 << 8);
var hit : RaycastHit;
if (!Physics.Linecast(objectToMove.transform.position,testWorldPoint,hit,layerMask))
{
	Debug.Log(layerMask + " (" + objectToMove.transform.position + ") (" + input + ") (" + testViewportPoint + ") (" + testWorldPoint + ")");
	objectToMove.transform.position = testWorldPoint;
}
else
{
	Debug.Log(layerMask + " (" + objectToMove.transform.position + ") (" + input + ") (" + testViewportPoint + ") (" + testWorldPoint + ") "+ hit.collider.gameObject.name);
}

Sample Debug:

-257 ((-191.5, -285.6, 0.0)) ((2.0, 2.0, 0.0)) ((0.0, 0.0, 0.0)) ((-191.5, -285.6, 0.0))
-257 ((-191.5, -285.6, 0.0)) ((150.0, 249.0, 0.0)) ((0.5, 0.5, 0.0)) ((-191.5, -285.6, 0.0))
-257 ((-191.5, -285.6, 0.0)) ((322.0, 458.0, 0.0)) ((1.0, 1.0, 0.0)) ((-191.5, -285.6, 0.0))

Anyone else having this problem?

ScreenPointToRay seems to work okay.
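
For anyone trying that route, a rough untested sketch of the ScreenPointToRay approach (it assumes whatever sits under the cursor has a collider):

function Update () {
    // Cast a ray from the camera through the mouse position instead of
    // choosing a z depth by hand
    var ray : Ray = Camera.main.ScreenPointToRay(Input.mousePosition);
    var hit : RaycastHit;
    if (Physics.Raycast(ray, hit, Mathf.Infinity)) {
        Debug.Log("World point under mouse: " + hit.point);
    }
}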

I should have mentioned that I am intentionally using the mouse position rather than iPhoneTouches as it makes the game more portable should I decide to transfer my code to another platform (such as Windows or Macintosh, or other future platform).

You should use Camera.main rather than Camera.current. Camera.current is only appropriate for a few uncommon tasks.

I can certainly try it, but I don’t think that’s the problem (I only have one camera in my scene, so current should be equal to main, right?)


Camera.current can refer to the Scene view camera.
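
So the code above should behave once Camera.main is swapped in; an untested sketch mirroring it:

function Update () {
    var input : Vector3 = Vector3(Input.mousePosition.x, Screen.height - Input.mousePosition.y, 0);
    var viewportPoint : Vector3 = Camera.main.ScreenToViewportPoint(input);
    var worldPoint : Vector3 = Camera.main.ViewportToWorldPoint(viewportPoint);
    Debug.Log(viewportPoint + " --- " + worldPoint);
}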

Slem, I have confirmed that using Camera.current instead of Camera.main was the problem. Thank you for your assistance.

Great posts, guys. Thanks, really helped a lot. It’s true that the function converting a screen point to a 3D world point expects a Vector3.
In my case I used the camera’s near clip plane (Camera.nearClipPlane) as the z value, to get the first plane the camera draws, so I could place some objects there instead of using GUI textures for my controls.
Thanks

**Edit.
If you notice, I originally left out the Vector3.
D’oh!
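
A minimal untested sketch of that near-plane idea (with a tiny offset so the point is not clipped by the near plane itself):

function Update () {
    var screenPos : Vector3 = Input.mousePosition;
    // Put the point just in front of the first plane the camera draws
    screenPos.z = Camera.main.nearClipPlane + 0.01;
    var worldPos : Vector3 = Camera.main.ScreenToWorldPoint(screenPos);
    // worldPos now tracks the mouse along the near clip plane
}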

I am having trouble with this.

cPt = Camera.main.ScreenToWorldPoint(aPt.x, aPt.y, camDepth);

is returning an error saying this method does not take (float, float, float).

:?

What you are looking for is:

cPt = Camera.main.ScreenToWorldPoint(new Vector3(aPt.x, aPt.y, camDepth));

Shameless necro. Thanks for the solutions, guys. ScreenToWorldPoint always returns the same point when you pass the mouse position in directly; you have to give it a z value other than 0.

Hey, that helped me a lot! Thankya!