Hey, I’m trying to make a script that places a GameObject wherever a touch is registered on an Android device. It does instantiate the object on touch, but only at the origin point, no matter where I touch.
using UnityEngine;

public class SpawnScript : MonoBehaviour
{
    public GameObject place;

    void Update()
    {
        if (Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Vector3 touchPos = Input.GetTouch(0).position;
            Vector3 createPos = Camera.main.ScreenToWorldPoint(touchPos);
            Instantiate(place, createPos, Quaternion.identity);
        }
    }
}
Camera.ScreenToWorldPoint uses the Z component of the input vector as the distance from the camera, so with z = 0 you just get the camera’s position back. Try putting the camera’s .nearClipPlane value (or whatever distance you want the object at) into touchPos.z and see if it works better.
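For example, a minimal sketch of that change applied to your script (spawnDistance is my own illustrative field, not something from your code, and you may also want the Input.touchCount guard so GetTouch(0) isn’t called when there are no touches):

using UnityEngine;

public class SpawnScript : MonoBehaviour
{
    public GameObject place;
    public float spawnDistance = 5f; // assumed value: how far in front of the camera to spawn

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Vector3 touchPos = Input.GetTouch(0).position;
            touchPos.z = spawnDistance; // Z is the distance from the camera, not a screen coordinate
            Vector3 createPos = Camera.main.ScreenToWorldPoint(touchPos);
            Instantiate(place, createPos, Quaternion.identity);
        }
    }
}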
Hi, you didn’t specify whether your game is 3D, 2D, or 2.5D…
Either way, in 3D the touch by itself isn’t enough.
Think about it: when you touch the screen, you are basically touching a glass plane in front of your virtual camera, which is most likely far away from the point you are looking at. If that’s what you want, what @Kurt-Dekker said will most likely work; you’ll get a point on a plane in front of your camera.
But if you want to place an item in the world, out in front of your tap on the screen, you need Camera.ScreenPointToRay.
It gives you a ray starting at the tap position (on your camera’s “glass”), which you can then pass to Physics.Raycast; the hit position is where you want to place something. So you basically tap the screen, shoot a ray, and wherever the ray hits is your object’s location.
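A minimal sketch of that approach, assuming your scene has colliders for the ray to hit (the class name SpawnOnTap is illustrative):

using UnityEngine;

public class SpawnOnTap : MonoBehaviour
{
    public GameObject place;

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            // Build a ray from the camera through the tap position on the screen "glass".
            Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(0).position);

            // Raycast into the scene; this only does something if there is a collider to hit.
            if (Physics.Raycast(ray, out RaycastHit hit))
            {
                Instantiate(place, hit.point, Quaternion.identity);
            }
        }
    }
}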
Thanks, it now works like I wanted it to. I’m new to touch input and instantiating objects, so thanks a lot. I also used what eses said and now it works wonderfully. Thank you, guys!