See, I'm trying to drag an object around on the screen plane. I'm succeeding to some extent, but I can't seem to get the original distance of the object from the camera right.
What I'm doing is something like this:
- Grab the world point of the initial raycast that picks the object (where it hits the object).
  `Vector3 hitPoint = raycastHit.point;`
  `Vector3 objectPoint = raycastHit.transform.position;`
- Save an offset from the hit point to the object center, and the distance from the camera to the hit point.
  `worldOffset = objectPoint - hitPoint;`
  `distanceFromCamera = Vector3.Distance(mainCamera.ScreenToWorldPoint(rayOrigin), hitPoint);`
- When I get new screen coordinates, I project them into world space at the distance of the initial hit point, apply the offset to the resulting point, and place the object there.
  `Vector3 pointToProject = new Vector3(finger.CurrentScreenPosition.x, finger.CurrentScreenPosition.y, finger.DistanceFromCamera);`
  `Vector3 newWorldPosition = camera.ScreenToWorldPoint(pointToProject) + finger.WorldOffset;`
This seems to work... except that the initial distance is off: it gets "curved" by the camera's perspective. It all works like a charm in orthographic mode.
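In hindsight, the "curve" comes from the fact that the z passed to `ScreenToWorldPoint` is depth along the camera's forward axis, while the `Vector3.Distance` above measures straight-line distance along the ray: equal ray distances sweep out a sphere, equal depths a plane. One way to correct just that part, a minimal sketch reusing `mainCamera` and `hitPoint` from above:
`// Store depth along the camera's forward axis, which is what ScreenToWorldPoint expects as z.`
`float distanceFromCamera = Vector3.Dot(hitPoint - mainCamera.transform.position, mainCamera.transform.forward);`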
As discussed on IRC:
<Superpig> Southgrove, why are you calculating distanceFromCamera?
<Superpig> Southgrove, you want the object to stay fixed on the camera plane, and you want it to stay at a fixed position relative to the finger point, yes?
<Southgrove> Superpig; hmm.. yes
<Superpig> Southgrove, so, don't calculate distanceFromCamera
<Southgrove> uhm.. ok
<Superpig> Southgrove, just store (objectPoint - hitPoint)
<Superpig> and each frame
<Superpig> add it to ScreenPointToWorld(touch position)
<Superpig> to get the new position for the object
<Southgrove> but.. but..
<Southgrove> ScreenPointToWorld needs a distance to be able to project, no?
<Superpig> oh. use ScreenPointToRay instead
<Superpig> and project the ray to the plane that you want to keep the object in
<Superpig> if you want to keep the object parallel to the camera then the actual distance between the camera and the object *will* vary
<Superpig> it's correct that it will vary
<Southgrove> well worth a try
<Southgrove> aye, that's the point
<Southgrove> or uh.. expected atleast
<Superpig> right, so calculating it and projecting to the same distance each time is *definitely* wrong:)
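In C#, Superpig's ray-to-plane suggestion comes out roughly like this. This is only a sketch: `hitPoint` and `worldOffset` are from the snippets above, and `draggedObject` is an assumed reference to the picked object.
`Plane dragPlane = new Plane(Camera.main.transform.forward, hitPoint);`
`Ray screenRay = Camera.main.ScreenPointToRay(finger.CurrentScreenPosition);`
`float enter;`
`if (dragPlane.Raycast(screenRay, out enter))`
`    draggedObject.transform.position = screenRay.GetPoint(enter) + worldOffset;`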
This is old code (Unity iPhone 1.7??), but you can give it a try:
var object : GameObject;

function Update () {
    for (var touch : iPhoneTouch in iPhoneInput.touches) {
        // Cast a ray from the camera through the touch position.
        var ray = Camera.main.ScreenPointToRay(touch.position);
        var hit : RaycastHit;
        if (Physics.Raycast (ray, hit, 100)) {
            if (touch.phase == iPhoneTouchPhase.Began || touch.phase == iPhoneTouchPhase.Moved) {
                // World origin expressed in camera space; its z is the depth
                // of the origin along the camera's forward axis.
                var cameraTransform = Camera.main.transform.InverseTransformPoint(0, 0, 0);
                // Place the object under the touch, slightly in front of that depth.
                object.transform.position = Camera.main.ScreenToWorldPoint(
                    new Vector3 (touch.position.x, touch.position.y, cameraTransform.z - 0.5));
            }
        }
    }
}
Finally worked it out!
What you need to do is the following:
- Save an offset from the raycast hit point to the object position.
  `Vector3 hitPoint = raycastHit.point;`
  `Vector3 objectPoint = raycastHit.transform.position;`
  `worldOffset = objectPoint - hitPoint;`
- Save a plane for the following interactions (drags).
  `interactionPlane = new Plane(camera.transform.forward, hitPoint);`
- On each subsequent drag, raycast the screen point against that plane and position the object at the resulting point in space plus the `worldOffset` created earlier (see the consolidated sketch after this list).
  `float distance;`
  `Vector3 screenPointV3 = new Vector3(finger.ScreenPoint.x, finger.ScreenPoint.y, 0);`
  `Ray screenRay = camera.ScreenPointToRay(screenPointV3);`
  `finger.InteractionPlane.Raycast(screenRay, out distance);`
  `Vector3 position = screenRay.origin + screenRay.direction * distance;`
  `MoveObjectTo(position + finger.WorldOffset);`
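Putting it all together, here is a rough consolidation of those steps into a single MonoBehaviour. It's a sketch under a few assumptions, not the exact code from this post: mouse input stands in for the finger for brevity (swap in `Input.touches` on device), and names like `DragOnCameraPlane` and `dragged` are made up.

using UnityEngine;

// Sketch: drag a collider-bearing object on a plane parallel to the camera.
public class DragOnCameraPlane : MonoBehaviour {
    Camera cam;
    Transform dragged;        // object currently being dragged, if any
    Plane interactionPlane;   // plane through the hit point, facing the camera
    Vector3 worldOffset;      // from hit point to object center

    void Start () {
        cam = Camera.main;
    }

    void Update () {
        if (Input.GetMouseButtonDown(0)) {
            // Pick the object and record the offset and interaction plane.
            Ray ray = cam.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit)) {
                dragged = hit.transform;
                worldOffset = hit.transform.position - hit.point;
                interactionPlane = new Plane(cam.transform.forward, hit.point);
            }
        }
        else if (Input.GetMouseButton(0) && dragged != null) {
            // Intersect the screen ray with the stored plane and move the object.
            Ray screenRay = cam.ScreenPointToRay(Input.mousePosition);
            float distance;
            if (interactionPlane.Raycast(screenRay, out distance))
                dragged.position = screenRay.GetPoint(distance) + worldOffset;
        }
        else if (Input.GetMouseButtonUp(0)) {
            dragged = null;
        }
    }
}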
...
Profit!