I am trying to figure out a maths problem. I have a Vector3 position, which I want to adjust slightly through an ‘Offset’ function. I’m trying to understand the theory of how I’d do it.
Basically I have a cube at a Vector3 position (let’s say 10,10,10). I then have a point in the world where it was originally thrown from (let’s say 0,0,0). What I want to do is add X, Y, and Z to its position, but the X/Y/Z I add needs to be added relative to its thrown direction, not the world’s global X/Y/Z axes. So if I was to add Vector3(5,5,5) to the cube’s position, it wouldn’t be (15, 15, 15); it needs to be calculated as if I’m adding the Vector3(5,5,5) along the local direction of Vector3(cube.position - thrown.position).
The thing is, I don’t want to keep its trajectory and add more length to it. I want to actually change its landing position after the trajectory has been fired. So something a little like this:
public bool PreFire(Vector3 firePoint, int shotsFired, float _LR, float _AD)
{
    if (bInitialised)
    {
        if (firePoint == default(Vector3))
            Debug.Log("FireHandler::Fire - Firepoint is null!!");

        Vector3 modifiedPosition = CalculateOffsetPosition(firePoint, _LR, _AD);
        StartCoroutine(Fired(modifiedPosition, shotsFired));
        return true;
    }
    else
    {
        Debug.Log("Fire::PreFire - Unable to start Fired coroutine. This class did not initialise successfully!");
        return false;
    }
}
private Vector3 CalculateOffsetPosition(Vector3 firePosition, float _LR, float _AD)
{
    Vector3 firingPoint = gwData.GetCameraPosition();
    Vector3 offset = new Vector3(_LR * 0.5f, 0, _AD * 0.5f);
    // Note: Unity has no Vector3 * Vector3 operator; component-wise
    // multiplication needs Vector3.Scale.
    Vector3 offsetCalculated = Vector3.Scale((firePosition - firingPoint).normalized, offset);
    Vector3 finalLandingPosition = firePosition + offsetCalculated;
    return finalLandingPosition;
}
Now if you add separation (that is, cube.position - thrown.position) to thrown.position, you end up back at cube.position.
Now that you have the separation vector, you can scale it to any length you want:
float modifier = 1.1f;
cube.position = thrown.position + separation * modifier;
Now the landing position will be 10% further away than it was, but in the same direction.
Lefty is normalizing the separation Vector3, which means making it unit length (a magnitude of 1, but still with the correct direction). This makes it much easier to control the distance.
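A minimal sketch of the normalize-then-scale idea above. It uses System.Numerics.Vector3 so it compiles outside Unity; UnityEngine.Vector3 works the same way, except you’d write `v.normalized` instead of `Vector3.Normalize(v)`:

```csharp
using System;
using System.Numerics; // plain .NET Vector3; UnityEngine.Vector3 behaves the same

public static class NormalizeDemo
{
    public static void Main()
    {
        Vector3 thrown = new Vector3(0, 0, 0);    // where the cube was thrown from
        Vector3 cube   = new Vector3(10, 10, 10); // where it landed

        Vector3 separation = cube - thrown;       // points from thrown towards cube

        // Normalizing keeps the direction but forces the length to 1,
        // which makes the distance easy to control separately.
        Vector3 direction = Vector3.Normalize(separation);

        float modifier = 1.1f;
        Vector3 newLanding = thrown + separation * modifier; // 10% further, same direction

        Console.WriteLine(direction.Length()); // ~1
        Console.WriteLine(newLanding);         // ~(11, 11, 11)
    }
}
```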
No it’s awesome you guys are helping me, thanks :).
What I’m after exactly, though, is calculating a left/right/up/down offset after the final position has been generated. So I’m not scaling further down the trajectory line. Instead I’d like to add an X/Y amount to the final position. However, the X/Y is not added in global/world space; it’s added in local space, treating the trajectory as ‘forward’. Does that make sense?
So if I was to have it as, say, offset Vector3(-50, 100)… What it would mean is that, looking forward from the beginning of the trajectory path, you would see the final hit point 50 Unity units to the left of its normal landing position, and 100 units further away than normal. So the Unity units are being added as if the trajectory line’s forward were the world’s forward.
That sounds perfect. So technically, if I wanted to, I could just create a temporary GameObject at the position, rotated as if it’s looking down the trajectory line, and then create a position using its Transform. That sounds like a simple solution that’d work. Ideally, though, I’d love to know the actual maths to do this without making a temporary GameObject… not because that way isn’t possible, but because I want to learn how to do it correctly without any hacks.
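The temporary-GameObject idea works, but the same thing can be done with plain vector maths by building an orthonormal basis from the throw direction. A sketch, assuming the throw is not exactly vertical (the basis degenerates when forward is parallel to world up). It uses System.Numerics.Vector3 so it runs outside Unity, but UnityEngine.Vector3 has the same Normalize/Cross operations:

```csharp
using System;
using System.Numerics;

public static class TrajectoryOffset
{
    // Shift `landing` by `localOffset`, where the local axes are defined by the
    // throw: +Z = further along the throw, +X = to its right, +Y = above it.
    public static Vector3 OffsetAlongTrajectory(Vector3 start, Vector3 landing, Vector3 localOffset)
    {
        // Forward is the direction of the throw, looking from start to landing.
        Vector3 forward = Vector3.Normalize(landing - start);

        // Build the rest of the basis from world up. NOTE: this degenerates
        // if the throw is exactly vertical (forward parallel to world up).
        Vector3 right = Vector3.Normalize(Vector3.Cross(Vector3.UnitY, forward));
        Vector3 up    = Vector3.Cross(forward, right);

        // Convert the local offset into world space and apply it.
        return landing
             + right   * localOffset.X
             + up      * localOffset.Y
             + forward * localOffset.Z;
    }

    public static void Main()
    {
        // Throw along world +Z so the result is easy to check by hand:
        // (-50, 0, 100) locally means 50 units left and 100 further along the throw.
        Vector3 result = OffsetAlongTrajectory(Vector3.Zero, new Vector3(0, 0, 10),
                                               new Vector3(-50, 0, 100));
        Console.WriteLine(result); // lands at (-50, 0, 110)
    }
}
```

In Unity itself, the equivalent one-liner would be `landing + Quaternion.LookRotation(landing - start) * localOffset`, since rotating a vector by a quaternion applies exactly this change of basis.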
Is it like this?
first is your 2 vectors
second is what you get if you just add the 2 together
third is what happens if you use the first vector as a reference for what ‘up’ is with respect to the second vector
It seems almost perfect! It creates an offset from the target at a different angle than the world coordinates. The only issue is that the angle doesn’t seem to align perfectly with the original location it was thrown from.
Basically what I have is the position where the cube lands, the starting position where it was thrown from, and the offset position where I want it to land. Your diagram seems to explain exactly what I’m after… However, when I apply an offset, it seems to be calculated purely from v1’s rotation and position, not relative to where it was thrown from as well, so its angle goes slightly off track. Are you able to picture what I’m saying?
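One way to diagnose the drift described here: a purely sideways local offset should come out exactly perpendicular to the throw line, so a quick dot-product check can confirm whether the basis is being built from (landing - start) or from something else (such as v1’s own rotation). A sketch, again with System.Numerics.Vector3 and a deliberately non-axis-aligned throw:

```csharp
using System;
using System.Numerics;

public static class AlignmentCheck
{
    public static void Main()
    {
        // A throw that is NOT axis-aligned, to expose any misaligned basis.
        Vector3 start   = new Vector3(2, 0, 1);
        Vector3 landing = new Vector3(10, 10, 10);

        Vector3 forward = Vector3.Normalize(landing - start);
        Vector3 right   = Vector3.Normalize(Vector3.Cross(Vector3.UnitY, forward));

        // A purely sideways local offset must stay perpendicular to the throw
        // line; a non-zero dot product here means the basis was built from the
        // wrong direction vector.
        Vector3 sideways = right * -50f;
        Console.WriteLine(Vector3.Dot(sideways, forward)); // ~0 when aligned
    }
}
```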