Vector3 Maths Problem

Hi Guys,

I am trying to figure out a maths problem. I have a Vector3 position, which I want to adjust slightly through an ‘Offset’ function. I’m trying to understand the theory of how I’d do it.

Basically I have a cube at position Vector3 (let’s say 10,10,10). I then have a point in the world where it was originally thrown from (let’s say 0,0,0). What I want to do is add X, Y, and Z to its position, but the X/Y/Z I add needs to be added relative to its thrown direction, not the world’s global X/Y/Z axes. So if I was to add Vector3(5,5,5) to the cube’s position, it wouldn’t be (15, 15, 15); it needs to be calculated as if I’m adding the Vector3(5,5,5) along the local direction of (cube.position - thrown.position).

What is the maths required to calculate this?

Thanks!

(cube.position - thrown.position).normalized * distance

make it a unit vector and multiply by the distance you want to go


and distance would be my Vector3?? :slight_smile:

The thing is I don’t want to keep its trajectory and add more length to it. I want to actually change its landing position after the trajectory has been fired. So something a little like this:

    public bool PreFire(Vector3 firePoint, int shotsFired, float _LR, float _AD)
    {
        if (bInitialised)
        {
            // Vector3 is a value type, so it can never be null; default(Vector3) is Vector3.zero
            if (firePoint == default(Vector3))
                Debug.Log("FireHandler::Fire - Firepoint is Vector3.zero!!");

            Vector3 modifiedPosition = CalculateOffsetPosition(firePoint, _LR, _AD);

            StartCoroutine(Fired(modifiedPosition, shotsFired));
            return true;
        }
        else
        {
            Debug.Log("Fire::PreFire - Unable to start Fired coroutine. This class did not initialise successfully!");
            return false;
        }
    }

    private Vector3 CalculateOffsetPosition(Vector3 firePosition, float _LR, float _AD)
    {
        Vector3 firingPoint = gwData.GetCameraPosition();
        Vector3 offset = new Vector3((_LR * 0.5f), 0, (_AD * 0.5f));
        // Vector3 * Vector3 does not compile; Vector3.Scale gives the intended component-wise product
        Vector3 offsetCalculated = Vector3.Scale((firePosition - firingPoint).normalized, offset);

        Vector3 finalLandingPosition = firePosition + offsetCalculated;

        return finalLandingPosition;
    }

No, distance is a float that multiplies your Vector3.

In Lefty’s example you subtract one Vector3 from another, and the result is a Vector3 that describes the direction and distance between the two objects.

Let’s call it “separation”

so

Vector3 separation = cube.position - thrown.position;

Now if you add separation to thrown.position you would end up at cube.position again.

Now you have the separation vector, you can scale it to any length you want.

float modifier = 1.1f;

cube.position = thrown.position + separation * modifier;

Now your cube.position will be 10% further away than it was, but in the same direction.

Lefty is normalizing the separation Vector3, which means making it unit length, essentially 1 (but still with the correct direction). This makes it much easier to control the distance.

Vector3 normalizedSeparation = separation.normalized;

You can now multiply normalizedSeparation by any distance you want. If you want the cube to land 15 units from the throw point you simply

cube.position = thrown.position + normalizedSeparation * 15.0f;
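To make the arithmetic concrete, here is the same walkthrough in plain Python (a language-neutral sketch of the maths; in Unity you would just use `-`, `.normalized` and `*` on Vector3s directly), using the positions from the original post:

```python
import math

# Positions from the thread: thrown from the origin, landed at (10, 10, 10).
thrown = (0.0, 0.0, 0.0)
cube = (10.0, 10.0, 10.0)

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def scale(v, s):
    return tuple(x * s for x in v)

def length(v):
    return math.sqrt(sum(x * x for x in v))

def normalize(v):
    return scale(v, 1.0 / length(v))

separation = sub(cube, thrown)      # direction and distance from throw point to cube
unit = normalize(separation)        # same direction, length exactly 1
# Landing point 15 units along the trajectory, measured from the throw point:
new_position = tuple(t + u for t, u in zip(thrown, scale(unit, 15.0)))
```

Because the direction (10, 10, 10) is diagonal, each component of the 15-unit result ends up at 15/√3, not 15; that is exactly what normalizing buys you.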

I see you ninja’d a reply in while I was typing, sorry if I’m telling you stuff you already know :wink:

No it’s awesome you guys are helping me, thanks :).

What I’m exactly after though is calculating an offset left/right/up/down position after the final position has been generated. So I’m not scaling further down the trajectory line. Instead I’d like to add an X/Y amount to the final position. However the X/Y is not added in global/world coordinates; it’s added in local coordinates, treating the trajectory as ‘forward’. Does that make sense?

So if I was to have it as say: offset Vector3(-50, 100)… What it would mean is, looking forward from the beginning of the trajectory path, you would see the final hit point 50 Unity units to the left of its normal landing position, and 100 units further away than normal. So the units are being added as if the trajectory line’s forward was the world’s forward.

You have values for relative modification.

thrown.transform.position += thrown.transform.right;

will give you the position 1 unit to the right (90 degrees) of the object’s orientation.

You can use up, -up, or -right as well, and scale them as in my previous description.

That sounds perfect. So technically if I wanted to I could just create a temporary GameObject at the position, with the rotation as if it’s looking down the trajectory line, and then create a position using its Transform. That sounds like a simple solution that’d work. Ideally I’d love to know the actual maths to do this without making a temporary GameObject… Not because it’s not possible that way, but because I want to learn how to do it correctly without any hacks :stuck_out_tongue:

Is it like this?
first is your 2 vectors
second is what you get if you just add the 2 together
third is what happens if you use the first vector as a reference for what ‘up’ is with respect to the second vector

If so, the code is:

Vector3 v1 = ...;
Vector3 v2 = ...;
Quaternion q = Quaternion.FromToRotation( Vector3.up, v1 );
Vector3 result = v1 + q * v2;

In 3 dimensions, you need to decide what the reference direction is (line 3 of the code above; I have used up, you’ll likely want to use forward)
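For anyone wanting the maths behind Quaternion.FromToRotation rather than the engine call, the same rotate-the-offset idea can be sketched in plain Python using Rodrigues’ rotation formula. This assumes v1 is a direction (not a world position), uses forward = (0, 0, 1) as the reference axis, and the numbers are illustrative only:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(v):
    l = math.sqrt(dot(v, v))
    return tuple(x / l for x in v)

def rotate_from_to(frm, to, v):
    """Rotate v by the shortest rotation taking direction frm onto direction to
    (Rodrigues' formula; the same job Quaternion.FromToRotation does)."""
    f, t = norm(frm), norm(to)
    axis = cross(f, t)
    s = math.sqrt(dot(axis, axis))   # sin of the angle between f and t
    c = dot(f, t)                    # cos of the angle
    if s < 1e-9:                     # f and t parallel: nothing to rotate
        if c > 0:
            return v
        raise ValueError("antiparallel directions: rotation axis is ambiguous")
    k = norm(axis)
    kxv = cross(k, v)
    kdv = dot(k, v)
    return tuple(v[i]*c + kxv[i]*s + k[i]*kdv*(1 - c) for i in range(3))

forward = (0.0, 0.0, 1.0)       # reference axis, like Vector3.forward
direction = (1.0, 0.0, 0.0)     # example trajectory direction (a direction, not a position!)
offset = (-50.0, 0.0, 100.0)    # 50 left, 100 further, expressed locally
world_offset = rotate_from_to(forward, direction, offset)   # (100.0, 0.0, 50.0)
```

With the trajectory pointing down world +X, the local “100 further” becomes world +X and the local “50 left” becomes world +Z, which is the third picture in the diagram.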

Well you can make your thrown object face along its trajectory

Use my original example to get the position one unit further along the trajectory, then LookAt() that position.

The forward, up and right will all be relative to the object’s position, with forward continuing along its current trajectory.
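The temporary-GameObject idea can also be done with pure vector maths, which is essentially what LookAt computes internally: build the trajectory’s own right and up axes with cross products. A minimal sketch in plain Python, assuming world up is (0, 1, 0) and the trajectory is not vertical (`offset_along_trajectory` is a hypothetical helper, not a Unity API):

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def norm(v):
    l = math.sqrt(sum(x * x for x in v))
    return tuple(x / l for x in v)

def offset_along_trajectory(origin, landing, right_amt, up_amt, fwd_amt):
    """Shift `landing` by right_amt/up_amt/fwd_amt along axes oriented so that
    'forward' points from origin to landing (assumes the path is not vertical)."""
    forward = norm(tuple(l - o for l, o in zip(landing, origin)))
    right = norm(cross((0.0, 1.0, 0.0), forward))  # world up x forward
    up = cross(forward, right)                     # perpendicular unit vectors, so already unit length
    return tuple(landing[i] + right[i]*right_amt + up[i]*up_amt + forward[i]*fwd_amt
                 for i in range(3))

# Cube thrown from the origin, landing 10 units down +Z; nudge it 5 right, 2 up, 3 further:
shifted = offset_along_trajectory((0.0, 0.0, 0.0), (0.0, 0.0, 10.0), 5.0, 2.0, 3.0)
# shifted == (5.0, 2.0, 13.0)
```

The cross-product formula is the same one Vector3.Cross uses, so in Unity the three axes match transform.right, transform.up and transform.forward after a LookAt along the trajectory.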

Hi John,

That looks like what I’m after. I’ve tried recreating it.

    private Vector3 CalculateOffsetPosition(Vector3 firePosition, float _LR, float _AD)
    {
        Vector3 v1 = firePosition;
        Vector3 v2 = new Vector3((_AD * 0.5f), 0, (_LR * 0.5f));
        Quaternion q = Quaternion.FromToRotation(Vector3.forward, v1);
        Vector3 result = v1 + q * v2;
        result.y = Terrain.activeTerrain.SampleHeight(result) + 0.5f;

        return result;
    }

It seems almost perfect! It creates an offset from the target, which is at a different angle than the world coordinates. The only issue is that the angle doesn’t seem to align perfectly from the original location it was sent from.

Basically what I have is the position where the cube lands. I have the starting position, where it was thrown from, and I have the offset position where I want it to land. Your diagram seems to explain exactly what I’m after… However when I make an offset, it seems to do it purely from v1’s rotation and position, not relative to where it was thrown from as well… So its angle seems to go off track slightly. Are you able to imagine what I’m saying?

Thanks for the help guys!

Not really.
When you talk about firing trajectories and such you usually want arcs and parabolas, so all this straight-line stuff is counter-intuitive.

draw us a piccy?