Calculating the player's view with the camera


I’m working on a 2D side-scrolling video game. It’s a shooter with a dynamic camera whose distance depends on which weapon the player is using.

[Diagram: camera frustum with the red and yellow lines described below]

RED LINE: player view == weapon distance.
YELLOW LINE: distance from the player to the camera. The angle is taken from the camera's Field of View value (originally 42).

I want to calculate the distance from the player to the camera given the weapon distance, or vice versa — it’s the same problem, and it’s basic trigonometry. To get the weapon distance: b = tan(21°) · 10 ≈ 3.83 units.
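As a quick sanity check of the arithmetic above (a minimal Python sketch; the 10-unit player-to-camera distance is read off the figure):

```python
import math

half_fov = 42 / 2   # half the camera's field of view, in degrees
dist = 10           # player-to-camera distance (yellow line), from the figure

# opposite side of the right triangle: the expected red-line half-extent
b = math.tan(math.radians(half_fov)) * dist
print(b)  # ≈ 3.84
```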

The problem is that this value disagrees with the tests I made:

TEST 1: I created a basic box and scaled it to fill the red-line distance, and it measured more than 6 units.

TEST 2: In case I was wrong with the first test, I drew a vector from the player in the red-line direction with a length of 3.83 units, and it was exactly as short as the numbers say.

What am I doing wrong?


Great question, with a good illustration.

What you’re doing wrong is using half the camera’s field of view to arrive at the 21 degrees. But that field of view is the *vertical* field of view, not the horizontal one. So the horizontal half-angle in your diagram is not 42/2 = 21°.

You can convert the vertical field of view to the horizontal one using the aspect ratio and a little bit of trig; please see the replies in this forum post:
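Here is a minimal Python sketch of that conversion (the 16:9 aspect ratio and the 10-unit camera distance are assumptions — substitute your own values):

```python
import math

def horizontal_fov(v_fov_deg, aspect):
    """Convert a vertical field of view (degrees) to horizontal,
    given the screen aspect ratio (width / height)."""
    half_v = math.radians(v_fov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

v_fov = 42.0       # camera's (vertical) field of view
aspect = 16 / 9    # assumed screen aspect ratio
dist = 10.0        # player-to-camera distance (yellow line)

h_fov = horizontal_fov(v_fov, aspect)
half_width = math.tan(math.radians(h_fov) / 2) * dist
print(h_fov)       # ≈ 68.6 degrees
print(half_width)  # ≈ 6.82 units
```

With a 16:9 screen, the horizontal half-extent comes out near 6.8 units rather than 3.83, which lines up with the "more than 6 units" box measurement in TEST 1.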