I know this is more of a math question, but I also need to know which functions to apply in Unity.
I have a 3D radar system in my game that displays ships in space as blips on a small readout. It works fine except that there is no upper limit to the distance a blip can be from the center of the readout. As a result, it is possible to completely lose track of ships that are very far away.
I want to create an upper limit to my radar display so that when a blip goes beyond a certain distance, it “sticks” to the very edge of the display. That way I will at least know which direction to turn to pursue the ship.
I think the way to handle this is to calculate the x, y, z coordinates where a ray (drawn from the center of the radar display to the true location of the blip) intersects the outer boundary of the display, which would be a sphere with a radius of 2 units.
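If I have the geometry right, since that sphere is centered on the radar display's own center, the intersection point should just be the blip's offset from the center, normalized and then scaled back up to the radius. Here is a rough, untested sketch of what I mean (the method and parameter names are just placeholders I made up):

    // Untested sketch: push a point that lies outside the radar sphere back
    // onto the sphere's surface, keeping its direction from the center.
    Vector3 ClampToRadarEdge(Vector3 offsetFromCenter, float radarRadius)
    {
        // Normalizing keeps only the direction; multiplying by the radius
        // places the point exactly on the boundary sphere.
        return offsetFromCenter.normalized * radarRadius;
    }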
Right now the location of a blip is calculated like this:
    _rblips[i].position = (_RadarBlips[i].Target.transform.position - CenterPoint.transform.position) / RadarScale;
Where Target.transform.position and CenterPoint.transform.position are positions in “real” space (CenterPoint is the player’s ship) and RadarScale = 1000.

[attached image: xyz_2.jpg]
If the blip goes beyond the 2 unit boundary of my display, I’d like to move it to the very edge, as in the image above.
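Concretely, I imagine my existing line would become something like the following, if Unity’s built-in Vector3.ClampMagnitude does what I think it does (untested; offset is just a local name I introduced here, and 2f is the display radius):

    // Untested idea: compute the blip offset as before, then clamp its length
    // so it can never end up more than 2 units from the radar center.
    // Vector3.ClampMagnitude leaves shorter vectors unchanged.
    Vector3 offset = (_RadarBlips[i].Target.transform.position - CenterPoint.transform.position) / RadarScale;
    _rblips[i].position = Vector3.ClampMagnitude(offset, 2f);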
I tried some basic trig functions, but I’m not sure I’m calculating the original angle correctly. Has anyone tried something like this?