Screenspace position from worldspace is giving wrong data

I am trying to draw a UI element corresponding to a 3D game object in the scene. The target game object is off-screen, so I want to draw an arrow in the corner of the screen pointing toward the target. Here is my scene setup.


The selected object is the target. The green object is just a circular platform. I am using the following code to compute the screen-space position:

    [SerializeField] Camera cam;
    [SerializeField] Transform whiteSphere, target;
    [SerializeField] Vector3 targetWorldPos, targetSSPos, clampedSSPos;
    [SerializeField] Transform uiFollower;

    // Update is called once per frame
    void Update()
    {
        targetWorldPos = target.position;
        targetSSPos = cam.WorldToScreenPoint(targetWorldPos);

        // Keep the indicator at least 10 px inside the screen edges.
        clampedSSPos = targetSSPos;
        clampedSSPos.x = Mathf.Clamp(clampedSSPos.x, 10, Screen.width - 10);
        clampedSSPos.y = Mathf.Clamp(clampedSSPos.y, 10, Screen.height - 10);
        clampedSSPos.z = 0.0f;

        uiFollower.position = clampedSSPos;

        Debug.DrawLine(whiteSphere.position, target.position, Color.red);
    }

And my hierarchy:

For the result below, the screen-space position (Target SS Pos) shown in the Inspector is incorrect, isn’t it? Shouldn’t the Y value be negative?

And the result:

So what has gone wrong?

So apparently, because my camera is angled, the screen-space value exceeds the maximum value of a float and I get garbage values. The output of Camera.WorldToScreenPoint is not consistent either. Is this a bug?

The line uiFollower.position = clampedSSPos; uses a screen-space position to drive the uiFollower… are you sure the uiFollower actually lives in that space? It could, but check first (e.g. hard-wire the (10, 10) and (Screen.width - 10, Screen.height - 10) positions and see whether those land where you expect, as in the sketch below), because that obviously depends on the Canvas and CanvasScaler setup.
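A minimal sketch of that check, assuming uiFollower is the same reference as in the script above and that the legacy Input Manager is active (the class name is just a placeholder):

    using UnityEngine;

    // Temporary test component: pin the follower to two hard-wired screen
    // positions so you can see whether its Canvas/CanvasScaler setup actually
    // maps raw screen coordinates to the expected on-screen locations.
    public class FollowerSpaceTest : MonoBehaviour
    {
        [SerializeField] Transform uiFollower;   // same object the main script drives

        void Update()
        {
            // Hold space to jump between the two test corners.
            Vector3 testPos = Input.GetKey(KeyCode.Space)
                ? new Vector3(Screen.width - 10, Screen.height - 10, 0f)   // near top-right
                : new Vector3(10, 10, 0f);                                 // near bottom-left

            uiFollower.position = testPos;
        }
    }

If the arrow sits exactly 10 pixels from the corners, the follower really is positioned in screen space; if it ends up somewhere else, the clamped values are being interpreted in a different space and the Canvas setup needs attention first.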

You must find a way to get the information you need in order to reason about what the problem is.

What is often happening in these cases is one of the following:

  • the code you think is executing is not actually executing at all
  • the code is executing far EARLIER or LATER than you think
  • the code is executing far LESS OFTEN than you think
  • the code is executing far MORE OFTEN than you think
  • the code is executing on a different GameObject than you think it is
  • you’re getting an error or warning and you haven’t noticed it in the console window

To help gain more insight into your problem, I recommend liberally sprinkling Debug.Log() statements through your code to display information in realtime.
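For instance, in the Update() above you could log the intermediate values every frame; this is just a sketch reusing the field names from your script:

    void Update()
    {
        targetWorldPos = target.position;
        targetSSPos = cam.WorldToScreenPoint(targetWorldPos);

        // Print the raw values so they can be compared with what the Inspector shows.
        Debug.Log($"world={targetWorldPos}  screen={targetSSPos}  screenSize=({Screen.width}, {Screen.height})");

        // ... rest of the method unchanged
    }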

Doing this should help you answer these types of questions:

  • is this code even running? which parts are running? how often does it run? what order does it run in?
  • what are the values of the variables involved? Are they initialized? Are the values reasonable?
  • are you meeting ALL the requirements to receive callbacks such as triggers / colliders? (review the documentation)

Knowing this information will help you reason about the behavior you are seeing.

You can also supply a second argument to Debug.Log(); when you click the message in the Console, it will highlight that object in the scene, e.g. Debug.Log("Problem!", this);

If your problem would benefit from in-scene or in-game visualization, Debug.DrawRay() or Debug.DrawLine() can help you visualize things like rays (used in raycasting) or distances.
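For example, drawing a line from the camera to the target (in addition to the white-sphere line already in the script) makes the camera/target relationship visible while playing; a sketch using the same cam and target fields:

    // Line from the camera to the target, visible in the Scene view while playing
    // (and in the Game view if Gizmos are enabled).
    Debug.DrawLine(cam.transform.position, target.position, Color.yellow);

    // Ray showing the camera's forward direction, to compare against where the
    // target actually sits relative to the view.
    Debug.DrawRay(cam.transform.position, cam.transform.forward * 5f, Color.cyan);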

You can also call Debug.Break() to pause the Editor when certain interesting pieces of code run, and then study the scene manually, looking for all the parts, where they are, what scripts are on them, etc.
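In this thread that could be a guard around the computed screen position from the script above; the threshold below is arbitrary and just a sketch:

    // Pause the Editor the moment the screen-space value leaves a plausible range,
    // so the scene can be inspected in exactly the frame where it goes wrong.
    if (Mathf.Abs(targetSSPos.x) > Screen.width * 10f ||
        Mathf.Abs(targetSSPos.y) > Screen.height * 10f)
    {
        Debug.Log($"Suspicious screen position: {targetSSPos}", this);
        Debug.Break();
    }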

You can also call GameObject.CreatePrimitive() to emplace debug-marker-ish objects in the scene at runtime.
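For example, dropping a small sphere at the target's world position whenever something interesting happens; the one-second lifetime is arbitrary:

    // Spawn a small sphere as a visual breadcrumb at the target's world position,
    // then remove it after a second so the scene does not fill up with markers.
    GameObject marker = GameObject.CreatePrimitive(PrimitiveType.Sphere);
    marker.transform.position = target.position;
    marker.transform.localScale = Vector3.one * 0.25f;
    Object.Destroy(marker, 1f);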

You could also just display various important quantities in UI Text elements to watch them change as you play the game.
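A sketch of that, assuming a UnityEngine.UI.Text element has been added to the Canvas and dragged into the debugText field (both the field and the class name are assumptions):

    using UnityEngine;
    using UnityEngine.UI;

    // Shows the computed screen position on screen so it can be watched while playing.
    public class DebugReadout : MonoBehaviour
    {
        [SerializeField] Text debugText;   // hypothetical Text element on the Canvas
        [SerializeField] Camera cam;
        [SerializeField] Transform target;

        void Update()
        {
            Vector3 ss = cam.WorldToScreenPoint(target.position);
            debugText.text = $"Screen pos: {ss}\nScreen size: {Screen.width} x {Screen.height}";
        }
    }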

If you are running on a mobile device you can also view the console output; search for how to do this on your particular target, e.g. "How To - Capturing Device Logs on iOS" or "How To - Capturing Device Logs on Android".

Another useful approach is to temporarily strip out everything besides what is necessary to prove your issue. This can simplify and isolate compounding effects of other items in your scene or prefab.

Here’s an example of putting in a laser-focused Debug.Log() and how that can save you a TON of time wallowing around speculating what might be going wrong:
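In this thread, a laser-focused version might log only when the computed position falls outside the screen, so the Console is not flooded every frame; a sketch reusing targetSSPos and targetWorldPos from the script above:

    // Log only the frames where the target is actually off-screen.
    // Note: the z returned by WorldToScreenPoint is the point's distance in front
    // of the camera, so a negative z means the point projected from behind the camera.
    bool offscreen = targetSSPos.x < 0 || targetSSPos.x > Screen.width ||
                     targetSSPos.y < 0 || targetSSPos.y > Screen.height;
    if (offscreen)
    {
        Debug.Log($"Off-screen: screen={targetSSPos}  world={targetWorldPos}", this);
    }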

The code does execute. If the camera is top-down and tilted around the X axis (mine is at about 43 degrees), then beyond a certain amount of world-space movement the screen-space value exceeds the maximum value of a float. When that happens, the API returns wrong data. You can test it out yourself: I have attached a Unity package containing just a script and a scene, nothing else. Move the “worldObject” along the ZX plane and see for yourself.

8378235–1104576–unityBug.unitypackage (3.64 KB)