Is rendering to a RenderTexture changing the output of Camera.ScreenToWorldPoint()?

Hello there!

What am I doing?

I’m attempting to draw a world-space Canvas on a RenderTexture and have the mouse events function properly.

What happens?

The moment I make the UI camera draw to a RenderTexture, it seems to throw off the transformation from screen to world point. That apparently produces a wrong world position inside the uGUI event system, so my buttons won’t receive mouse-over or mouse-out events.

I’m debugging this by fetching a world position from the camera and drawing a ray from there in the forward direction, like so:

// World position of the mouse, fetched from the UI camera
var pos = uiCanvas.worldCamera.ScreenToWorldPoint(Input.mousePosition);
// Draw a debug ray from that position along the forward axis
Ray r = new Ray(pos, Vector3.forward);
Debug.DrawLine(r.origin, r.GetPoint(1000f), Color.blue);
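
A side note for anyone reproducing this: Input.mousePosition comes with z = 0, and ScreenToWorldPoint interprets z as the distance from the camera in world units, so with a perspective camera a zero z collapses every screen point onto the camera’s own position. Something like this (the near-plane distance is just my choice, any positive distance works) puts the point on the near plane instead:

var cam = uiCanvas.worldCamera;
Vector3 mouse = Input.mousePosition;
mouse.z = cam.nearClipPlane;                 // z = distance from the camera in world units
var nearPos = cam.ScreenToWorldPoint(mouse); // mouse position projected onto the near plane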

While the mouse is above a button on screen, the debug ray originates at a completely shifted position. The moment I make the camera render to the screen instead, things work normally again.

I confirmed that the uGUI event system sees the same offset position by moving things around until the wrong mouse position was above the buttons and the events fired.

I also tried converting the screen point to a viewport point first and from there to a world point, but to no avail. Still wonky.
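
For completeness, that attempt looked roughly like this (reconstructed from memory, not copied from the project):

var cam = uiCanvas.worldCamera;
// Normalize the pixel position into [0,1] viewport space first...
Vector3 vp = cam.ScreenToViewportPoint(Input.mousePosition);
vp.z = cam.nearClipPlane; // distance from the camera
// ...then project it back out into world space
var pos = cam.ViewportToWorldPoint(vp);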

So my question is: what needs to be done to account for that offset?
It seems to vary from resolution to resolution, and I can’t find the correct values for the life of me.
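
My unconfirmed suspicion is that once the camera targets a RenderTexture, it interprets “screen” coordinates in the texture’s pixel dimensions, while Input.mousePosition is in backbuffer pixels. If that’s the case, rescaling the mouse position like this might compensate (just a sketch under that assumption):

var cam = uiCanvas.worldCamera;
var rt = cam.targetTexture; // the RenderTexture the camera draws into
// Rescale from backbuffer pixels to RenderTexture pixels
var scaled = new Vector3(
    Input.mousePosition.x / Screen.width  * rt.width,
    Input.mousePosition.y / Screen.height * rt.height,
    cam.nearClipPlane);
var pos = cam.ScreenToWorldPoint(scaled);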

It’s a bit hard to tell without seeing the whole hierarchy and scene, but this may help:

RectTransformUtility.ScreenPointToLocalPointInRectangle (Unity - Scripting API)

I have used this to convert a screen point > render texture rect > its camera > raycast > button.
Once you’re inside the rect, you should convert to [0,1] viewport space instead of pixels; a rough sketch of the whole chain follows below.
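
Something along these lines (untested, and the field names are placeholders; ScreenPointToLocalPointInRectangle’s camera argument should be null for a Screen Space - Overlay canvas, or the canvas camera otherwise):

using UnityEngine;

public class RenderTextureClickThrough : MonoBehaviour
{
    public RectTransform screenRect; // the RectTransform showing the RenderTexture, e.g. a RawImage (placeholder name)
    public Camera rtCamera;          // the camera that renders into the RenderTexture

    void Update()
    {
        // 1. Screen point -> local point inside the rect that displays the texture
        if (!RectTransformUtility.ScreenPointToLocalPointInRectangle(
                screenRect, Input.mousePosition, null, out Vector2 local))
            return;

        // 2. Local point -> [0,1] viewport coordinates (rect.x/y accounts for the pivot offset)
        Rect rect = screenRect.rect;
        var viewport = new Vector2((local.x - rect.x) / rect.width,
                                   (local.y - rect.y) / rect.height);

        // 3. Viewport point -> ray from the RenderTexture camera, then raycast at the buttons
        // (a physics raycast stands in here; feeding the result back into the
        // uGUI event system is its own topic)
        Ray ray = rtCamera.ViewportPointToRay(viewport);
        if (Physics.Raycast(ray, out RaycastHit hit, 1000f))
            Debug.Log("Hit " + hit.collider.name);
    }
}
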
Good luck!