What am I doing?
I’m attempting to draw a worldspace Canvas on a RenderTexture and have the mouse events function properly.
The moment I make the UI camera draw to a RenderTexture, it seems to confuse the screen-to-world-point transformation. That apparently produces a wrong world position inside the uGUI event system, so my buttons no longer receive mouse-over or mouse-out events.
I’m debugging that by fetching a world position from the camera and drawing a ray from there in the forward direction, like so:
var pos = uiCanvas.worldCamera.ScreenToWorldPoint(Input.mousePosition);
Ray r = new Ray(pos, Vector3.forward);
Debug.DrawLine(r.origin, r.GetPoint(1000), Color.blue);
While the mouse is over a button on screen, the ray originates at a completely shifted position. The moment I make the camera render to the screen again, everything works normally.
I confirmed that the uGUI event system sees the same offset position by moving things around until the (wrong) mouse position landed on the buttons and the events fired.
I also tried converting the screen point to a viewport point first and from there to world space, but to no avail. Still wonky.
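For reference, that viewport detour looked roughly like this (a sketch from memory; uiCanvas is my worldspace Canvas whose worldCamera renders into the RenderTexture):

```csharp
using UnityEngine;

// Attempted workaround: go through viewport space instead of straight
// from screen to world. Gives the same shifted result for me.
var cam = uiCanvas.worldCamera;
Vector3 viewport = cam.ScreenToViewportPoint(Input.mousePosition);
Vector3 world = cam.ViewportToWorldPoint(
    new Vector3(viewport.x, viewport.y, cam.nearClipPlane));
Debug.DrawLine(world, world + Vector3.forward * 1000f, Color.red);
```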
So my question is: what needs to be done to account for that offset? It seems to vary from resolution to resolution, and I can’t find the correct values for the life of me.
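My current guess is that the mouse position first has to be rescaled from screen pixels into the RenderTexture’s pixel space before converting, since the offset scales with resolution. An untested sketch of that idea (renderTexture here is the texture the UI camera draws into; the name is from my project):

```csharp
using UnityEngine;

// Guess: a camera with a targetTexture may interpret screen points in
// texture-pixel space, so remap the mouse position from screen pixels
// to RenderTexture pixels before ScreenToWorldPoint.
Vector3 mouse = Input.mousePosition;
Vector3 scaled = new Vector3(
    mouse.x * renderTexture.width / Screen.width,
    mouse.y * renderTexture.height / Screen.height,
    mouse.z);
var pos = uiCanvas.worldCamera.ScreenToWorldPoint(scaled);
```

Is something along these lines the right direction, or is there a built-in way to handle this?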