I’m encountering an issue when changing a Canvas render mode at runtime. Specifically, I instantiate a UI Canvas (e.g., a pause menu) from a prefab, and I want it rendered by a UI Camera that lives in another prefab. So, in the Start method of my MonoBehaviour script, I set the canvas’s worldCamera (the Render Camera field in the Inspector) to this UI Camera. It renders correctly visually; however, the buttons stop working.
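For reference, here’s roughly what that setup looks like. This is a simplified sketch rather than my exact script: the class and field names are placeholders, and in my real project the UI Camera comes from another prefab rather than a serialized reference.

```csharp
using UnityEngine;

public class PauseMenuLoader : MonoBehaviour
{
    // Simplified: in my real project the UI Camera lives in another prefab,
    // but the assignment itself is the same.
    [SerializeField] private Canvas pauseMenuPrefab;
    [SerializeField] private Camera uiCamera;

    void Start()
    {
        // Instantiate the pause menu canvas and point it at the UI Camera.
        Canvas pauseMenu = Instantiate(pauseMenuPrefab);
        pauseMenu.renderMode = RenderMode.ScreenSpaceCamera;
        pauseMenu.worldCamera = uiCamera; // "Render Camera" in the Inspector
    }
}
```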
I did some debugging to find out why. I checked that the GraphicRaycaster on the canvas was present and enabled, ensured the event camera was properly assigned, and verified that the UI layer wasn’t being culled by the camera. Everything seemed fine: the raycast system correctly detects the button when I hover over it, but the button still doesn’t respond to clicks.
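The raycast check was done with something along these lines (a rough sketch of my debug script, not the exact code; it just logs what the UI raycasters hit under the pointer when I click):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

public class UiRaycastDebug : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        // Ask the event system what the UI raycasters hit under the pointer.
        var pointerData = new PointerEventData(EventSystem.current)
        {
            position = Input.mousePosition
        };

        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);

        foreach (var result in results)
            Debug.Log($"UI raycast hit: {result.gameObject.name}", result.gameObject);
    }
}
```

The button shows up in those results, which is why I’m confident the raycast side is working.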
To further investigate, I tried changing the Canvas render mode through the Inspector at runtime. I switched:
- From Screen Space - Overlay (where the button works)
- To Screen Space - Camera (where the button stops working)
- Back to Screen Space - Overlay (the button still doesn’t work).
It seems like once the render mode is changed, the event system stops functioning correctly, and even switching it back doesn’t restore the button’s functionality.
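For completeness, the script equivalent of what I toggled in the Inspector is roughly this (a hypothetical repro snippet, not my actual code; `canvas` and `uiCamera` are just placeholder references):

```csharp
using UnityEngine;

public class RenderModeToggleRepro : MonoBehaviour
{
    [SerializeField] private Canvas canvas;   // the pause menu canvas
    [SerializeField] private Camera uiCamera; // the UI Camera

    // Called from a debug key/button to mirror the Inspector steps above.
    public void SwitchToCameraMode()
    {
        canvas.renderMode = RenderMode.ScreenSpaceCamera;
        canvas.worldCamera = uiCamera; // buttons stop responding after this
    }

    public void SwitchBackToOverlay()
    {
        canvas.renderMode = RenderMode.ScreenSpaceOverlay; // buttons still don't respond
    }
}
```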
So my question is: is it simply not possible to change the Canvas render mode at runtime without breaking the event system, or is there some setup step I’m missing that would make this work?
Has anyone faced this before?