I’m trying to figure out how to make UI rendered by an “overlay” camera receive events (pointer events etc.) before UI rendered by the base camera.
I have a base camera that renders the UI layer and an overlay camera that renders the Overlay UI layer. The overlay camera is added to the base camera’s stack.
Then I have two canvases, one for UI and one for Overlay UI, both set to Screen Space - Camera and using the base camera and the overlay camera respectively. I added a button on each canvas, and when I click the “overlay button” in the area where it overlaps the “UI button”, the latter gets selected instead.
It’s as if the base camera (which I know is rendered first) gets the events first, but that’s not convenient, because the overlay UI, which is rendered on top, gets the events last.
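For reference, here is roughly the same setup written as a script instead of through the Inspector, assuming URP camera stacking. It’s only a sketch; the class and field names (UiCameraSetup, baseCamera, overlayCamera, uiCanvas, overlayCanvas) are just placeholders for my own objects:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal; // URP camera stacking

// Sketch of the setup described above, done from script instead of the Inspector.
public class UiCameraSetup : MonoBehaviour
{
    public Camera baseCamera;      // renders the "UI" layer
    public Camera overlayCamera;   // renders the "Overlay UI" layer
    public Canvas uiCanvas;        // Screen Space - Camera, driven by the base camera
    public Canvas overlayCanvas;   // Screen Space - Camera, driven by the overlay camera

    void Awake()
    {
        // Mark the second camera as an overlay camera and stack it on the base camera.
        overlayCamera.GetUniversalAdditionalCameraData().renderType = CameraRenderType.Overlay;
        baseCamera.GetUniversalAdditionalCameraData().cameraStack.Add(overlayCamera);

        // Each canvas is Screen Space - Camera and renders through its own camera.
        uiCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        uiCanvas.worldCamera = baseCamera;
        overlayCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        overlayCanvas.worldCamera = overlayCamera;
    }
}
```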
I’m sorry to bump this, but it really annoys me to have a button in front of another and not be able to click it. It’s really weird, and it would be great to be able to change or configure this behaviour.
Still struggling with this. How can I have the overlay camera UI, which is rendered in front of the base camera UI, receive pointer events first? It’s driving me nuts, because it makes no sense that a button behind another one gets clicked when clicking the button in front!
I had the same problem and finally fixed it on my own! Set the tag of the overlay camera to “MainCamera” and set the other camera’s tag to “Untagged” or anything else you want. The event system will automatically prioritize the camera tagged “MainCamera”.
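If you prefer to do the tag swap from a script rather than the Inspector, here is a minimal sketch of the same idea (the class name and camera fields are placeholders for your own objects; nothing else about your setup is assumed):

```csharp
using UnityEngine;

// Gives the overlay camera the "MainCamera" tag so it is treated with priority,
// and removes that tag from the base camera.
public class OverlayCameraPriority : MonoBehaviour
{
    public Camera baseCamera;
    public Camera overlayCamera;

    void Awake()
    {
        overlayCamera.gameObject.tag = "MainCamera"; // Camera.main now returns the overlay camera
        baseCamera.gameObject.tag = "Untagged";      // or any tag other than "MainCamera"
    }
}
```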