I am trying to create a virtual mouse cursor driven by a gamepad. This cursor is meant to interact with a UI canvas (buttons, etc.) on a second display (monitor). At the same time, the "real" mouse cursor should still interact with another UI canvas on the first display.
What I've noticed so far is that both the real and virtual mice only interact with the first display's UI when the game starts. When I click somewhere on the second display with the real mouse, both mice then interact with the second display's UI instead.
So from what I can tell, it is not possible to have two individual mice/pointers control two different UIs on separate displays.
Can anyone confirm or deny this? At this point I just want to avoid wasting further time on something Unity might not even support.
I encountered the same issue when dealing with a real and a virtual mouse on two different displays. When I clicked on a screen with the real mouse, both the real and virtual mouse's button clicks were registered on the same display, making the game unplayable. In builds, the display showing the virtual mouse was also shifted for some reason, and the buttons on the second screen didn't work at all.
The only workaround I have is to change the canvases to world space, put a camera on each canvas, overlay those cameras onto the player displays, and then detect clicks by raycasting: add a box collider to each button on the virtual-mouse canvas and cast against those colliders. That way, no matter which display was last clicked (and therefore has focus), the virtual mouse can always activate buttons on the second screen. You can then effectively disable the virtual mouse (and treat it as just a pointer), because you're no longer relying on the button event system; the raycast is the substitute.
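As a rough illustration, the raycast workaround above might look something like this sketch. All the names here (`canvasCamera`, `cursorScreenPosition`, the "Submit" button mapping) are assumptions for illustration, not code from the thread; it assumes a world-space canvas rendered by its own camera, a BoxCollider added to each Button, and a virtual cursor position that your own gamepad code updates elsewhere:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the described workaround: instead of relying on the
// EventSystem (which follows OS focus), raycast from the second display's
// camera through the virtual cursor and invoke the hit button manually.
public class VirtualCursorRaycaster : MonoBehaviour
{
    public Camera canvasCamera;          // camera overlaid onto the second display
    public Vector2 cursorScreenPosition; // updated elsewhere from gamepad input

    void Update()
    {
        // "Click" comes from whatever gamepad button you map; the legacy
        // "Submit" axis is used here purely as a placeholder.
        if (!Input.GetButtonDown("Submit"))
            return;

        // Cast from the canvas camera through the virtual cursor position.
        // This works regardless of which display currently has OS focus.
        Ray ray = canvasCamera.ScreenPointToRay(cursorScreenPosition);
        if (Physics.Raycast(ray, out RaycastHit hit, 100f))
        {
            // If the collider belongs to a Button, fire its onClick directly,
            // bypassing the EventSystem entirely.
            var button = hit.collider.GetComponent<Button>();
            if (button != null)
                button.onClick.Invoke();
        }
    }
}
```

Because the click is dispatched manually, the buttons' existing `onClick` listeners still run; only hover/press visuals driven by the EventSystem are lost.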
I'm surprised there wasn't a simpler solution (like manually offsetting the canvases when they are set to Screen Space – Overlay). That would have made things a lot easier, since you could keep using the normal button events.