I’m working on a project in Unity 5.3 with multiple display support. I’m rendering a separate canvas to each display at fullscreen native resolution. The idea is that one display is a touchscreen for input while the other is a regular non-touch display for output.
Everything works great in the editor. When running a build, everything displays correctly, but no input from the touchscreen is registered when touching UI buttons.
I found this post from an old beta thread that seemed to say that input didn’t work on multiple displays, but the thread hasn’t been updated in several months.
Is there any update on this? Is it just impossible to get this working currently? Are there any workarounds?
We disabled multiple display support in 5.3 because it had a bug that caused non-native resolutions to be reported incorrectly. This broke all UI, so we had to disable it. Unfortunately it is still not fixed and will not be fixed in 5.4. We are waiting on the multiple display team, but it’s not a simple fix. So for the moment, multiple displays and UI do not work together. Once it’s fixed we will backport it to a patch if possible.
So, if I need to develop a multi-display app designed for two 1080p monitors, for example, is the best workaround to just build a single canvas at 1920x2160 resolution (1080p x 2) and rely on the OS virtual desktop to stretch the app across multiple displays?
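For reference, the spanning workaround described above can be sketched roughly like this. This is a hypothetical bootstrap script, not an official recipe: the exact resolution (1920x2160 for two vertically stacked 1080p monitors) and the use of the `-popupwindow` launch flag to hide the window border are assumptions about one particular setup.

```csharp
using UnityEngine;

// Sketch of the single-window spanning workaround: size the player window to
// cover both monitors and let the OS virtual desktop stretch it across them.
public class SpanningWindow : MonoBehaviour
{
    void Start()
    {
        // 1920x2160 = two 1080p displays stacked vertically.
        // fullscreen = false so the window is allowed to span monitors;
        // launch the player with -popupwindow to remove the border.
        Screen.SetResolution(1920, 2160, false);
    }
}
```

With this approach you use a single camera/canvas layout split in half, so Unity UI input works normally since there is only one "display" as far as Unity is concerned.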
I’ve done this in the past and it seemed to work OK. But when I saw that proper multi-display support was added in 5.3, I was eager to give it a try.
I’ve been testing builds with 5.3.1 and they’ve been working and displaying properly. It’s just that input (registering touches on Unity UI buttons) hasn’t been working.
Are you saying input works properly in 5.3.0? I am planning on full native res for both screens, so, if that’s true, it’s certainly a possible solution for me.
No, unfortunately significant issues were found in the multiple display system, so it’s with the Graphics team at the moment. I’ll chase it up and see what the status is.
Karl, were you able to get an update on this patch or its release date? My team is working on a project that utilizes multiple screens and would like some info or possible workarounds.
This issue is not fixed yet. I need to chase up with the Graphics team, but don’t expect a fix for some time.
How is your multiple display system going to work? How many screens and will the UI be on all of them?
Is there any update on this bug? I’m currently running 5.5.0f3 and it seems multi-display input still isn’t working. Unlike the OP, I require input on both displays.
I am trying to have the gameplay on one monitor and the UI on a touch screen. Does this work in Unity 5.5? I followed the multi-display tutorial, but the display count it reports is 1; it doesn’t even detect my 2nd monitor.
Yes, it should be no problem for what you want. What platform is this on? The editor will always report 1 display but will still work; you just don’t need to activate displays in the editor. The player should return the correct number of displays.
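The activation Karl mentions looks something like the snippet below. This is a minimal sketch using Unity’s documented `Display` API: in a built player, `Display.displays` lists the connected displays (the editor always reports one), and secondary displays stay inactive until you call `Activate()` on them.

```csharp
using UnityEngine;

// Activate all connected displays in the built player.
// In the editor Display.displays.Length is always 1, so this loop is a no-op there.
public class ActivateDisplays : MonoBehaviour
{
    void Start()
    {
        // Display 0 (the primary) is always active; activate the rest.
        for (int i = 1; i < Display.displays.Length; i++)
        {
            Display.displays[i].Activate();
        }
    }
}
```

Attach this to any object in your first scene; cameras and canvases with a Target Display of 2 or 3 will only show output once the corresponding display has been activated.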
Hi Karl,
Thanks for the reply; it works fine now. However, I’ve noticed some issues:
Input doesn’t work properly in the editor: I have to figure out which Game tab actually accepts input (let’s say I have 3 Game tabs with Target Display 1, 2, and 3). I want input to be accepted on Target Display 1 consistently, but that’s not happening in the editor. Also, I cannot maximize the Game tab docked in the editor when I have multiple Game tabs open. In the built exe it works fine (to the best of my knowledge; I did not test it deliberately).
When I select a Game tab with Target Display X, the UI on Target Displays Y and Z changes temporarily. I believe the editor is trying to apply the aspect ratio of the selected Game tab to all the other Game tabs?
Now I have the game on Target Display 1, UI X on Target Display 2, and UI Y on Target Display 3. These players share some common data, so I am trying to create a common canvas that can be rendered on both Target Display 2 and Target Display 3. How can I create one canvas and render it on 2 of my 3 target displays? (Note: I am not using a camera to render the UI; it’s all Screen Space - Overlay, and I choose the target display on the canvas.)
Thank you for getting back to me so quickly. I appreciate that.
There is currently no way to do this. You need to either use a world-space UI (and use layers to control which cameras see it) or have a canvas per camera.
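The world-space-plus-layers approach above can be sketched as follows. This is an illustrative setup, not a canonical one: the layer name "SharedUI" and the camera/canvas references are assumptions you would wire up for your own scene.

```csharp
using UnityEngine;

// Share one World Space canvas between two display cameras by putting it on a
// dedicated layer and including that layer in both cameras' culling masks.
public class SharedCanvasSetup : MonoBehaviour
{
    public Canvas sharedCanvas;   // a canvas with Render Mode = World Space
    public Camera display2Cam;    // camera whose Target Display is 2
    public Camera display3Cam;    // camera whose Target Display is 3

    void Start()
    {
        // "SharedUI" is a layer you must define in Tags & Layers first.
        int shared = LayerMask.NameToLayer("SharedUI");
        sharedCanvas.gameObject.layer = shared;

        // Both display cameras render the shared layer; any camera that
        // should NOT see the shared UI simply masks the layer out.
        display2Cam.cullingMask |= 1 << shared;
        display3Cam.cullingMask |= 1 << shared;
    }
}
```

Because both cameras render the same world-space canvas, any data shown there only has to be updated once, which is the "common shared data" behavior being asked about.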
The editor has limited multiple display support. It does not know which Game view/display ID is being interacted with, so the input goes to all of them. It’s just a case of something that needs more work on our end. The only platform that can currently distinguish between the inputs is the Windows Standalone Player. We have plans to address this in the future.
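For readers hitting this on the Windows Standalone Player, Unity exposes `Display.RelativeMouseAt` for exactly this case: it maps the system-wide mouse position to per-display coordinates, with the display index in the returned vector’s `z` component. The sketch below just logs the result; on unsupported platforms (including the editor) the call returns a zero vector.

```csharp
using UnityEngine;

// Identify which display the mouse is over (Windows Standalone Player only).
public class MultiDisplayMouse : MonoBehaviour
{
    void Update()
    {
        // Relative position of the mouse within whichever display it is over;
        // z holds the display index. Returns (0,0,0) where unsupported.
        Vector3 mouse = Display.RelativeMouseAt(Input.mousePosition);
        int displayIndex = (int)mouse.z;
        Debug.LogFormat("Mouse at ({0}, {1}) on display {2}",
                        mouse.x, mouse.y, displayIndex);
    }
}
```

This doesn’t fix the editor limitation Karl describes, but in a Windows build it lets you route clicks to the correct display’s UI yourself if the default event system misbehaves.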
Yes I think this is an editor only issue. Can you file a bug report?
Okay, I’ll make a project and send it once I get home from work. I really hope we’ll be able to duplicate a canvas across multiple screens someday; that would be epic. For now I created a world-space canvas for the UI and placed it in front of both TargetCam1 and TargetCam2 (which are at 0,0,0), so my 3D UI effectively overlays my 2D UI, giving a common shared UI between the target displays. Now that VR is getting popular, multi-display setups are becoming really common.
Hi there, is using UI with mouse input on multiple displays stable now?
I am currently running a two-display build with the main output on display 1 and a smaller, different-resolution output on display 2 that contains UI. The output looks fine, but I cannot interact with the UI on display 2. In some scenarios it may be that display 2 is receiving mouse clicks and propagating them to the UI, but the coordinates are completely wrong, so I still cannot interact with it.
The coordinate offset is just a hunch; I have only seen a symptom suggesting offset coordinates once or twice, and I cannot reliably reproduce it.
Any insight on the status of multi display mouse input? Anyone else have this working properly?