Multi-Display Canvases Not Working in 5.4.2

Despite this issue reportedly being resolved in 5.4.2, I’m unable to register input across displays. I currently have 3 displays in one scene, with UI elements rendered in camera space for each display. The UI Event System only registers input on one display, and switching to the other two displays in the Game tab does not affect the UI Event System.

For example, the Event System registers a button click on Display 2. When I switch the game tab to Display 3, the Event System still registers all clicks for Display 2.

Any help/ work arounds would be greatly appreciated.

EDIT: This user’s issue describes my problem exactly, if it helps illustrate the bug in question:

EDIT 2: Changing the Canvas to screen space and specifying which display to render on does not resolve the issue. Changing the Canvas sorting layer to the highest value sets that particular canvas as “active”, allowing for interaction.
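As a sketch of that workaround (assuming “sorting layer” here means the Canvas sorting order, and that there is one canvas per display; the class and field names are hypothetical):

```csharp
using UnityEngine;

// Hypothetical workaround sketch: raise one canvas's sorting order so the
// event system treats it as the "active" canvas. Assumes one Canvas per display.
public class CanvasFocusWorkaround : MonoBehaviour
{
    public Canvas[] canvases; // one per display, assigned in the Inspector

    // Call this when switching which display should receive UI input.
    public void FocusCanvas(int index)
    {
        for (int i = 0; i < canvases.Length; i++)
            canvases[i].sortingOrder = (i == index) ? 100 : 0;
    }
}
```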

What OS is this on? Can you file a bug report with an example project and post the number here?

Absolutely. Case 849146. The game was developed on Mac OS X El Capitan 10.11.5 but built and deployed as a Windows standalone player. When testing it on the Windows machine, input clicks were only registered on one screen, and button presses on the other screens were not intercepted, just as in the editor. Slightly different bug, but presumably related.

Thanks. I suspect the issue is with Display.RelativeMouseAt. This is a platform-specific function that is currently only implemented in the Windows player; I don’t believe we have an editor implementation of it yet. This bug report should help get the ball rolling.
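For anyone unfamiliar with it, a minimal sketch of how Display.RelativeMouseAt is meant to be used (on platforms where it is implemented): it maps the system mouse position to display-relative coordinates, with the display index packed into the z component.

```csharp
using UnityEngine;

// Sketch: probe which display the mouse is over using Display.RelativeMouseAt.
// On unsupported platforms the call returns a zero vector, so a z of 0 can mean
// either the main display or "not implemented here".
public class MouseDisplayProbe : MonoBehaviour
{
    void Update()
    {
        Vector3 rel = Display.RelativeMouseAt(Input.mousePosition);
        int displayIndex = (int)rel.z;
        Debug.Log("Mouse at (" + rel.x + ", " + rel.y + ") on display " + displayIndex);
    }
}
```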

Glad to help. In the meantime, is there a workaround? Or will developing natively on the windows machine resolve the problem?

Developing natively on the Windows machine should give you the best results; I know it’s a bit of a pain having to rebuild each time, though. You could also try creating a wrapper around Display.RelativeMouseAt to emulate it in the editor. I’ll have to think about how that would work, though; it’s likely how the final fix will work.

Did a quick test on a Windows machine, placing a UI button on each of three displays and activating the displays via script. The default event system still only registers clicks on one screen. I don’t know if this is expected behavior, but I thought I’d let you know. Otherwise, I’ll write some input-interception code using the relative-coordinate method you mentioned earlier. As far as I can tell, the current multi-display implementation isn’t documented thoroughly, and I’m getting hung up on unspecified implementation requirements. Any additional information the Unity team could provide about how to get the internal event system to register on all three screens would be great. Thank you for your effort to fix this issue!
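For completeness, the display-activation step mentioned above looks roughly like this (a sketch; in a player build, secondary displays must be activated before they render, while Display.displays only ever has one entry in the editor):

```csharp
using UnityEngine;

// Sketch: activate all connected secondary displays at startup.
// Display 0 (the main display) is always active and needs no call.
public class ActivateDisplays : MonoBehaviour
{
    void Start()
    {
        for (int i = 1; i < Display.displays.Length; i++)
            Display.displays[i].Activate();
    }
}
```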

I just knocked up this editor emulation script. It’s only had a few seconds of testing, but it’s worth trying.
It basically finds the active GameView and returns the display ID for it.

    Vector3 RelativeMouseAt()
    {
        // Grab the internal UnityEditor.GameView type via reflection.
        System.Reflection.Assembly assembly = typeof(UnityEditor.EditorWindow).Assembly;
        Type type = assembly.GetType("UnityEditor.GameView");

        // Find the last focused GameView and read its target display index.
        var lastGameViewField = type.GetField("s_LastFocusedGameView", BindingFlags.Static | BindingFlags.NonPublic);
        var gv = lastGameViewField.GetValue(null);

        int displayID = 0;
        if (gv != null)
        {
            var displayField = type.GetField("m_TargetDisplay", BindingFlags.NonPublic | BindingFlags.Instance);
            displayID = (int)displayField.GetValue(gv);
        }

        // Pack the display ID into z, matching Display.RelativeMouseAt's convention.
        var pos = Input.mousePosition;
        pos[2] = displayID;
        return pos;
    }

I have been working on a blog post about the multiple-display system and its finer points, as well as helping to clean it up and get it into better shape.

That’s strange. We re-enabled multiple display support for UI in 5.4.1p2, so any version after that should work. Is RelativeMouseAt working correctly in the Windows builds?

Taking a look at it now. Having some compilation issues (assuming it’s C#). Do I need to specify any additional APIs for this code? Or is this JavaScript? (It doesn’t seem to compile as a JavaScript file either.)

About the Windows issues: I’m currently using 5.4.2f2 on the Windows machine. I have three cameras and three canvases. Each canvas is assigned to its own camera, and each camera is set to render to its own display. The actual display renders properly on each screen, but when I bring the mouse to screen 1 or 3, no input is registered. Input is only received on the main screen.
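For reference, the per-display wiring described above looks roughly like this (a sketch; the class name is hypothetical and the camera and canvas references are assumed to be assigned in the Inspector):

```csharp
using UnityEngine;

// Sketch of the setup described above: each camera renders to its own
// display, and each camera-space canvas is driven by its camera.
public class PerDisplaySetup : MonoBehaviour
{
    public Camera[] cameras;   // one per display, assigned in the Inspector
    public Canvas[] canvases;  // one per display, assigned in the Inspector

    void Start()
    {
        for (int i = 0; i < cameras.Length; i++)
        {
            cameras[i].targetDisplay = i;
            canvases[i].renderMode = RenderMode.ScreenSpaceCamera;
            canvases[i].worldCamera = cameras[i];
        }
    }
}
```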

The code is C#.
Maybe you are missing a using?

using System;
using System.Reflection;
using UnityEngine;

Here is an improved emulation script that does not require the GameView to be active:

    Vector3 RelativeMouseAt()
    {
        // Use whichever editor window the mouse is currently over.
        var mouseOverWindow = EditorWindow.mouseOverWindow;
        System.Reflection.Assembly assembly = typeof(UnityEditor.EditorWindow).Assembly;
        Type type = assembly.GetType("UnityEditor.GameView");

        int displayID = 0;
        // If the mouse is over a GameView, read its target display index.
        if (type.IsInstanceOfType(mouseOverWindow))
        {
            var displayField = type.GetField("m_TargetDisplay", BindingFlags.NonPublic | BindingFlags.Instance);
            displayID = (int)displayField.GetValue(mouseOverWindow);
        }

        // Pack the display ID into z, matching Display.RelativeMouseAt's convention.
        var pos = Input.mousePosition;
        pos[2] = displayID;
        return pos;
    }
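For anyone wiring this up, one possible way to switch between the emulation and the real call, so the same code path works in the editor and in a player build (a sketch; it assumes the emulation method above is made static and accessible from this class):

```csharp
using UnityEngine;

public static class MultiDisplayInput
{
    // Returns the mouse position with the display index in z: the real
    // Display.RelativeMouseAt in a player build, or the reflection-based
    // editor emulation (hypothetically named EditorRelativeMouseAt here).
    public static Vector3 MousePositionWithDisplay()
    {
#if UNITY_EDITOR
        return EditorRelativeMouseAt(); // the emulation method from the post above
#else
        return Display.RelativeMouseAt(Input.mousePosition);
#endif
    }
}
```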

Yeah, I was missing the Reflection using. I’ll give this a shot when I’m back at the Windows machine and report back in about thirty minutes. Thank you, Karl, you’ve been a great help.

Karl,

Your code works great in-editor.

For further detail on the event input bug in builds: clicks made on Display One are registered against the UI of Display Two. Very bizarre. Clicking on UI elements on Display Two or Three does not register any input whatsoever. The build is in 5.4.2 on Windows 7.

Glad there is already a thread about this.

I found out that multiple canvases on multiple displays work, but only if the canvas is set to Screen Space - Overlay. As soon as I set the canvas on the second (or third, etc.) display to World Space, input on the non-primary display is not recognized at all.
This only happens in a build; in the editor it also works in World Space mode.

Karl, is it intended that input only works in Screen Space - Overlay?

Thanks for your help in this matter.

No, that sounds like a bug. Could you file a bug report, please?

Case number: 852409

Hi Karl,

The bug was immediately confirmed that day. Now it says “Fixed in future release”… unfortunately, it’s not fixed in the current 5.5.0f3.
Is there any more detailed info on this? Is it easily fixable, or do we have to wait for another major release?

It should be possible for the fix to go into a patch. I will check tomorrow.

Did you check? 🙂