Multi-Display rendering offset when displays have different resolutions

Hi Everyone,

I am creating a game (it’s actually more of an application/useful piece of software than a game) which needs to run across multiple monitors, where each monitor may have a different resolution.

In the example below, you can see the primary monitor (far left) rendering a full screen camera view, just as it should. There are two other cameras set to render to monitors 2 and 3. These are activated with Display.displays[1].Activate(1600, 900, 50); (so as to set their resolutions to 1600x900).
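For reference, here is a minimal sketch of how I activate the extra displays. The script name and the loop are mine; the Activate(width, height, refreshRate) call and Camera.targetDisplay are the standard Unity APIs, and 1600x900 @ 50 Hz are just the values from my setup:

```csharp
using UnityEngine;

public class MultiDisplaySetup : MonoBehaviour
{
    // Each secondary camera has its Camera.targetDisplay set to 1 or 2
    // in the inspector, matching the display index activated below.
    void Start()
    {
        // Display.displays[0] is the primary display and is always active.
        for (int i = 1; i < Display.displays.Length; i++)
        {
            // Activate(width, height, refreshRate): force every secondary
            // display to 1600x900 at 50 Hz.
            Display.displays[i].Activate(1600, 900, 50);
        }
    }
}
```

Note that Display.displays only reports more than one entry in a standalone build; in the editor it always has length 1, which is presumably why the problem never shows up there.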

The issue is that Unity appears to be offsetting the camera rendering so that it does not originate from the top left of the screen, but rather some way down the window. The grey area shows where the image is missing. The image has seemingly been shifted downwards rather than cropped.

When running in the editor, the cameras render each window perfectly with no strange offsetting/cropping.

What do I need to do to get the standalone build to render the correct, un-offset and uncropped image in each of the windows, please?

If I make the resolution of all monitors the same, it renders exactly as it should do:

I just found this issue in Unity 2019.1.2f1. I don’t understand why it’s not fixed yet. If all my monitors are set to the same resolution, there is no problem. But if any monitor is set to a different resolution, then it incorrectly renders the image for that monitor using the resolution of the main display.

I’m having exactly the same problem. I’m using the “-multidisplay” command line option (which sure feels like a hack, and isn’t very portable).

You can find that option on the bottom of this page:

As @stonesand confirms, this is a bug known to the Unity developers; there’s not much we can do about it on our side.

I’ve just downloaded the beta version of Unity and this seems to be working properly now. You should give it a go.

I’m on 5.4.1f1 and I still have this problem, if I use any image effects on either camera.