I’ve been messing around with a second Display. My initial problem was that Unity forces both of my displays into fullscreen. Since I’m currently happy to support only Windows, I managed to get what I wanted using user32 API calls. Painful, because I can only test in release builds. After a lot of work I finally achieved what I was going for, but I’ve noticed the resolution of the second display is off.
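For reference, the user32 approach looks roughly like this. This is a sketch, not my exact code: the class name, flags, and hard-coded coordinates are placeholders, and GetActiveWindow only returns the right handle when the player window currently has focus.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

public class WindowPositioner : MonoBehaviour
{
    // user32 calls -- Windows only, and only meaningful in a built player,
    // which is why testing requires a rebuild every time.
    [DllImport("user32.dll")]
    private static extern IntPtr GetActiveWindow();

    [DllImport("user32.dll")]
    private static extern bool SetWindowPos(IntPtr hWnd, IntPtr hWndInsertAfter,
        int x, int y, int cx, int cy, uint uFlags);

    private const uint SWP_SHOWWINDOW = 0x0040;
    private static readonly IntPtr HWND_TOP = IntPtr.Zero;

    void Start()
    {
#if !UNITY_EDITOR && UNITY_STANDALONE_WIN
        // Coordinates are in virtual-desktop space, so negative x/y values
        // reach displays positioned left of / above the primary monitor.
        IntPtr hwnd = GetActiveWindow();
        SetWindowPos(hwnd, HWND_TOP, 0, 0, 6400, 2560, SWP_SHOWWINDOW);
#endif
    }
}
```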
Now I am trying to render into a very large window (6400x2560), and Display.SetRenderingResolution does not work: the resolution gets clamped to 1920x1080 (the resolution of the monitor the window starts on). Setting a resolution of 100x100 worked fine. What hurts the most is that this is completely undocumented; I’ve spent countless hours trying to solve this problem, checking shaders, etc.
The thing is, it worked in the editor, so it’s not a technical/hardware limitation. It’s just that the editor windows don’t have this clamping limitation. I’ve spent a few days trying to get the whole thing to work in a release build.
Unfortunately, the Display class was never meant to be used the way you’re trying to use it. Contrary to what you might think, it is not a generic way to spawn multiple windows with whatever rendering, size, and style parameters you want. It was mainly designed for fullscreen use, one window per display. Think of kiosks or arcade machines. As much as I hate to say it, you are very likely to run into issues if you go off the “blessed path”.
As far as I can tell, Display.SetRenderingResolution is not implemented on Windows. Are you sure it works with 100x100? Secondly, are you still spawning multiple windows, or are you trying to stretch the main window across several displays?
I was 100% sure it worked with 100x100 … now that percentage has gone down to 99%. I could also have been using Display.SetParams(w, h, x, y). SetParams is documented as Windows only, but there is no mention of SetRenderingResolution not being implemented. Hmmm.
Testing is very tedious as I always need to rebuild and run. I’m spawning a second display and stretching that over all 3 screens. It all works quite nicely besides setting the resolution.
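For context, the calls involved look roughly like this (a sketch; the 6400x2560 values and the zero offsets are just my setup, and as discussed above, SetRenderingResolution appears to clamp to the primary monitor’s resolution in builds, or may not be implemented on Windows at all):

```csharp
using UnityEngine;

public class SecondDisplaySetup : MonoBehaviour
{
    void Start()
    {
        if (Display.displays.Length > 1)
        {
            Display second = Display.displays[1];
            second.Activate();
            // Documented as Windows only: sets window size and desktop position.
            second.SetParams(6400, 2560, 0, 0);
            // In built players this seems to get clamped to the native
            // resolution of the monitor the window starts on.
            second.SetRenderingResolution(6400, 2560);
        }
    }
}
```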
Looks like I will have to settle for a single-display wallpaper, probably the most likely use case anyway. I just really loved the 3-display version that works perfectly in the editor.
edit: I’ve strongly regretted going off the sacred path, but I really wanted to make this work…
You can stretch the main window over 3 displays by just doing “Screen.SetResolution(totalWidth, totalHeight, FullScreenMode.Windowed);” and then starting the player with “-popupwindow” parameter. No need to spawn secondary windows.
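A minimal sketch of that approach. The width and height are hard-coded here for illustration; you would compute them from your actual monitor layout:

```csharp
using UnityEngine;

public class StretchMainWindow : MonoBehaviour
{
    void Start()
    {
        // Combined size of all three displays -- placeholder values,
        // e.g. 3 x 1920x1080 side by side.
        int totalWidth = 5760;
        int totalHeight = 1080;
        Screen.SetResolution(totalWidth, totalHeight, FullScreenMode.Windowed);
    }
}
```

Then launch the built player with the `-popupwindow` command-line argument so the window is created without a frame and can span the displays.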
This looks very very promising. The main display does not suffer from the cropped resolution. I can use the second display as the new main display. I will have to adjust all my code that relies on Screen.width/height but that should not be too big of an issue (I hope).
Unfortunately I am running into multiple new issues. The most critical one is that the UI is no longer interactable. The Canvas renders but does not respond to input in any configuration. I’ve tried setting targetDisplay (I know it only applies to overlay canvases), removed and re-added the cameras, and switched around rendering modes.
My own UI actually seemed to work, but only with the second display in fullscreen mode (which I know is the intended use). In windowed mode, the mouse position behaves as if the cursor were on the large wallpaper window.
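One thing that might be worth investigating (I haven’t confirmed it fixes the windowed case) is Display.RelativeMouseAt, which is documented as remapping a mouse position into display-relative coordinates on Windows. A sketch for logging what it reports:

```csharp
using UnityEngine;

public class MouseMappingDebug : MonoBehaviour
{
    void Update()
    {
        // Windows only: returns display-relative coordinates, with the
        // target display index in the z component. On unsupported
        // platforms it returns Vector3.zero.
        Vector3 relative = Display.RelativeMouseAt(Input.mousePosition);
        Debug.Log("display " + (int)relative.z +
                  ": (" + relative.x + ", " + relative.y + ")");
    }
}
```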
Minor issues are that the game mostly starts with a huge window, and the second display can get squashed. Might have to give up on the dream, or at least think of something new…