I have encountered a bug where, if you build with URP instead of the Built-in Render Pipeline, a multi-monitor setup shows the secondary monitor's image stretched.
The build is for Windows Standalone, x86 64-bit (Mono).
Specifying the resolution has no effect. It behaves exactly as if I had simply called:
Display.Activate();
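For reference, this is roughly how I activate the secondary displays. This is a minimal sketch: the explicit-resolution overload of `Activate` is passed the display's native resolution directly, in case the parameterless call picks up the wrong mode (note that on recent Unity versions the refresh-rate parameter is a `RefreshRate` struct rather than an `int`):

```csharp
using UnityEngine;

public class MultiDisplayActivator : MonoBehaviour
{
    void Start()
    {
        // Activate every connected display beyond the primary one,
        // passing each display's native (system) resolution explicitly
        // instead of relying on the parameterless Activate().
        for (int i = 1; i < Display.displays.Length; i++)
        {
            Display d = Display.displays[i];
            d.Activate(d.systemWidth, d.systemHeight, 0); // 0 = default refresh rate
        }
    }
}
```

Even with the explicit overload, the result is the same stretched image on the second monitor.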
When building a standalone application with the Built-in Render Pipeline, everything works out of the box: the displays are already set in Windows to their default (and only) working resolution, and the app fills each of them in fullscreen mode.
My guess is that some render-to-texture is going on under the hood and the aspect ratio is not picked up correctly.
How would I go about troubleshooting this?
On the Camera assigned to Display 2, the aspect value equals that of the MainCamera, yet changing it has no effect, even when I set it to the correct aspect manually.
Checking:
Camera.pixelRect
returns a correct rect (in my case: 0, 0, 1600, 800). (Camera.rect itself is normalized to 0–1, so the pixel values come from pixelRect.)
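To narrow down where the mismatch happens, it may help to compare what the camera thinks it is rendering into with what the display itself reports. This is a hypothetical diagnostic sketch (the `secondaryCamera` field is assumed to be assigned in the Inspector to the camera targeting Display 2); a mismatch between `renderingWidth/Height` and `systemWidth/Height` would explain a stretched image:

```csharp
using UnityEngine;

public class SecondaryDisplayDebug : MonoBehaviour
{
    // Assign the camera that targets Display 2 in the Inspector.
    public Camera secondaryCamera;

    void Start()
    {
        int index = secondaryCamera.targetDisplay;
        Display d = Display.displays[index];

        // Log the display's native resolution vs. its current rendering
        // resolution, and the camera's pixel rect and aspect.
        Debug.Log($"Display {index}: system {d.systemWidth}x{d.systemHeight}, " +
                  $"rendering {d.renderingWidth}x{d.renderingHeight}");
        Debug.Log($"Camera pixelRect: {secondaryCamera.pixelRect}, " +
                  $"aspect: {secondaryCamera.aspect}");

        // If the rendering resolution differs from the native one,
        // force it back to the display's system resolution.
        if (d.renderingWidth != d.systemWidth || d.renderingHeight != d.systemHeight)
        {
            d.SetRenderingResolution(d.systemWidth, d.systemHeight);
        }
    }
}
```

If `SetRenderingResolution` fixes the stretching, that would point at URP's internal render target being created at the wrong size rather than at the camera settings.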
The only thing I can think of is an issue with the transform matrices in the render-to-texture step under the hood.
I understand URP does this automatically anyway, in order to manage the resolution dynamically and keep the refresh rate from dropping.
Hi @Ascanio1980 , I am about to try multi-display with URP. Multi-display in general has been unstable over the last several years, with display-scaling issues among other problems. I am going to try the latest version in the hope of getting onto the next LTS. Could you tell me where things ended up for you? Are you still using the same workaround today?