We have some code that reduces the game's resolution to 75% if the screen is over 2K in either dimension. So on a phone with a 2560x1440 display, it reduces the game's resolution to 1920x1080. We did this for performance reasons, to make sure the game runs fast enough.
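To illustrate, here is a hypothetical sketch of that kind of check (assumed class and method names, not the actual project code). Note that it keys off Screen.width/Screen.height, which report the current render resolution rather than the panel's native size:

```csharp
// Hypothetical sketch: reduce to 75% when either dimension exceeds ~2K.
using UnityEngine;

public class ResolutionReducer : MonoBehaviour
{
    void Awake()
    {
        // Screen.width/height report the current render resolution,
        // not necessarily the display's native size.
        int w = Screen.width;
        int h = Screen.height;

        if (w > 2048 || h > 2048)
        {
            // Drop to 75% in both dimensions, e.g. 2560x1440 -> 1920x1080.
            Screen.SetResolution(
                Mathf.RoundToInt(w * 0.75f),
                Mathf.RoundToInt(h * 0.75f),
                true);
        }
    }
}
```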
The PROBLEM is that once a device does this the first time, it somehow remembers that the game's resolution is 1920x1080 and reduces it again on the next launch. It even does this on devices whose native resolution is already below 2K (we have a tablet that is 1900x1200), and their resolution gets reduced too.
The CRAZIEST part of all this is that if we push a build to the device with Wi-Fi turned off, the resolution goes back to native and the reduction works as it should. If you turn the device's Wi-Fi back on and push a build, it goes back to starting at the wrong resolution.
At a guess, I would say you should simply set the resolution based on the device's available resolutions, and do this each time the app loads. There is no real need for proportional resolutions.
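A minimal sketch of that approach, assuming Display.main.systemWidth/systemHeight (which report the panel's native size regardless of any earlier SetResolution call) and a startup hook; class and method names are placeholders:

```csharp
// Sketch: pick the resolution from the display's native size on every launch,
// instead of trusting whatever Screen.width happens to report.
using UnityEngine;

public static class StartupResolution
{
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void Apply()
    {
        int nativeW = Display.main.systemWidth;   // native panel width
        int nativeH = Display.main.systemHeight;  // native panel height

        if (nativeW > 2048 || nativeH > 2048)
        {
            // Above ~2K: render at 75% of native for performance.
            Screen.SetResolution(nativeW * 3 / 4, nativeH * 3 / 4, true);
        }
        else
        {
            // At or below 2K: render at the native resolution.
            Screen.SetResolution(nativeW, nativeH, true);
        }
    }
}
```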
- if I build & run from Unity to the Android device with Wi-Fi on → the game starts at a lower 1440x900 resolution (WXGA?),
- if I turn Wi-Fi off on the device and build & run the same project, the installed game starts at the native resolution: 1920x1200!
- if I simply change the package name in Unity (even by one letter), the Wi-Fi state doesn't matter: build & run on the device and the game also starts at the native resolution, 1920x1200; but the game is already published, so I can't change the package name.
What's going on here? Just curious: how is the Wi-Fi state connected to the device resolution?
Appreciate any tips on what I could check and debug in order to fix this strange behaviour!
P.S.: using Unity 2018.4.16 and the latest versions of Unity and third-party assets. I also have a script for HD/UHD atlas swapping, but that script only fires when the first scene loads, and I can see the resolution is already wrong while the splash screen is showing…
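One thing worth checking: log what Unity reports as early as possible, before the atlas-swapping script or any scene code runs, and compare the output in adb logcat with Wi-Fi on vs. off. A small probe along these lines (hypothetical names) could help:

```csharp
// Diagnostic probe: logs the current vs. native resolution at startup.
using UnityEngine;

public static class ResolutionProbe
{
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void Log()
    {
        Debug.LogFormat(
            "Screen: {0}x{1}, currentResolution: {2}, native: {3}x{4}",
            Screen.width, Screen.height,
            Screen.currentResolution,
            Display.main.systemWidth, Display.main.systemHeight);
    }
}
```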