Not sure if this is the correct forum to post in, since the issue isn't limited to WebGL but definitely affects it. Mods, feel free to move this if it's not the right place.
Any time you run a Unity game/app (we've seen and tested this in both standalone OS X builds and in WebGL) on a Mac Retina display, or on a Windows machine with display scaling (for example, a 4K monitor at 200% scaling), you'll see an instant drop in framerate. This is easily testable with a standalone build running in windowed mode, or with a WebGL build; a crude FPS readout like the one below makes the difference obvious.
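Here's the minimal overlay we use for eyeballing it (just a smoothed deltaTime counter; the 0.1 smoothing factor is arbitrary):

```csharp
using UnityEngine;

// Minimal FPS overlay for watching the framerate change while dragging
// a windowed build between displays.
public class FpsOverlay : MonoBehaviour
{
    float smoothedDelta = 1f / 60f;

    void Update()
    {
        // Exponential smoothing so the readout doesn't jitter.
        smoothedDelta += (Time.unscaledDeltaTime - smoothedDelta) * 0.1f;
    }

    void OnGUI()
    {
        GUI.Label(new Rect(10, 10, 200, 20),
                  Mathf.RoundToInt(1f / smoothedDelta) + " FPS");
    }
}
```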
After a lot of investigation, especially with WebGL builds, it appears the resolution doesn't actually change on a Retina display vs. a non-Retina one. You can drag the window between a MacBook Pro's built-in display and a 1080p monitor connected via HDMI and watch the FPS drop the instant it lands on the Retina screen, yet there's no real visual difference or improvement on the Retina side. Unity reports that the Display.systemWidth/Height and Display.renderingWidth/Height values don't change when you drag the window from one display to the other. Same with the WebGL canvas: calling into JavaScript to print the current viewport resolution shows no change. The drop in frames is always there, though.
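For reference, this is roughly how we dumped those values (a sketch, not verbatim from our project: Application.ExternalEval is the old-style WebGL interop, and the 'canvas' element id assumes the default WebGL template):

```csharp
using UnityEngine;

// Dumps the display values Unity reports. In our tests none of these
// change when the window moves between a Retina and a non-Retina screen.
public class DisplayProbe : MonoBehaviour
{
    void Start()
    {
        Display d = Display.main;
        Debug.Log("system: " + d.systemWidth + "x" + d.systemHeight +
                  "  rendering: " + d.renderingWidth + "x" + d.renderingHeight +
                  "  Screen: " + Screen.width + "x" + Screen.height +
                  "  dpi: " + Screen.dpi);

#if UNITY_WEBGL && !UNITY_EDITOR
        // Ask the browser for its view of the canvas; a .jslib plugin
        // would be the cleaner route, but this works for a quick check.
        Application.ExternalEval(
            "console.log('canvas', document.getElementById('canvas').width, " +
            "document.getElementById('canvas').height, " +
            "'devicePixelRatio', window.devicePixelRatio);");
#endif
    }
}
```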
It's worth noting that if you use a third-party application such as SwitchResX (http://www.madrau.com/) to set a specific resolution on OS X (for instance, 1680x1050, non-scaled / non-HiDPI), you no longer see the drop in frames. But if you set it to the Retina version of 1680x1050 (so HiDPI scaling is enabled), the drop comes back.
So in conclusion: any time your system is using display scaling, Unity performs slightly worse, even though the game resolution appears to be identical and it looks visually no different.
I understand how Retina displays / Windows display scaling work, and tried to combat this by detecting HiDPI displays and manually setting a lower resolution (since everything is typically at 2x scale); the attempt is sketched below. However, doing so just makes everything look significantly worse: we're down-res'ing, but Unity apparently wasn't up-res'ing in the first place, despite what the FPS drop suggests.
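This is roughly what we tried (the 2x assumption and the DPI threshold are our own guesses, not anything Unity documents):

```csharp
using UnityEngine;

// Crude HiDPI workaround attempt: assume a 2x backing scale and ask Unity
// for half the reported resolution. In practice this just looks blurrier,
// which suggests Unity wasn't rendering at 2x to begin with.
public class HiDpiWorkaround : MonoBehaviour
{
    void Start()
    {
        // Screen.dpi >= 144 is an arbitrary "this display is scaled"
        // heuristic; 96 dpi is the traditional 1x desktop baseline.
        if (Screen.dpi >= 144f)
        {
            Screen.SetResolution(Screen.width / 2, Screen.height / 2,
                                 Screen.fullScreen);
        }
    }
}
```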
Anyone else seeing this / have any kind of workaround?