I noticed that in the standalone player, if I select a resolution lower than the native one, the game doesn’t really run at that resolution: the actual resolution is still the native one. I suppose that, under the hood, the scene is rendered to a smaller offscreen texture and that texture is then scaled up to the whole screen.
For example, I have a 3840x2160 screen. If I select 1920x1080, the application still runs at 3840x2160, but the game scene is rendered at 1920x1080 and then upscaled to 3840x2160.
Here is the part I’m interested in: the upscale from the offscreen framebuffer (1920x1080) to the default framebuffer (3840x2160) is done with a bilinear filter. I’d like to use a nearest/point filter instead. Is that possible?
My platform is Windows 10; I don’t know whether the whole thing works the same way on other platforms.
Thanks for any help.
The solution was very simple. I thought Unity only allowed running the game in “Borderless Windowed Mode”, which is a fake fullscreen. In that mode the game runs at the highest possible resolution (or maybe the resolution of the OS) and, when you select a different resolution in the resolution dialog, the scene is rendered to an off-screen framebuffer and finally upscaled to the maximum resolution of the screen. This upscaling is really bad for pixel art, because it ruins the pixel perfection by applying an unwanted bilinear filter.
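As a side note, if for some reason exclusive fullscreen isn’t an option, one possible workaround (a sketch I haven’t tied to the engine’s own upscale path, and the class name, resolution, and drawing approach are my own assumptions) is to render the camera into a low-resolution RenderTexture with point filtering and draw that texture over the whole screen yourself:

```csharp
using UnityEngine;

// Hypothetical workaround: render the scene into a low-res RenderTexture
// and upscale it with point (nearest) filtering, bypassing the engine's
// bilinear upscale. Attach to the main camera.
public class PointUpscale : MonoBehaviour
{
    RenderTexture lowRes;

    void Start()
    {
        // 1920x1080 is just the example resolution from the question.
        lowRes = new RenderTexture(1920, 1080, 24);
        lowRes.filterMode = FilterMode.Point; // nearest-neighbour upscaling
        GetComponent<Camera>().targetTexture = lowRes;
    }

    void OnGUI()
    {
        // Stretch the low-res texture over the whole screen; point
        // filtering keeps the pixel edges sharp.
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), lowRes);
    }
}
```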
But Unity also allows REAL fullscreen. In “Edit → Project Settings → Player → Resolution and Presentation → Resolution” there is a “Fullscreen Mode” setting. The “Fullscreen Window” option is the fake fullscreen; the “Exclusive Fullscreen” option is the real fullscreen. With the latter enabled there is no unwanted bilinear filter, even when selecting a lower resolution.
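For completeness, the same thing can be requested from script with `Screen.SetResolution` and the `FullScreenMode.ExclusiveFullScreen` enum value; a minimal sketch (the class name is made up, and on non-Windows platforms exclusive mode may, as far as I know, fall back to a fullscreen window):

```csharp
using UnityEngine;

// Sketch: request real (exclusive) fullscreen at a given resolution at
// runtime, equivalent to the Player Settings option described above.
public class ForceExclusiveFullscreen : MonoBehaviour
{
    void Start()
    {
        // 1920x1080 is just the example resolution from the question.
        Screen.SetResolution(1920, 1080, FullScreenMode.ExclusiveFullScreen);
    }
}
```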