Hello,
I’ve been searching for hours and could really use some help. My game is built at 1280x720. To preserve the pixel art look when upscaling to fullscreen, I have set the Default Screen Width and Height in Player Settings to 1280x720.
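For what it’s worth, I haven’t written any resolution-handling code myself; everything comes from Player Settings. If forcing it from script would help, I assume it would look roughly like this (a sketch, untested):

```csharp
using UnityEngine;

// A sketch (untested): forcing the 1280x720 backbuffer from script instead
// of relying only on the Player Settings defaults.
public class ForceResolution : MonoBehaviour
{
    void Awake()
    {
        // FullScreenWindow keeps the desktop at its native resolution and
        // scales the 1280x720 image up to fill the screen.
        Screen.SetResolution(1280, 720, FullScreenMode.FullScreenWindow);
    }
}
```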

When I put out a build, for both Mac and PC, I get a nice fullscreen upscale with a slight blur/interpolation applied that makes the game look the way I want. I have tried it at different monitor resolutions (1920x1080 and 2560x1440), and the upscale interpolates nicely at all sizes.
However, a few beta testers have sent back screenshots where the upscale applies no blur/interpolation at all, and the result looks very bad for my game: pixels swim and flicker, just as they would when nothing is smoothing out the upscale.
What is troubling is that this is the same build, and it has happened across multiple testers’ resolutions as well. They could, of course, set their monitors to 1280x720 manually, but we want to take advantage of the upscale blur.
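To narrow it down, I’m planning to send testers a debug build that logs the actual screen state at startup, roughly like this (a sketch; names/values are just what I’d expect to see):

```csharp
using UnityEngine;

// Sketch of the startup logging I plan to add, so testers can report
// what their machines are actually doing.
public class ScreenDebugLog : MonoBehaviour
{
    void Start()
    {
        // Backbuffer size the game renders at vs. the display's resolution.
        Debug.Log($"Backbuffer: {Screen.width}x{Screen.height}");
        Debug.Log($"Display: {Screen.currentResolution.width}x{Screen.currentResolution.height}");
        Debug.Log($"Fullscreen mode: {Screen.fullScreenMode}");
    }
}
```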
So, I really hope someone can help me understand what might be going on under the hood:
• Why would the same build upscale with blur on some machines but not on others?
• Does Unity default to a certain filter/interpolation when upscaling to fullscreen?
• If so, where would I access those filter/interpolation settings? Would they be stored in the Windows PlayerPrefs (which would explain why some computers are fine while others are not)? I’ve sketched a possible reset after this list.
• Or is something else going on completely? I’m at a loss right now.
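On the PlayerPrefs question in the third bullet: my understanding (an assumption, not confirmed) is that on Windows, Unity caches the last-used screen settings in the registry under HKCU\Software\&lt;Company&gt;\&lt;Product&gt;, alongside PlayerPrefs. If a stale cached resolution could be the culprit, I was going to ask a tester to wipe it, roughly like this:

```csharp
using UnityEngine;

// Assumption: PlayerPrefs.DeleteAll() also clears the cached "Screenmanager"
// resolution entries stored alongside PlayerPrefs, forcing the player back
// to the Player Settings defaults on the next launch.
public class ResetCachedScreenSettings : MonoBehaviour
{
    void Awake()
    {
        PlayerPrefs.DeleteAll(); // wipes ALL saved prefs, not just resolution
        PlayerPrefs.Save();
    }
}
```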
I very much appreciate any help anyone can give on this.