Help with Fullscreen Upscale not applying Blur / Filter

Hello,

I’ve been looking for hours and just need some help. I have built my game at 1280x720. To closely maintain the pixel art look on full screen upscale, I have set the “player settings” default screen width and height to 1280x720.


When I put out a build, both for Mac and PC, I get a nice fullscreen upscale with a slight blur / interpolation applied that makes the game look the way I want. I have tried it on different monitor resolutions (1920x1080 and 2560x1440) and the upscale interpolates nicely at all sizes.

But a few beta testers have returned screenshots where the upscale is not applying any blur / interpolation, and the results are very bad for my game. Pixels swim and flicker, as they would when no interpolation or filter is smoothing out the upscale.

What is troubling is that this is the same build, and it has happened at multiple testers' resolutions as well. Now, they COULD set their monitor resolution to 1280x720 manually, but we want to utilize the upscale blur.

So, I really hope someone can help me understand what might be going on under the hood:

• Why would the same build upscale with blur and others not?

• Does Unity default to a certain filter / interpolation when upscaling to full screen?

• If so, where would I access the filter / interpolation settings? Would these be set in the Windows PlayerPrefs (as that would explain why some computers are fine while others are not)?

• Or is something else going on completely? I’m at a loss right now.

Very much appreciate any help that anyone can give on this.

SOLVED! Let me tell you a story about the Windows Registry Editor / PlayerPrefs.

So, when you run a Unity program for the first time, the program will store PlayerPrefs in the Windows Registry (under the HKCU\Software\[company name]\[product name] key). This can be accessed by typing "Registry Editor" in the Windows search bar.

Now, say you make a new build of your game, and new resolution settings have been added/changed. When you run the game, Unity will NOT write over the PlayerPrefs with the new resolution settings. It still uses the old settings that it wrote the FIRST time you ran the program. This means that resolution changes will not be seen or applied. In my case, this is what was happening. New changes in my builds were not being seen because the PlayerPrefs were using old settings (ones that weren't set to upscale and interpolate using Unity's Resolution and Presentation settings).

SOLUTION: Uninstall the game. Go into the registry key holding the PlayerPrefs for the game, and delete everything in there that can be deleted (only the (Default) value can't be, as every registry key has one). Re-install the game and run it. This way the game writes the PlayerPrefs as though it were running on the computer for the first time. It was important to uninstall, as deleting the PlayerPrefs alone didn't do the trick. It only worked when we uninstalled AND deleted the PlayerPrefs.
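(For anyone who would rather script this during testing than dig through the Registry Editor, here is a hypothetical sketch using Unity's PlayerPrefs API. The class name is my own; PlayerPrefs.DeleteAll is a real Unity call that removes every stored key for the product, which is effectively the same as deleting the values under the registry key by hand:)

```csharp
using UnityEngine;

// Hypothetical dev-only helper: wipes all stored PlayerPrefs for this game,
// so the next run writes them fresh (like a true first run).
public class ResetPrefs : MonoBehaviour
{
    void Awake()
    {
        PlayerPrefs.DeleteAll();  // removes every key under the game's registry entry
        PlayerPrefs.Save();       // flush the deletion to disk immediately
    }
}
```

Obviously you would only attach this to something in a debug scene, since it nukes saved settings too.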

This has been a hectic search, but I'm so relieved to understand what was going on. I hope this helps anyone else who might run into this or similar issues.

Take care everyone!


SOLVED CONTINUED!

So, the issue seemed to persist after we released Rogue Invader. We saw constant upscale issues in various streams and YouTube videos. Aggravating, to say the least.

I reached out to a streamer and asked to look over their setup and try to troubleshoot the issue. We tweaked graphics card settings (Nvidia) and properties on the .exe itself. No change. But I did see a discrepancy in the PlayerPrefs in the Registry Editor (HKCU\Software\[company name]\[product name] key). Here is what I saw:

So, for some reason, the Use Native value was set to ON (1), even though we have it turned OFF (0) in the Player Settings. Now that I knew what I was looking for, I checked some other forums, and they mentioned that this looks like a bug of some sort.
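(If you want to check this on a tester's machine without walking them through the Registry Editor, a hypothetical one-liner for a Windows command prompt follows. "MyCompany" and "MyProduct" are placeholders for whatever you set in Player Settings, and note that on recent Unity versions the value names carry a numeric hash suffix, e.g. "Screenmanager Resolution Use Native_h...":)

```
:: Dump all of the game's PlayerPrefs values from the registry.
:: Replace MyCompany and MyProduct with your actual Player Settings names.
reg query "HKCU\Software\MyCompany\MyProduct" /s
```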

Thankfully, there is an EASY fix. On start, you can set the screen resolution yourself with one line of code:

```csharp
using UnityEngine;

public class SetScreenResolution : MonoBehaviour
{
    void Start()
    {
        // Force 1280x720 fullscreen; fullscreen is what applies the upscale blur
        Screen.SetResolution(1280, 720, true);
    }
}
```

This should force the Use Native player pref to 0 on start, and then set the width and height to the desired size ("true" is for fullscreen, which we wanted, as fullscreen is what applies the upscale interpolation blur).

Now, you can ADD "PlayerPrefs.SetInt" lines to set the three problem values for extra safety, but realize that Unity normally only writes PlayerPrefs to disk when you QUIT the program, not on open (you can call PlayerPrefs.Save() to flush them immediately). Use the following code:

```csharp
using UnityEngine;

public class SetScreenResolution : MonoBehaviour
{
    void Start()
    {
        // Overwrite the three Screenmanager values that can get stuck in the registry
        PlayerPrefs.SetInt("Screenmanager Resolution Use Native", 0);
        PlayerPrefs.SetInt("Screenmanager Resolution Width", 1280);
        PlayerPrefs.SetInt("Screenmanager Resolution Height", 720);
        PlayerPrefs.Save(); // flush to disk now rather than waiting for quit

        Screen.SetResolution(1280, 720, true);
    }
}
```

This should force your program to open and set the prefs to the desired values, since Unity's "don't use native resolution" Player Setting is not 100% reliable. It's been months hunting this down, and I'm glad to share the findings.

Hope this helps others.

Looks like good information, but whilst I can see you're writing a 2D game, this doesn't seem like a 2D-specific thing and would seem best posted in General Graphics.

I can always move your post there if you wish.

By all means, please. Put it wherever you think would be most helpful. It was just that my game is a 2D-looking game, but yeah, it is definitely a general issue.

Thanks!


There'll still be a redirect link from here, btw. 🙂