The Bloom effect requires separate adjustments for every supported screen resolution!

The scene looks the way the developer intended only at the screen resolution that was used while tuning the Bloom settings.

I made some screenshots to show you how the Bloom effect behaves at different resolutions with the same Bloom settings. The higher the resolution, the smaller the glow.

You may see the difference more clearly if you download all the screenshots and flip through them one after another in an image viewer on your PC.
[Screenshot: 01 BloomSettings.png]



I know I can try to adjust the bloom settings for each supported resolution and interpolate between them.

But every game developer using bloom would be happy if the URP developers implemented some kind of formula that takes bloom settings tuned at a single reference resolution and recalculates them for any other resolution.
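To make the request concrete, here is a minimal Python sketch of the kind of rule such a formula could follow. This is purely illustrative: `rescale_scatter` is a hypothetical helper, not a URP API, and the linear assumption is exactly what this thread ends up questioning.

```python
def rescale_scatter(scatter_ref, ref_height, target_height):
    """Hypothetical rule: scale a scatter value tuned at ref_height so the
    bloom covers a similar fraction of the screen at target_height.

    Assumes the perceived blur radius grows roughly linearly with scatter
    and shrinks linearly with resolution. Clamped to 1.0 because URP's
    Scatter setting is documented as a 0..1 range."""
    return min(1.0, scatter_ref * (target_height / ref_height))
```

Later replies suggest the real relationship is not this simple, which is why tuning by hand keeps failing.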


I usually don’t like to dig out old threads from the grave, but it’s been almost a year and this problem still persists.

So, did anyone ever find a reliable workaround for this? I've been trying to replicate a reasonable scale, but even when combining intensity and scattering, the effect looks inherently different between, say, HD, 4K and 8K, and you can only get a so-so approximation.

This is a logical technical issue.
The problem is that at low resolutions the bloom buffer is coarser, which makes the bloom more blurred.

I would understand if this affected only lower resolutions, but the effect is continuous. You can have a perfectly good-looking subtle bloom around objects at HD resolution which then diminishes drastically at 4K and 8K. If you're making a game for a limited range of resolutions, that's probably fine, but otherwise it can become a problem, especially if you're making a stylized game that heavily relies on glow effects (lasers, lightsabers, etc.).

Use scripting to change bloom intensity for different resolutions

That is what I have been trying to do, without much success.
If you have a very bright emission, you can scale the intensity with the resolution, and that works fine.
But if an object should only emit a faint glow, raising the intensity only makes it brighter, and it eventually ends up as a very harsh effect while the former halo around it disappears entirely. I tried to compensate for this with scattering, but I have either not found the right formula to keep the effect presentable, or it's simply not possible with this particular implementation…

This is definitely an issue for any production environment. It is hard to achieve consistency.

My game's graphics are exclusively bloomed lines.
I'm on Unity 2020.1.
Four months after release, I just realized that my game looks very different on screens with resolutions other than mine.

In my version of Unity: “Scatter: Set the radius of the bloom effect in a range from 0 to 1.”
So 0 to 1 of what? Percent? Inches? Screen widths?

Meanwhile, in Unreal Engine 4: “Bloom Size: The size in percent of the screen width.”

The problem is that it downscales the buffer to, say, half the native resolution and then blurs it with a fixed pixel width (which you need to do, because blur starts looking weird with very wide kernels).

So as the resolution of that bloom buffer changes, so does the size of bloom.
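A quick back-of-the-envelope check of that claim, sketched in Python with made-up but plausible numbers (the real kernel width depends on URP's implementation):

```python
def kernel_screen_fraction(native_width, downscale, kernel_px):
    """Fraction of screen width covered by a fixed-pixel kernel in the bloom buffer."""
    buffer_width = native_width / downscale
    return kernel_px / buffer_width

hd  = kernel_screen_fraction(1920, 2, 10)   # half-res buffer at 1080p: 10 / 960
uhd = kernel_screen_fraction(3840, 2, 10)   # half-res buffer at 4K:    10 / 1920
# The same 10-pixel kernel spans twice the screen fraction at 1080p as at 4K,
# so the bloom looks twice as wide on the lower-resolution screen.
```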

The solution is to always downsize the buffer to a fixed resolution and then do the blur. That would fix the issue. Things might look a bit worse, since arbitrary downscale ratios typically look worse than clean half or quarter downscales.

For better results, you may want to have it do the downscale in multiple passes if the starting resolution is too high.
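As a sketch of that idea in Python (the target of 540 is an arbitrary example, and a real implementation would also need to handle odd sizes): halve repeatedly until the buffer is near a fixed target, so every step but the last is a clean 2x downscale.

```python
def downsample_chain(native_height, target_height=540):
    """Buffer heights from the native resolution down toward a fixed target,
    halving at each step. A final (possibly arbitrary-ratio) resize from
    chain[-1] to target_height would then make the blur input fully fixed,
    and the bloom resolution-independent."""
    chain = [native_height]
    h = native_height
    while h // 2 >= target_height:
        h //= 2
        chain.append(h)
    return chain
```

For example, `downsample_chain(4320)` yields `[4320, 2160, 1080, 540]`, while a 1080p screen only needs one halving step.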

I don’t know how to dig into this, but for now, in my case I’m going to just lock screen resolution.

How is bloom done for different resolutions in other game engines like Unreal or Godot? Is there any practical fix at all?

Hello, I’m having this issue with high resolution screenshots. I’m taking these screenshots at x8 FullHD so that I can then downsample and get better antialiasing and details.

In this scenario the bloom is broken, because the glowing distance is divided by 8. Upping the intensity and/or scatter accordingly will not give the same result. We need a way to set the scatter in screen space rather than pixel space, and to have the bloom be resolution-independent.
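The arithmetic behind "divided by 8" can be sketched in Python (`radius_px` here is an illustrative stand-in for whatever pixel-space radius the effect uses internally):

```python
def glow_fraction(radius_px, image_width):
    """Glow radius as a fraction of image width when the radius is fixed in pixels."""
    return radius_px / image_width

tuned = glow_fraction(20, 1920)      # effect tuned at FullHD
shot  = glow_fraction(20, 1920 * 8)  # same pixel radius in an x8 screenshot
# shot == tuned / 8: relative to the image, the glow distance shrinks by
# exactly the supersampling factor. A screen-space setting would instead
# store the fraction itself, keeping it constant at any render size.
```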

The depth of field is also affected by this issue.


Hi,

How does it differ? Can you post example comparison images, if possible?

Of course. The first screenshot is in FHD (1080p) and second is in 8K (4320p).
[Screenshot 1: FHD (1080p)]
[Screenshot 2: 8K (4320p)]

These are simply taken from the Game view, just by changing the rendering resolution. Changing the bloom parameters won't get you the exact same result, whether mathematically or by trial and error. The bloom effect has clearly been developed with a single resolution in mind, I'm afraid.

The same goes for depth of field, and it's even harder to calibrate across resolutions. I suspect it may also be the case for motion blur and ambient occlusion, but I didn't test those.


It’s because they are doing the blurring with a pixel offset, so as resolution increases the blur / bloom gets finer.

Either the pixel offset needs to get calculated as a percentage of the screen (which might look a bit odd, since blurring algorithms often depend on pixel offsets being really specific for best results), or it needs to downsample to a fixed, specific resolution, before doing the blur / bloom.
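The first option amounts to the following (a Python sketch of the arithmetic; the names are illustrative): store the offset as a fraction of the screen and convert it back to pixels at render time, so the pixel offset grows with the resolution instead of staying fixed.

```python
def offset_in_pixels(screen_fraction, screen_width):
    """Pixel offset for a blur tap expressed as a fraction of screen width."""
    return screen_fraction * screen_width

fhd = offset_in_pixels(0.005, 1920)  # about 9.6 px at FullHD
uhd = offset_in_pixels(0.005, 3840)  # about 19.2 px at 4K
# The pixel offset doubles with resolution, so the bloom keeps its apparent
# size. The fractional pixel counts are why the result "might look a bit odd"
# compared to carefully tuned integer or half-texel offsets.
```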

Why Unity traditionally refuses to do either of these solutions is anyone’s guess.

Did you ever discover or create a solution for this? I’m having the same exact issue.

I used this asset: MK Glow - Bloom & Lens & Glare | Fullscreen & Camera Effects | Unity Asset Store

Description says:

Resolution scaling: Same outcome independent from the screen size and resolution.

It’s not 100% true, but it’s certainly better than the bloom that ships with URP.

I recently finished a complete bloom and lens flare post-processing stack. The problem is that when using a variable downsample resolution, to make it look the same you have to scale the texel size proportionally to some reference resolution like 1080p. In theory this works, but then texel alignment becomes a problem: for resolutions that are multiples of each other it mostly holds up, but otherwise that's where flickering usually starts to occur.

What I did was set the reference resolution to 1080p and scale all my downsample textures based on that resolution, which makes the effect look the same no matter the screen resolution. 1080p is usually enough for lens effects, and it also gives consistent performance across different screen resolutions.

With something like this you can get a (somewhat) consistent texel size while also reducing flickering:

// Texel size and dimensions of the 1080p reference resolution:
// xy = 1 / resolution, zw = resolution (mirrors Unity's *_TexelSize convention).
static const float4 Base_TexelSize = float4(1.0 / 1920, 1.0 / 1080, 1920, 1080);

// Snap a coordinate to the texel grid to reduce flickering from misalignment.
float2 AlignToTexel(float2 texcoord, float2 texelSize)
{
    return texelSize * round(texcoord / texelSize);
}

// Scale a texel-sized offset by (current resolution / reference resolution)
// so the sampling offset stays proportional to the screen, then optionally
// re-align it to the actual texel grid.
float2 ScreenProportional(float2 texelSize, bool align)
{
    float2 scale = _ScreenParams.xy * Base_TexelSize.xy;
    float2 result = texelSize * scale;
    if (align)
    {
        result = AlignToTexel(result, texelSize);
    }
    return result;
}

But just using a fixed downsample texture size gives the best results.