Difference between Screen.SetResolution & UniversalRenderPipeline.asset.renderScale

I noticed that I can make Unity change the resolution by doing one of the following:

  • Screen.SetResolution(targetWidth, targetHeight, true); // Which is what I’m using.
  • UniversalRenderPipeline.asset.renderScale = 1;

Should we use UniversalRenderPipeline.asset.renderScale?

Another thing I don’t understand: I’m using URP and everything seems to work fine, but under Project Settings → Quality there is no Renderer assigned in the Rendering section.

It is assigned under Project Settings → Graphics, though.

Should I also assign it under Project Settings → Quality?

Screen.SetResolution changes the display framebuffer, while renderScale only changes the dimensions of URP’s intermediate render target. With Render Scale you can render just the 3D scene at a reduced resolution while UIs stay at native resolution, though you need to modify URP a little to achieve that.
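As a rough sketch of the renderScale route (this assumes the active pipeline asset is a UniversalRenderPipelineAsset, and relies on the fact that a Screen Space - Overlay canvas is composited after the camera, so it is typically not affected by renderScale):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ScaleSceneOnly : MonoBehaviour
{
    void Start()
    {
        // Render the 3D scene at 60% of the display resolution.
        // A Canvas set to Screen Space - Overlay is drawn after the
        // camera output is upscaled, so the UI keeps its native size.
        var urp = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urp != null)
            urp.renderScale = 0.6f;
    }
}
```

Note that this modifies the shared pipeline asset, so in the Editor the change persists after exiting Play mode.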


Thanks, so if I want to reduce quality on older devices I should use Screen.SetResolution, right?

SetResolution scales everything down, while renderScale only scales the bulk of the rendering down.
With renderScale you have the possibility of rendering things that are very quality-sensitive but cheap to draw (like UIs) at full scale, while the expensive part of the rendering (the 3D scene) is scaled down to improve performance. For raw performance you can use SetResolution without any problem.
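Following that advice, a minimal sketch of dropping the display resolution on older devices (the memory threshold and the 75% factor are arbitrary example values, not recommendations):

```csharp
using UnityEngine;

public class AdaptiveResolution : MonoBehaviour
{
    // Hypothetical cutoff: treat devices with less RAM as "older".
    const int LowEndMemoryMB = 2048;

    void Start()
    {
        if (SystemInfo.systemMemorySize < LowEndMemoryMB)
        {
            // Scale the whole framebuffer (UI included) to 75% of
            // the device's native display size.
            Screen.SetResolution(
                Mathf.RoundToInt(Display.main.systemWidth * 0.75f),
                Mathf.RoundToInt(Display.main.systemHeight * 0.75f),
                FullScreenMode.FullScreenWindow);
        }
    }
}
```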
