Maybe a simple question about DLSS and setting screen resolution...

I was looking at allowing players to optionally turn on DLSS. This seems like a pretty simple thing, but there’s something I haven’t really figured out yet.

Let’s say my monitor supports 1920x1080, and that’s the resolution I’m running the game at. If I turn on DLSS, should I then drop the camera resolution down to 960x540 and let DLSS do its magic? Or do I leave the resolution at 1920x1080 and hope that, behind the scenes, Unity is actually rendering at a lower resolution and then upsampling?

You would leave the resolution at 1920x1080. Then you can set the internal, lower resolution that DLSS upscales from as a percentage of that. You’ll see it in the HDRP dynamic resolution settings. You can also change the resolution percentage at runtime with this:

UnityEngine.Rendering.DynamicResolutionHandler.SetDynamicResScaler
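
For example, something like this (a rough sketch; the field name and the 66% starting value are just placeholders, and the camera needs Allow Dynamic Resolution enabled):

using UnityEngine;
using UnityEngine.Rendering;

public class DlssScaleExample : MonoBehaviour
{
    // Render scale as a percentage of the output resolution (placeholder value).
    [Range(50f, 100f)]
    public float renderScalePercent = 66f;

    void OnEnable()
    {
        // HDRP polls this callback every frame to get the current scale.
        // ReturnsPercentage means the callback returns e.g. 66 (percent),
        // not a 0-1 lerp factor.
        DynamicResolutionHandler.SetDynamicResScaler(
            () => renderScalePercent,
            DynamicResScalePolicyType.ReturnsPercentage);
    }
}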

Hmm. I see the dynamic resolution settings on my HDRP Asset, but per the info message above them, it seems the percentage there is being ignored.

Or is it being overridden by the Use Optimal Settings checkbox above that? I guess I’m not sure whether I should be manually adjusting something like SetDynamicResScaler, or whether the system is smart enough to get performance gains automatically.

That’s right, “Use optimal settings” means that DLSS will choose what percentage to render at during runtime. I’d try it that way first and see if you like the results.

Otherwise, you can use Unity’s frame timing system (FrameTimingManager) to detect when you need to increase or decrease the render scale yourself.
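
Something along these lines (a rough sketch, not a tuned implementation: the 60 fps GPU budget, step size, and 50–100% bounds are made up, and FrameTimingManager only returns data on platforms with frame timing stats enabled):

using UnityEngine;
using UnityEngine.Rendering;

public class AdaptiveRenderScale : MonoBehaviour
{
    // Assumed GPU frame budget for 60 fps, in milliseconds.
    const double TargetGpuMs = 16.6;

    float scalePercent = 100f;
    readonly FrameTiming[] timings = new FrameTiming[1];

    void OnEnable()
    {
        // Hand HDRP a callback that reports our current scale each frame.
        DynamicResolutionHandler.SetDynamicResScaler(
            () => scalePercent,
            DynamicResScalePolicyType.ReturnsPercentage);
    }

    void Update()
    {
        FrameTimingManager.CaptureFrameTimings();
        if (FrameTimingManager.GetLatestTimings(1, timings) == 0)
            return; // no timing data on this platform/frame

        double gpuMs = timings[0].gpuFrameTime;

        // Over budget: drop the render scale a little. Comfortably under
        // budget: creep back up. Bounds and step size are arbitrary here.
        if (gpuMs > TargetGpuMs)
            scalePercent = Mathf.Max(50f, scalePercent - 1f);
        else if (gpuMs < TargetGpuMs * 0.85)
            scalePercent = Mathf.Min(100f, scalePercent + 1f);
    }
}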