LOD Bias and Max LOD in HDRP don't have useful quality-based values?

I’m very confused about how Quality should affect LOD in HDRP. There are four odd things I’m seeing with respect to setting up LOD Bias and Max LOD.

If you start a new HDRP project with the Sample Scene, the defaults are to use Quality-based LOD Bias and Max LOD values.

But if you look at the three different quality levels (Low, Medium, High), they all have identical values for all the LOD-related properties. That is, LOD Bias is “1”, and Max LOD is “0” for Low, Medium, and High. As I recall under Built-in RP, I’d expect Low and Medium to have different defaults. Is this just a case of defaults not really being useful? Or am I misunderstanding how I should be setting LOD Bias?

Additionally, why do the LOD Bias and Max LOD Level highlighted in the screenshot above each have Low, Medium, and High values? For example, what is the meaning of the “High” LOD Bias value on the HDRPLowQuality HDRP Asset? I initially thought this was so that you could use a single HDRP Asset for all Quality Levels, and there would be a separate field per quality level. But if I have 4 quality levels in my project, I still only see Low, Medium, and High under LOD Bias and Max LOD. So it’s very confusing what those three values represent in the HDRP Asset.

Next, having set some values for LOD Bias and Max LOD Level on the HDRP Asset, I see that I can edit the values on the Quality screen for each quality level. However, those values completely disagree with what I entered on the HDRP Asset:

There you can see a Bias of 4 and a Max LOD of 2 for my High Quality settings. But those values aren’t the same as the ones on the HDRP Asset, and I seem to be able to edit either set of values independently of the other.

Finally, after all of this, if I look at the HDRP Global Settings, I see that it’s using “From Quality Settings”, but confusingly it lets me pick a Quality Level of Low, Medium, or High (even though I have 4 quality levels in the project now). Changing the values in these dropdowns just continues to show a “0” in the LOD Bias and Max LOD:

9492238--1336234--upload_2023-11-25_11-41-49.png

This is all very confusing. The docs only very briefly touch on this, and they don’t explain how all of these various settings relate to each other. Is there any better documentation on which of these various LOD-related fields should be populated, and which actually do something?

You never quite define how you’re currently understanding it, so I can’t answer that directly, but I’ll take a stab at how people typically misunderstand the framework.

Let’s start with quality profiles for the pipeline vs. quality levels of a function.

People often try to force a setting to change between function quality levels via code, when really you should think of it as setting up a profile per hardware target, and then defining quality levels for each function within that profile. It’s much cleaner to switch which quality level (Low/Med/High) a function uses than to manipulate those underlying values directly (not to mention that you’re tuning these for hardware targets, so you typically wouldn’t want end-user-level control of them). As for profile quality levels, these can also be changed and switched from a main menu at runtime (not from a pause menu, since it all has to recompute, which might cause issues mid-gameplay).

So if we take a look at LOD Bias and Max LOD, the defaults are uniform and consistent no matter which profile is chosen. Honestly, especially at the beginning, you probably want this; otherwise you may overcompensate for early or late LOD changes because you or another team member changed a value and didn’t communicate it (or did, and you just didn’t internalise it).

So it’s best to think of this as a hierarchy. This whole setup uses the same framework as volume overrides, so you may already be familiar with it.

—Project Wide—
Master pipeline (URP or HDRP)
|
|— HDRP Asset (profile)
|   |
|   |— HDRP Quality Level Profiles
|   |
|   |— Quality Levels (for specific functions within the profile)

—Scene Wide—
|— Global Volume Overrides
|
|— Local Volume Overrides
|
|— Prefabs / Inspector etc.

Your quality levels can be chosen at scene-level overrides, which lets you cleanly expose them to end users or set them yourself on a per-scene basis.
This means that cinematics, for example, can have the quality level of a function (or functions) switch to a higher quality for that cinematic, or per cinematic, to bring out the best of it. This still sits within the working range of the hardware target from the quality level profiles, which is a great bonus.

As I mentioned above, the LOD defaults are uniform for organisational purposes, because you’re likely to trip up a lot if they were different. Similarly, by starting at 1 (normal and expected behaviour), you can then compensate at different quality levels if needed.

As mentioned, the quality level profiles can really only be changed from a main menu when gameplay isn’t initialised. This means you can have multiple profiles here for target hardware and do a global switch, but only then.
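As a rough sketch of what that main-menu switch might look like in code (the class name and the idea of wiring it to a menu button are hypothetical; `QualitySettings.SetQualityLevel` and `QualitySettings.names` are the real Unity APIs, and the rows they index are the ones on the Quality page of Project Settings):

```csharp
using UnityEngine;

// Minimal sketch of a main-menu quality switch.
// Each quality level row in Project Settings > Quality can point
// at a different HDRP Asset, so switching the level swaps the profile.
public class QualityMenu : MonoBehaviour
{
    // Hook this up to a dropdown or button in your main menu UI.
    public void SetQuality(int index)
    {
        // The second argument also applies "expensive" changes;
        // safest to call from a main menu, not mid-gameplay.
        QualitySettings.SetQualityLevel(index, applyExpensiveChanges: true);
        Debug.Log($"Now using quality level: {QualitySettings.names[index]}");
    }
}
```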

Quality levels are predefined (like a precompute or preset) that you expose to user menus; these can be on main or pause menus, in the same way you see “Low/Med/High”.

For example:
Shadows quality - Low/Med/High.
You define that in the HDRP Asset in use.
Similarly, a global override in a scene can be easily exposed to a user, perhaps to give them more granular control, but it still only works within the constraints set by the previous hierarchy.

LOD Bias is a multiplier: 1 means distances are exactly as you set them in the LOD Group, 0.5 means LOD Group distances * 0.5, and so on. The LODs still change automatically based on camera distance, with their respective bias making each transition happen sooner or later (hence the confusion if Unity had set this up for you).
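The multiplier can be sketched like this (the class name and the 50 m authored switch distance are made-up examples; `QualitySettings.lodBias` is the real Unity property holding the active quality level’s bias):

```csharp
using UnityEngine;

// Rough illustration of how LOD Bias scales LOD transition distances.
// Assumption: a hypothetical LOD Group whose LOD1 is authored to kick in at 50 m.
public class LodBiasExample : MonoBehaviour
{
    void Start()
    {
        float authoredSwitchDistance = 50f;    // what you set on the LOD Group
        float bias = QualitySettings.lodBias;  // active quality level's LOD Bias

        // bias > 1 keeps higher-detail LODs visible further from the camera;
        // bias < 1 makes the switch to lower LODs happen closer in.
        float effectiveSwitchDistance = authoredSwitchDistance * bias;
        Debug.Log($"LOD1 switches at roughly {effectiveSwitchDistance} m (bias {bias})");
    }
}
```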

So there are two reasons this is left at defaults rather than set up for you:
1. You would otherwise just get confused.
2. Every project and target hardware is different, and may or may not need this adjusted.

The latter is up to you to tune for, and of course there’s no way Unity could know this for you.

In your second picture, you’re just seeing the quality level chosen from the HDRP Asset in use. This indicates that you’re creating a default value for everything else to build from.

Your last picture is again from the perspective of your ACTIVE profile; it’s based on the Low/Med/High setting from your first image.

If you’re getting Low/Med/High mixed up a lot, and you’re unsure where the settings you’re changing sit in that hierarchy, consider giving your HDRP Assets and your quality levels different names.

It may help you make sense of things, like:

SuperComputer (asset)> Cinematic (quality level) > LOD bias (Low/Med/high)

You should then be able to visualise that you’re adjusting the Low/Med/High value of LOD Bias for the Cinematic quality level under the SuperComputer hardware target.

Sorry this is a little lengthy, and yes, some ironing out of the flow (or at least the identity) of the Project Settings can be, and is being, looked at. But hopefully this helps you explore a little more easily.

Oof… well I just tried to read the original post and the lengthy reply. My brain is hurting lol.

I actually still don’t see how the LOD settings from the classic Quality screens (2nd screenshot) relate to the Low/Medium/High LOD settings from the HDRP asset (1st screenshot). This relationship seems undefined.

Specifically:

If my current active Quality setting is “High Quality”, with LOD settings 4 and 2 as shown, and the associated HDRP Asset is “HDRPHighQuality” as shown (2nd screenshot), then…

…how do the HDRPHighQuality asset settings (High/Med/Low from the 1st screenshot, LOD Level and max LOD) relate to the 4 and 2 from the quality settings?

I’ve always just ignored these things and left them as default :slight_smile:


I’m sometimes worth the headache. heh.
I have a feeling that the 4/2 LOD level was unintentional, as it’s removed in 2022+. What it would have done is act as a project-wide override for that LOD Bias setup. The profile is still a project-wide setting, but lower priority than the Project Settings menus; think of those as the master profile, with the SRP assets as the next level down.

The “master profiles” can still override SRP assets, hence you can have multiple quality levels for one asset. Everything under the Current Active Quality Level is the “master” override for its render pipeline asset.

Quality levels sharing the same pipeline asset
9511051--1340371--upload_2023-12-4_20-13-24.png 9511051--1340374--upload_2023-12-4_20-13-50.png

The pipeline asset overrides
9511051--1340380--upload_2023-12-4_20-14-11.png

Typically you override Global Settings under Graphics with the pipeline asset, then refine that in your scenes with global or local overrides, and you set up the quality levels as predefines that you can select at the override level (Low/Med/High).

The OP’s settings are from an older version (I’m guessing 2021); some of this is a lot more straightforward now and less redundant.
I know Unity planned on trying to sort out the Project Settings because they are hard to navigate (and explain). For now it’s best to think of the hierarchy, or the volume framework, when you’re navigating it, and that makes it just a little bit easier.

For the OP’s version, to better understand whether the 4/2 LOD Bias area is overriding the uniform 1/1/1 LOD Bias, just put something with LODs in the scene and move back and forth. Since LOD Bias is a multiplier, you should see it double/halve the switch distance based on your settings. You’ll be able to tell which of the two settings has priority this way.
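To go with that eyeball test, a hypothetical debug helper (the class name and key binding are made up; `QualitySettings.lodBias`, `QualitySettings.maximumLODLevel`, `QualitySettings.GetQualityLevel` and `QualitySettings.names` are real Unity APIs) can print the values the engine is actually applying, so you can compare them against what you typed into the Quality page versus the HDRP Asset:

```csharp
using UnityEngine;

// Hypothetical debug helper: logs the LOD values the engine is actually
// using, so you can see whether the Quality page or the HDRP Asset "won".
public class LodDebug : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.L))
        {
            string level = QualitySettings.names[QualitySettings.GetQualityLevel()];
            Debug.Log($"Active quality: {level}, " +
                      $"lodBias = {QualitySettings.lodBias}, " +
                      $"maximumLODLevel = {QualitySettings.maximumLODLevel}");
        }
    }
}
```

Drop it on any GameObject, press L in Play mode, and compare the logged numbers with the two places you edited them.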
