You never particularly define how you’re understanding it, so I’ll have to take a stab at how people typically misunderstand the framework.
Let’s start with quality profiles for the pipeline vs. quality levels of a function.
Many people try to force a setting to change between function quality levels via code, when really you need to consider that you’re setting up a profile for each of your hardware targets, and then defining quality levels for each function. It’s much cleaner to change the quality level you set in the low/med/high function values than it is to manipulate those values directly (not to mention that you’re tuning these for hardware targets, so typically you wouldn’t want end-user-level control of this). As for profile quality levels, these can also be changed and switched from a main menu at runtime (not a pause menu, because it all has to recompute, which might cause issues mid-gameplay).
So if we take a look at LOD Bias and Max LOD: no matter which profile is chosen, the LOD change is uniform and consistent between them. Honestly, especially at the beginning, you’d probably want this, as otherwise you may overcompensate for early or late LOD changes because you or another team member changed it and didn’t communicate it, or did and you just didn’t internalise it.
So it’s best to think of this as a hierarchy. Since this whole setup uses the same framework as Volume overrides, you should already be familiar with it.
—Project Wide—
Master pipeline (like URP or HDRP)
|
|— HDRP Asset (profile)
|  |
|  |— HDRP Quality Level Profiles
|  |
|  |— Quality Levels (for specific functions within the profile)
—Scene Wide—
|  |— Global Volume Overrides
|  |
|  |— Local Volume Overrides
Prefabs / Inspector, etc.
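The "more local layers win, within the constraints of the layers above" idea in that hierarchy can be sketched in a few lines. This is a hypothetical illustration, not Unity API; all names and values here are made up.

```python
# Hypothetical sketch of override precedence: layers are ordered from
# project-wide to most local, and a more local layer that overrides a
# setting wins over the layers above it.
def resolve(setting, layers):
    """Return the value from the most local layer that sets `setting`."""
    value = None
    for layer in layers:          # walk from project-wide to most local
        if setting in layer:
            value = layer[setting]  # a later (more local) layer wins
    return value

hdrp_asset    = {"shadow_quality": "med", "lod_bias": 1.0}  # project wide
global_volume = {"lod_bias": 1.5}                            # scene wide
local_volume  = {"shadow_quality": "high"}                   # most local

layers = [hdrp_asset, global_volume, local_volume]
print(resolve("shadow_quality", layers))  # "high": the local volume wins
print(resolve("lod_bias", layers))        # 1.5: the global volume wins
```

Anything a more local layer doesn’t touch simply falls through to the value set above it, which is the same behaviour you see with Volume overrides.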
Your quality levels can be chosen at scene-level overrides, which lets you cleanly expose them to end users or set them yourself on a per-scene basis.
This means that cinematics, for example, can have the quality level of a function (or several functions) switch to a higher quality for that cinematic, or per cinematic, to bring out the best of it. This still sits within the working range of the hardware target set by the quality level profiles, which is a great bonus.
As I mentioned above, the LODs are uniform for organising purposes, as you’re likely to trip up a lot if they were different. Similarly, starting at 1 (the normal, expected behaviour) means you can then compensate at different quality levels if needed.
As mentioned, the quality level profiles can really only be changed from a main menu, when gameplay isn’t initialised. This means you can have multiple profiles for target hardware and do a global switch, but only then.
Quality levels are predefined (like a precompute or preset) that you expose to the user menus; these can be on main or pause menus, in the same way you see “low/med/high”.
For example:
Shadow quality - low/med/high
You define that in the HDRP Asset in use.
Similarly, a Global Volume override in a scene can be easily accessed by a user; maybe you want more granular control of this for a user, but it still only works within the constraints set by the previous hierarchy.
For LOD Bias, it’s a multiplier: 1 means the distances are as you set them in the LOD Group, 0.5 means LOD Group distances × 0.5, and so on. The LODs still change automatically based on camera distance, with their respective bias making those changes happen sooner or later (hence the confusion if Unity had set this up for you).
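As a minimal sketch of that multiplier, assuming some illustrative LOD Group distances (the 20 m and 50 m values are made up, not Unity defaults):

```python
# LOD Bias as a plain multiplier on authored LOD transition distances.
# With bias 1.0 you get exactly what you set in the LOD Group; with
# bias 0.5 every transition happens at half the distance, so lower
# LODs kick in sooner.
def effective_lod_distances(authored_distances, lod_bias):
    """Scale each authored transition distance by the bias."""
    return [d * lod_bias for d in authored_distances]

authored = [20.0, 50.0]  # metres: LOD0->LOD1 at 20 m, LOD1->LOD2 at 50 m
print(effective_lod_distances(authored, 1.0))  # [20.0, 50.0], unchanged
print(effective_lod_distances(authored, 0.5))  # [10.0, 25.0], sooner
```

This is why a team agreeing to leave the bias at 1 early on saves confusion: everyone’s LOD switches line up with what was authored in the LOD Group.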
So, the two reasons why this isn’t set up and left at a default for you:
1. You would otherwise just get confused.
2. Every project and target hardware is different, and may or may not need this adjusted.
The latter is up to you to tune for, and of course there’s no possibility Unity could know this for you.
In your second picture, you’re just showing the quality level chosen from the HDRP Asset in use. This indicates you’re creating a default value for everything else to work from.
Your last picture is again from the perspective of your ACTIVE profile; this is based on your first image’s LOW/MED/HIGH setting.
If you’re getting LOW/MED/HIGH mixed up a lot, and are unsure where the settings you’re changing sit in the hierarchy, consider giving your HDRP Assets and your quality levels different names.
It may help you make sense of things, like:
SuperComputer (asset) > Cinematic (quality level) > LOD Bias (low/med/high)
You should then be able to visualise that you’re affecting the LOD Bias value for the Cinematic quality level under the SuperComputer hardware target.
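That naming chain can be pictured as nothing more than nested lookups. A hypothetical illustration, reusing the example names above; the numeric values are made up:

```python
# Asset (hardware target) > quality level > function value, as nested
# dicts. All names and numbers are illustrative, not Unity defaults.
profiles = {
    "SuperComputer": {                 # HDRP Asset: hardware target
        "Cinematic": {                 # quality level on that asset
            "lod_bias": {"low": 1.0, "med": 1.5, "high": 2.0},
        },
    },
}

# Reading "LOD Bias, high, for Cinematic under SuperComputer":
print(profiles["SuperComputer"]["Cinematic"]["lod_bias"]["high"])  # 2.0
```

Each step of the lookup corresponds to one layer of the hierarchy, which is exactly what the distinct names are there to remind you of.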
Sorry this is a little lengthy, and yes, some ironing out of the flow (or at least the identity) of the Project Settings can be, and is being, looked at. But hopefully this helps you explore a little more easily.