Which setting is authoritative: Lightmap Encoding in the Player Settings or Lightmap Compression in the Lighting Settings?

Hello! I have a question about the relationship between the Lightmap Encoding setting in the Player Settings and the Lightmap Compression setting in the Lighting Settings. Generally we try to keep the two in sync, for example setting both to Normal Quality. My question is: if Lightmap Encoding is set to High Quality but Lightmap Compression is set to Low Quality, which encoding does the target platform actually end up with? On Android, for instance, the possibilities are LDR, RGBM, and HDR.

As you probably know, encoding and compression are 2 separate concepts. Encoding is about which values to put into each texel, and compression is about which hardware format to use for the texture.

Generally, the rules for lightmap encoding are like so in current Unity:

High quality = Full HDR if the platform supports it, otherwise RGBM
Normal quality = RGBM
Low quality = dLDR on mobile platforms, RGBM otherwise
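
Purely for illustration, the rules above can be restated as a small decision function. This is a hypothetical helper mirroring the list, not a Unity API, and the real heuristics can change between Unity versions:

```csharp
// Hypothetical helper restating the encoding rules above; not a Unity API.
enum EncodingQuality { Low, Normal, High }
enum LightmapEncoding { dLDR, RGBM, FullHDR }

static LightmapEncoding ResolveEncoding(
    EncodingQuality quality, bool platformSupportsFullHDR, bool isMobile)
{
    switch (quality)
    {
        case EncodingQuality.High:
            // Full HDR if the platform supports it, otherwise RGBM.
            return platformSupportsFullHDR ? LightmapEncoding.FullHDR : LightmapEncoding.RGBM;
        case EncodingQuality.Low:
            // dLDR on mobile platforms, RGBM otherwise.
            return isMobile ? LightmapEncoding.dLDR : LightmapEncoding.RGBM;
        default: // Normal
            return LightmapEncoding.RGBM;
    }
}
```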

What each of these encoding schemes means is documented here: Unity - Manual: Lightmaps: Technical information

The rules for which hardware format is used (i.e. the compression setting) are a bit more involved. They are documented for some platforms here: Unity - Manual: Recommended, default, and supported texture formats, by platform

If using dLDR or RGBM encoding, read the “RGBA” row of the table. If using full HDR, read the “HDR” row of the table. For example, on Windows, Low quality encoding + High quality compression = BC7.

My question is: if you use a compression format that doesn’t support HDR data (even though full HDR isn’t really an encoding as such), doesn’t that make the encoding setting meaningless? For example, on PC I set the encoding quality to High, but my Lightmap Compression is set to Normal Quality, which means the DXT5 format. DXT5 shouldn’t be able to hold HDR data, so even though I selected high-quality HDR encoding, it will still fall back to RGBM encoding, right?

This relates to a plugin project I am currently working on, so I hope to get an accurate answer from you. Thank you!

You are correct that putting RGBM encoded data into DXT5 wouldn’t make sense. In cases where the high quality encoding is selected (and the platform supports native HDR), you will always end up with an HDR format. We never write incompatible data into a texture - if you see that happening, it’s a bug. The encoding essentially overrules the choice of compression.

The final format with encoding quality = High on a platform with native HDR support will be:

on PC:
Uncompressed or BC6H, depending on choice of compression

on mobile platforms with ASTC support:
Uncompressed or ASTC with varying block size depending on choice of compression

on other platforms:
Uncompressed or RGB9e5

Are you facing some specific issue with texture formats, or just looking for advice? It’s a bit tricky to exhaustively list the final format for every permutation of platform and project settings, as it’s essentially a pile of heuristics that does its best to choose a suitable format, and it changes between versions. You shouldn’t ever end up with incompatible data being written into a texture, though.
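
If you want to sanity-check what a given project actually ended up with, a small editor script like the sketch below prints the hardware format of every baked lightmap in the open scene (this assumes Unity 2017+, where LightmapData exposes the color lightmap as lightmapColor). HDR formats such as BC6H, RGB9e5Float or RGBAHalf indicate full HDR data, while DXT5, BC7 or the LDR ASTC formats indicate RGBM/dLDR data:

```csharp
using UnityEngine;
using UnityEditor;

// Prints the hardware format Unity actually chose for each baked lightmap in
// the currently open scene.
public static class LightmapFormatReport
{
    [MenuItem("Tools/Report Lightmap Formats")]
    static void Report()
    {
        foreach (var data in LightmapSettings.lightmaps)
        {
            var tex = data.lightmapColor;
            if (tex != null)
                Debug.Log($"{tex.name}: {tex.format}");
        }
    }
}
```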

Is this sentence a mistake you accidentally typed? Should it be HDR? “You are correct that putting RGBM encoded data into DXT5 wouldn’t make sense.”

Thank you so much for taking the time out of your busy schedule to answer my questions. The reason I am asking is that I need to determine how the lightmaps are encoded on the current platform so that I can decode them. Unity implements the correct lightmap decoding in the render pipeline’s shader library, but for various reasons I currently have to decode on the CPU in the editor. Following your explanation, I went back to my project to test it further, and the result is that lightmap encoding always ends up overriding the result of lightmap compression: no matter how I set the lightmap compression, as long as I set the lightmap encoding to the appropriate quality, the texture automatically ends up in the corresponding encoding. This is very gratifying. Unfortunately, I can’t read the Lightmap Encoding setting in the current Player Settings from my code; Unity doesn’t seem to expose it. But one last point: since lightmap encoding basically overrides the decision of lightmap compression, it doesn’t seem to make much sense to set the lightmap compression parameters because the results don’t seem to matter!

Is this sentence a mistake you accidentally typed? Should it be HDR? “You are correct that putting RGBM encoded data into DXT5 wouldn’t make sense.”

Yes, my bad.

it doesn’t seem to make much sense to set the lightmap compression parameters because the results don’t seem to matter!

It does matter - it still affects the final hardware format chosen for the texture. For example, on PC, if you select lightmap encoding = normal, and lightmap compression = low, you get DXT5 with RGBM data. If you set encoding = normal, but lightmap compression = high, you get BC7 with RGBM data.

What I’m saying is that encoding takes priority, and that you will never get an incompatible format. If we choose normal encoding quality, meaning RGBM, that limits the lightmap compression setting to selecting from different formats where storing RGBM data makes sense.
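
Since you mentioned decoding on the CPU in the editor: a minimal decode sketch based on the ranges given on the Lightmaps: Technical information page (RGBM covers [0, 5] in gamma space, dLDR maps the stored [0, 1] range to [0, 2]) could look like this. Treat the constants as approximations and verify them against the decode instructions in the render pipeline source, since the exact decode varies with platform and color space:

```csharp
using UnityEngine;

// Minimal CPU-side decode sketch. Constants follow the ranges stated on the
// "Lightmaps: Technical information" manual page; the exact decode used by the
// render pipelines can differ per platform and color space.
static class LightmapDecode
{
    // Full HDR lightmaps (BC6H, RGB9e5, half float, ...) store final values directly.
    public static Color DecodeFullHDR(Color texel) => texel;

    // dLDR: the stored [0, 1] value represents [0, 2] in gamma space.
    public static Color DecodeDLDR(Color texel) =>
        new Color(texel.r * 2f, texel.g * 2f, texel.b * 2f, 1f);

    // RGBM: RGB holds the color, A holds a multiplier; range is [0, 5] in gamma space.
    public static Color DecodeRGBM(Color texel)
    {
        float m = texel.a * 5f;
        return new Color(texel.r * m, texel.g * m, texel.b * m, 1f);
    }
}
```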

If you just want to figure out the final encoding used for a texture in the editor, we do actually have a few functions for it here: UnityCsReference/Editor/Mono/AssetPipeline/TextureUtil.bindings.cs at master · Unity-Technologies/UnityCsReference · GitHub. They are currently internal, but you could use reflection to access them. It’s also possible to extract the same info from the texture importer with the SerializedObject API.
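
As a rough sketch of the reflection route, something like the code below could work. The method names GetUsageMode and GetTextureFormat are assumptions based on TextureUtil.bindings.cs and may differ between Unity versions, so double-check them against the linked file:

```csharp
using System.Reflection;
using UnityEngine;
using UnityEditor;

// Reflection sketch for the internal UnityEditor.TextureUtil helpers mentioned above.
// The method names are assumptions and may change between Unity versions.
static class LightmapEncodingProbe
{
    public static void Probe(Texture lightmap)
    {
        var textureUtil = typeof(Editor).Assembly.GetType("UnityEditor.TextureUtil");
        if (textureUtil == null)
        {
            Debug.LogWarning("UnityEditor.TextureUtil not found in this Unity version");
            return;
        }

        const BindingFlags flags = BindingFlags.Static | BindingFlags.Public | BindingFlags.NonPublic;
        // GetUsageMode should tell dLDR / RGBM / full HDR apart;
        // GetTextureFormat gives the hardware format that was chosen.
        var getUsageMode = textureUtil.GetMethod("GetUsageMode", flags, null, new[] { typeof(Texture) }, null);
        var getFormat    = textureUtil.GetMethod("GetTextureFormat", flags, null, new[] { typeof(Texture) }, null);

        Debug.Log($"{lightmap.name}: usage mode = {getUsageMode?.Invoke(null, new object[] { lightmap })}, " +
                  $"format = {getFormat?.Invoke(null, new object[] { lightmap })}");
    }
}
```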

Thank you very much for your patient answers, which have benefited me a lot!