In the HDRP asset, set Color Buffer Format to FP16, and also enable higher-precision buffers for color grading via Grading LUT Format and Grading LUT Size. This will have a memory cost.
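If you want to confirm what your project is actually running with, a minimal sketch along these lines can log the relevant settings. The property names (currentPlatformRenderPipelineSettings, colorBufferFormat, postProcessSettings.lutFormat / lutSize) are taken from recent HDRP packages and may differ in older versions:

```csharp
// Logs the active HDRP asset's color buffer and grading LUT settings.
// Property names are from recent HDRP packages; verify against yours.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

public static class HdrpBandingCheck
{
    [RuntimeInitializeOnLoadMethod]
    static void LogSettings()
    {
        var asset = GraphicsSettings.currentRenderPipeline as HDRenderPipelineAsset;
        if (asset == null)
        {
            Debug.LogWarning("HDRP is not the active render pipeline.");
            return;
        }

        var s = asset.currentPlatformRenderPipelineSettings;
        Debug.Log($"Color Buffer Format: {s.colorBufferFormat}");             // want R16G16B16A16 (FP16)
        Debug.Log($"Grading LUT Format:  {s.postProcessSettings.lutFormat}"); // want R16G16B16A16
        Debug.Log($"Grading LUT Size:    {s.postProcessSettings.lutSize}");   // e.g. 64
    }
}
```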
Also, lower-quality monitors can amplify banding. If you see strong banding in this test (Gradient (banding) - Lagom LCD test), one likely root cause is your monitor or its calibration. On Windows, run colorcpl.exe and check that you don't have an odd color profile loaded.
With a higher-end, well-calibrated monitor, you should see near-zero banding in the gradient image below.
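If you'd rather test inside Unity than in a browser, here is a minimal, self-contained sketch that draws a grayscale gradient over the whole screen (the class name and strip width are arbitrary):

```csharp
// Quick in-engine banding test: draws a full-screen grayscale gradient,
// so you can check banding in your actual color pipeline rather than in
// a browser image. Attach to any GameObject in the scene.
using UnityEngine;

public class GradientBandingTest : MonoBehaviour
{
    Texture2D gradient;

    void Start()
    {
        const int width = 1024;
        gradient = new Texture2D(width, 1, TextureFormat.RGBA32, false);
        for (int x = 0; x < width; x++)
        {
            float v = x / (float)(width - 1);
            gradient.SetPixel(x, 0, new Color(v, v, v, 1f));
        }
        gradient.Apply();
    }

    void OnGUI()
    {
        // Stretch the 1-pixel-high strip over the whole screen.
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), gradient);
    }
}
```

Note the strip itself is 8-bit, so very faint one-step bands are inherent; coarse, high-contrast bands point at the display, its profile, or the render pipeline.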
If, after all these steps, you still see banding and are certain the issue doesn't come from the monitor, please file a bug with a repro scene.
@pierred_unity
I have the same issue, but in URP (Unity 2021.2), and I can't find the "FP16 Color Buffer Format" setting. Is this possible in URP?
I'm using URP, and these settings did not make a substantial difference for me. The HDR grading mode also did not work: a warning appeared next to the option when I enabled it, likely because my monitor does not support HDR rendering.
The previously mentioned “enable dithering” setting in the camera inspector is what fixed it for me.
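For reference, the same checkbox can be flipped from script; this is a minimal sketch using URP's public camera-data API (attach it to the camera):

```csharp
// Enables the camera's output dithering from script, equivalent to the
// "Dithering" checkbox on the camera inspector in URP.
using UnityEngine;
using UnityEngine.Rendering.Universal;

[RequireComponent(typeof(Camera))]
public class EnableDithering : MonoBehaviour
{
    void Start()
    {
        var data = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        data.dithering = true;
    }
}
```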
@theghostronaut @KarlKarl2000 @SniperED007
I fixed my banding issue in URP by changing the HDR Color Buffer Precision to 64 Bits (instead of 32 Bits).
To find this setting you need to put the Inspector in ‘Debug’ mode.
To do that, click the three dots at the top right of the Inspector pane (next to the padlock) and choose ‘Debug’.
Next, open your URP Asset; in my version (2022.3.8f1), HDR Color Buffer Precision is the 16th item in the Debug inspector. Simply change this drop-down value to 64 Bits.
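If you'd rather not switch the inspector mode, an editor script can set the hidden field directly. This is a sketch only: the backing-field name "m_HDRColorBufferPrecision" and the enum order (0 = 32 Bits, 1 = 64 Bits) are assumptions read off the Debug inspector, so double-check them for your URP version.

```csharp
// Editor-only sketch (place in an Editor folder): sets the URP asset's
// hidden HDR precision field via SerializedObject, without Debug mode.
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering.Universal;

public static class UrpHdrPrecisionTool
{
    [MenuItem("Tools/URP/Use 64-bit HDR Color Buffer")]
    static void Use64Bit()
    {
        var asset = UniversalRenderPipeline.asset; // the active URP asset
        if (asset == null) { Debug.LogError("No active URP asset."); return; }

        var so = new SerializedObject(asset);
        var prop = so.FindProperty("m_HDRColorBufferPrecision"); // assumed field name
        if (prop == null) { Debug.LogError("Field not found; check its name in Debug mode."); return; }

        prop.enumValueIndex = 1; // assumed order: 0 = 32 Bits, 1 = 64 Bits
        so.ApplyModifiedProperties();
    }
}
```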
I really think such an important setting should be exposed in the regular inspector.
I hope this helps somebody.
I wonder why Unity doesn't expose this setting publicly. How do other people solve this issue? Every URP project will run into color banding. Is it just that it isn't obvious enough, so nobody cares?
Not sure, but maybe this is a bad idea due to the extra memory allocation? Did you check in the Memory Profiler to see how much memory this buffer eats at high resolutions, like 4K?
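Back-of-envelope: at 3840 × 2160, a 64-bit color target is 8,294,400 px × 8 B ≈ 66 MB, versus ≈ 33 MB at 32 bits, so the switch costs roughly an extra 33 MB per full-resolution copy (more with MSAA or any additional intermediate targets the pipeline keeps around).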
“In the HDRP asset, set Color Buffer Format to FP16, and also enable higher-precision buffers for color grading via Grading LUT Format and Grading LUT Size. This will have a memory cost.”
Where is this option in HDRP? I can't find it.
Only: