Setting Color Buffer Format to 16 for Post Processing causes major visual differences

Is there an explanation of how Post Processing behaves differently between its three available color buffer formats? R11G11B10 and R16G16B16A16 produce very different images from the same scene. R32 acts like R16.

Here is R11G11B10 with exaggerated bloom values:

…and here is the same shot with R16G16B16A16:

I assume this is either a bug, or something that we have to take into account when working with higher-precision color buffers.
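For context on the precision side of that question, here is a rough sketch (plain Python, not Unity code) of the per-channel precision gap between the two formats: R11G11B10 stores unsigned floats with 6/6/5 mantissa bits per channel, while each half-float channel of R16G16B16A16 has 10 mantissa bits (plus a sign bit and alpha). The step sizes are small either way, so a gap this subtle on its own would not normally explain dramatic image differences.

```python
import math

def quantize_minifloat(x, mantissa_bits):
    """Round a positive value to the nearest number representable with the
    given mantissa width (simplified sketch: ignores denormals and the
    exponent-range clamp, which don't matter for mid-range values)."""
    if x <= 0.0:
        return 0.0
    e = math.floor(math.log2(x))            # exponent of x
    scale = 2.0 ** (mantissa_bits - e)      # representable values are spaced 2^e / 2^m apart
    return round(x * scale) / scale

# A mid-grey value of 0.2, as stored by each channel type:
print(quantize_minifloat(0.2, 6))   # R/G channel of R11G11B10 -> 0.19921875
print(quantize_minifloat(0.2, 5))   # B channel of R11G11B10
print(quantize_minifloat(0.2, 10))  # half-float channel of R16G16B16A16 -> 0.199951171875
```

The worst-case relative error is roughly 2^-7 for the 11-bit channels versus 2^-11 for half floats, i.e. both well below anything that should be visible after tonemapping.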

Cheers!


I’d also add that vignette intensity is broken in the higher-precision formats as well.

At max values normally:

At max values with RGBA16:

Hey, which version of Unity and HDRP do you use?

Unity 2019.4.1f1
HDRP 7.4.1

Thank you for this info. I have just double-checked on several scenes using HDRP 7.4.1 and I’m not experiencing any problems with vignette, for instance, it looks consistent no matter the precision.

Maybe see if removing HDRP and re-adding it helps.

Hey, our graphics QA tried to repro your issue, but was unsuccessful. Same on my side.

A few things to try:

  • update your display drivers
  • remove and re-install HDRP from the package manager
  • delete the folders located in your Library folder (this will force a re-import of the project)

I will test things more thoroughly once we ship our milestone on Tuesday, but the problem happens on all 9 PCs we have here, so I doubt it is a Library folder problem. We do have to update Unity and HDRP, though, so that might fix it.

Thanks for the follow up!