My game uses HDR rendering with the Post Processing Stack, but on iOS the camera appears to render into an RGBA8 target before any post-processing is applied, so all of the HDR information is lost before the post chain even starts. I see this in the editor on my Mac with the build target set to iOS, and it looks the same on device, where there is visible clipping and banding I wouldn't expect.
Looking in the Frame Debugger (which I don't believe can attach to a device build), I see a TempBuffer (presumably created by the Post Processing Stack) in BGRA8_SRGB format being used as the render target during the main camera's forward pass, even though Graphics.activeTier is Tier3 and SystemInfo.GetGraphicsFormat(DefaultFormat.HDR) returns B10G11R11_UFloatPack32.
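For reference, this is roughly how I'm checking the reported capabilities (the component name is mine, just a probe script I drop on a GameObject):

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering;

public class HDRFormatProbe : MonoBehaviour
{
    void Start()
    {
        // Log the active graphics tier and the formats Unity reports,
        // to confirm the platform should support an HDR render target.
        Debug.Log($"activeTier: {Graphics.activeTier}");
        Debug.Log($"default HDR format: {SystemInfo.GetGraphicsFormat(DefaultFormat.HDR)}");
        Debug.Log($"RGB111110Float supported: {SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.RGB111110Float)}");
        Debug.Log($"ARGBHalf supported: {SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.ARGBHalf)}");
    }
}
```

All of these report what I'd expect for an HDR-capable device, which is why the BGRA8_SRGB TempBuffer surprises me.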
When running on iOS with BIRP, is there a way to make the camera render to a 16-bit float target (or at least B10G11R11_UFloatPack32) so that auto-exposure and color grading don't band horribly?
In the editor on macOS (and also on Windows desktop) I correctly see an FP16 render target being used for drawing.
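The only workaround I can think of is forcing the camera into an explicitly HDR RenderTexture myself, roughly as sketched below (untested on device, and I'd still need a final blit to the backbuffer, e.g. from a second camera or an OnPostRender hook, since nothing reaches the screen while targetTexture is set). Is something like this the intended approach, or is there a setting I'm missing?

```csharp
using UnityEngine;

// Sketch of a possible workaround: render the camera into an explicit
// B10G11R11-format RenderTexture, on the assumption that the Post
// Processing Stack will then allocate its TempBuffers in a matching
// HDR format instead of BGRA8_SRGB.
[RequireComponent(typeof(Camera))]
public class ForceHDRTarget : MonoBehaviour
{
    RenderTexture _hdrTarget;

    void OnEnable()
    {
        _hdrTarget = new RenderTexture(Screen.width, Screen.height, 24,
                                       RenderTextureFormat.RGB111110Float);
        GetComponent<Camera>().targetTexture = _hdrTarget;
    }

    void OnDisable()
    {
        GetComponent<Camera>().targetTexture = null;
        if (_hdrTarget != null)
            _hdrTarget.Release();
    }
}
```

I'd rather not pay for the extra blit if the camera can be made to pick an HDR target on its own.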
Thanks