HDR output (Rec 2020, Dolby Vision, HDR10)

Since we have no HDR display in our studio to test with, does anyone know whether supporting the newly introduced HDR display standards, such as Dolby Vision or HDR10, is simply a matter of not applying a tonemapper to an HDR-enabled camera? Or does Unity always convert the source image to LDR before sending it to the display?

Also on topic: what happens when Unity renders on screens that are not specifically HDR but have a wider-than-usual color gamut? Does the color information get converted/chopped/capped, or is it sent to the screen as-is? Examples of such screens are the 2015+ MacBook with its P3 display and the newly introduced iPhone 7 (which on paper looks HDR10-compliant, but isn't advertised as such).

Unity does not support HDR monitors, so just enabling HDR on the camera won't work in this case. Please vote here to add this feature: