Experimenting with HLG and HDR10 for still images

Yep. My -1.36 EV compensation was just one “dumb” way to target 21.4% (0.5 of sRGB) for a quick demo; there are many other options depending on one’s needs and preferences.

Ok, it definitely works. @Entropy512 I’m not really familiar with ffmpeg, so maybe you can help here:

  1. If we downscale the images in a raw converter rather than when running this command, we can remove pad=3840:2160:302:0:black, right?
  2. Do we need to specify npl for HLG?
  3. What should we change if we want a lossless encode? --lossless for x265?

Thanks again

Also, if someone wants to try this, keep in mind that ART’s and RT’s linear curves are the gamma-corrected curves, so we may need the gamma 2.2 .rtc from this post (Linear gamma preview in RawTherapee - #7 by Morgan_Hardwood) or use darktable.

Only if you crop to a 16:9 aspect ratio. If your camera shoots a typical 3:2 aspect ratio, you need to pad out the sides. In this case, I already downscaled to 2160 pixels high in my converter, but that means the input image was only 3236 pixels wide, not 3840. So I pad it to 3840 wide with a black background, offsetting it by 302 pixels from the left to center it. (302 = (3840-3236)/2)

zscale defaults to npl=100, which maps the brightest parts of the image to only 100 nits and makes the whole image dark.

Don’t know; my concern is that lossless H.265 might make some display devices very unhappy.
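
Putting those three answers together, something along these lines should be in the right ballpark. Treat it as a rough sketch rather than the exact command from earlier in the thread: the input filename, pixel format and container are placeholders, and it assumes the raw converter exported a 3236x2160 linear Rec.2020 16-bit TIFF.

```
# Sketch only: pad the 3236x2160 linear TIFF to 3840x2160, convert
# linear -> HLG (Rec.2020) with zscale, and encode losslessly with x265.
ffmpeg -i input.tif \
  -vf "pad=3840:2160:302:0:black,zscale=transferin=linear:primariesin=2020:transfer=arib-std-b67:primaries=2020:matrix=2020_ncl:npl=1000,format=yuv420p10le" \
  -c:v libx265 -x265-params lossless=1 \
  -color_primaries bt2020 -color_trc arib-std-b67 -colorspace bt2020nc \
  output.mp4
```

If the converter already pads/crops to 3840x2160, the pad=... step can be dropped (question 1), and npl=1000 is there to avoid the dark npl=100 default (question 2).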

Or, to ensure that the exported data is linear, create a linear output profile with the Rec.2020 color space. Since RT doesn’t have support for HDR displays (and neither does Linux, last time I checked), HDR preview is a lost cause: your preview just plain will NOT match what is shown on your TV via the ffmpeg export process, since RT’s preview output will be SDR.

RTv4_Rec2020_Linear_g=1.0.icc (732 Bytes)

Hi,

I’m not sure what you are referring to; can you elaborate? Thanks!

Unlike PQ, HLG doesn’t have a concept of nits, so what is this command doing?

And let’s not forget you also have the soft proofing tool in darktable that lets you set the histogram display for HLG/PQ to help you place the level of your subject of interest where you want. (This can be useful even if you end up exporting to linear 16-bit/float TIFF for later encoding…)

“The curve and histogram is always displayed with sRGB gamma, regardless of working or output profile. This means that the shadow range is expanded and highlights compressed to better match human perception”
https://rawpedia.rawtherapee.com/Exposure#Tone_Curves

Yes, but that refers to the “creative” tone curve; it has nothing to do with the TRC of the output profile. If you set the output profile to linear, it will be linear (and in fact, even in the “tone curve tool”, if you use an identity curve, the gamma has no effect).

Sorry, it seems like I misunderstood that.

Hmm? I’m seeing references to cd/m² (aka nits) when discussing HLG in Rec.2100 (page 7 of https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2100-2-201807-I!!PDF-E.pdf )

zimg’s API documentation describes npl, and zimg also pretty clearly assumes that ARIB B67 has a peak luminance of 1000 nits.

HLG is a relative system with black at 0 and peak at 1.0. For different screen brightnesses it has a transfer curve adjustment specified; see note 5f in ITU-R BT.2100.

The percentage values for reference levels are constant, as the signal doesn’t change; only the transfer curve in the monitor does.

Note 5f – For displays with nominal peak luminance (Lw) greater than 1000 cd/m², or where the effective nominal peak luminance is reduced through the use of a contrast control, the system gamma value should be adjusted according to the formula below, and may be rounded to three significant digits:

γ (system gamma) = 1.2 + 0.42 · log10(Lw / 1000)

where Lw is the nominal peak luminance of the display in cd/m².
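
As a quick worked example of that formula: a 2000 cd/m² display would use γ = 1.2 + 0.42 · log10(2000/1000) ≈ 1.2 + 0.42 · 0.301 ≈ 1.33, while the reference 1000 cd/m² display stays at γ = 1.2.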

AFAIK the HLG adjustments only apply to the display (decode) side. Content should still be “mastered” and encoded for the reference 1000 nits, as one doesn’t know in advance what display will be used to view it.

No, HLG is scene-referred. The levels represent the illumination of the sensor, not a master display.

You can master on any HLG display and it will appear perceptually similar on any other HLG display (provided that they correctly implement the system gamma listed above). So diffuse white appears at 75% signal level, which will be displayed at different brightnesses depending on the monitor’s peak brightness.
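
To put rough numbers on that (going by ITU-R BT.2408’s guidance, if I recall it correctly): on a 1000 cd/m² HLG display, that 75% signal for diffuse/reference white works out to about 203 cd/m², while on a brighter or dimmer HLG display the same 75% signal lands at a different absolute luminance, with the system gamma adjustment above keeping the overall look perceptually similar.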

You can be pedantic all you want, but here are the facts:

  1. The standard specifies a reference peak level of 1000 nits for a typical/reference display.
  2. The standard specifies that it is the display’s job to alter its tone curve in a specific manner if it has a different peak.
  3. The standard specifies no absolute references for scene luminance, only relative luminance to an arbitrary number that has no units or any connection to reality.

Claim it’s scene-referred all you want; for any practical real-world purpose, it is not, because your only real-world reference point to start from is an assumption of 1000 nits for a reference/standard/typical display. Myself, I live in the real world. If you have a problem with how zscale operates, raise an issue on its GitHub project and take it up with sekrit-twc.

The European spec for reference monitors (EBU Tech 3320) lists multiple minimum peak brightnesses for different locations and uses, and only specifies a minimum, not a fixed value. There’s no single fixed monitor brightness for HLG. https://tech.ebu.ch/docs/tech/tech3320.pdf

The point of HLG is to provide a fallback for SDR displays. The HDR image would look normal on them. In terms of peak brightness, it depends on both your software and hardware specs and configurations.

Not really true: “acceptable” but not “normal”. It’s kind of like how the FCC in the United States completely botched the definition of “minimum viewable” signal strength for the digital transition. It turned out that what the FCC defined as “minimum” was above what many families (including my own) considered “excellent” signal quality, with the end result being a MASSIVE broadcast area coverage loss during the transition.

HLG in fallback mode looks quite meh, ESPECIALLY if it’s Rec.2020 HLG (because the majority of displays in fallback mode don’t grok that it’s a Rec.2020 gamut and assume it’s 709, which desaturates everything). Also, one can’t properly take advantage of an HDR display without the SDR fallback ending up looking washed out.

This appears to be why a large amount of HDR content out there (basically anything that was mastered to ever be displayable on an SDR display, even if the delivery pipelines were separate) uses a filmlike S-curve anyway, leading to a lot of HDR content not looking that different from SDR, except for “HDR showcase” content where SDR compatibility was never considered at any step of the production process.

Yes, I should have used scare quotes.