Experimenting with HLG and HDR10 for still images

Also, if someone wants to try this, keep in mind that ART’s and RT’s linear curves are the gamma-corrected curves, so we may need the gamma 2.2 TRC from this post: Linear gamma preview in RawTherapee - #7 by Morgan_Hardwood, or use darktable.

Only if you crop to get to a 16:9 aspect ratio. If your camera shoots a typical 3:2 aspect ratio, you need to pad out the sides instead. In this case, I had already downscaled to 2160 pixels high in my converter, but that meant the input image was only 3236 pixels wide, not 3840. So I pad it to 3840 wide with a black background, offsetting it by 302 pixels from the left to center it. (302 = (3840 − 3236) / 2)
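The centering arithmetic is trivial but easy to get off by a pixel, so here it is spelled out (the 3236-pixel width is from my case; substitute your own):

```python
# Compute the left offset needed to center a narrower image on a
# 3840x2160 canvas (values from the example above).
target_w = 3840
input_w = 3236  # 3:2-ish frame after downscaling to 2160 px high

left_offset = (target_w - input_w) // 2
print(left_offset)  # → 302
```

With ffmpeg this corresponds to something like `pad=3840:2160:302:0:black` (the pad filter takes width:height:x:y:color).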

zscale defaults to npl=100, which maps the brightest parts of the image to only 100 nits and makes the whole image dark.
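As a sketch of overriding that default when building an ffmpeg command (filenames and the other zscale options here are placeholders for whatever your pipeline actually uses):

```python
# Build an ffmpeg argument list that overrides zscale's npl default of 100
# so the brightest parts map to 1000 nits instead of 100.
npl = 1000
vf = f"zscale=npl={npl}:transfer=smpte2084"  # transfer is just an example
cmd = ["ffmpeg", "-i", "input.tif", "-vf", vf, "output.mkv"]
print(" ".join(cmd))
```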

I don’t know; my concern is that lossless H.265 might make some display devices very unhappy.

Or, to ensure that the exported data is linear, create a linear output profile with the Rec.2020 color space. Since RT doesn’t support HDR displays (and neither did Linux, last I checked), an accurate HDR preview is a lost cause: RT’s preview output will be SDR, so it plain will NOT match what is shown on your TV via the ffmpeg export process.

Attachment: RTv4_Rec2020_Linear_g=1.0.icc (732 Bytes)

Hi,

I’m not sure what you are referring to, can you elaborate? Thanks!

Unlike PQ, HLG doesn’t have a concept of nits, so what is this command doing?

And let’s not forget you also have the soft proofing tool in darktable that lets you set the histogram display for HLG/PQ to help you place the level of your subject of interest where you want. (This can be useful even if you end up exporting to linear 16-bit/float TIFF for later encoding…)

“The curve and histogram is always displayed with sRGB gamma, regardless of working or output profile. This means that the shadow range is expanded and highlights compressed to better match human perception”
https://rawpedia.rawtherapee.com/Exposure#Tone_Curves

Yes, but that refers to the “creative” tone curve; it has nothing to do with the TRC of the output profile. If you set the output profile to linear, it will be linear (and in fact, even in the tone curve tool, if you use an identity curve the gamma has no effect).

Sorry, it seems I misunderstood that.

Hmm? I’m seeing references to cd/m² (aka nits) when discussing HLG in Rec.2100 (page 7 of https://www.itu.int/dms_pubrec/itu-r/rec/bt/R-REC-BT.2100-2-201807-I!!PDF-E.pdf )

zimg’s API documentation describes npl in:

zimg also pretty clearly assumes that ARIB B67 has a peak luminance of 1000 nits:

HLG is a relative system with black at 0 and peak at 1.0. For different screen brightnesses, a transfer-curve adjustment is specified; see note 5f in ITU-R BT.2100.

The percentage values for reference levels are constant, as the signal doesn’t change; only the transfer curve in the monitor does.

Note 5f – For displays with nominal peak luminance (LW) greater than 1000 cd/m², or where the effective nominal peak luminance is reduced through the use of a contrast control, the system gamma value should be adjusted according to the formula below, and may be rounded to three significant digits:

γ (system gamma) = 1.2 + 0.42 log10(LW / 1000)

where LW is the nominal peak luminance in cd/m²
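The formula from note 5f is a one-liner; at the 1000-nit reference it gives exactly 1.2, and brighter displays get a slightly higher gamma:

```python
import math

def hlg_system_gamma(lw: float) -> float:
    """Extended system gamma per BT.2100 note 5f for nominal peak
    luminance Lw in cd/m^2; 1.2 at the 1000-nit reference."""
    return 1.2 + 0.42 * math.log10(lw / 1000.0)

print(round(hlg_system_gamma(1000), 3))  # → 1.2
print(round(hlg_system_gamma(2000), 3))  # → 1.326
```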

AFAIK the HLG adjustments only apply to the display (decode) side. Content should still be “mastered” and encoded for the reference 1000 nits, as one doesn’t know in advance what display will be used to view it.

No, HLG is scene referred. The levels represent the illumination of the sensor, not a master display.

You can master on any HLG display and it will appear perceptually similar on any other HLG display (provided that they correctly implement the system gamma listed above). So diffuse white appears at 75% signal level, which will be displayed at different brightnesses depending on the monitor’s peak brightness.
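As a sketch of that relativity, the BT.2100 HLG OETF maps relative scene light in [0, 1] to signal level with no nits anywhere in the formula; inverting it, the 75% signal level for diffuse white corresponds to roughly 26.5% relative scene light:

```python
import math

# HLG OETF constants from BT.2100
A = 0.17883277
B = 0.28466892  # = 1 - 4*A
C = 0.55991073  # = 0.5 - A*ln(4*A)

def hlg_oetf(e: float) -> float:
    """Map relative scene linear light E in [0, 1] to HLG signal E'."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

print(round(hlg_oetf(1.0 / 12.0), 3))  # → 0.5
print(round(hlg_oetf(0.265), 3))       # → 0.75 (diffuse white)
```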

You can be pedantic all you want, but here are the facts:
- The standard specifies a reference peak level of 1000 nits for a typical/reference display.
- The standard specifies that it is the display’s job to alter its tone curve in a specific manner if it has a different peak.
- The standard specifies no absolute references for scene luminance, only luminance relative to an arbitrary number that has no units or any connection to reality.

Claim it’s scene referred all you want; for any practical real-world purpose, it is not, because your only real-world reference point is an assumption of 1000 nits for a reference/standard/typical display. Myself, I live in the real world. If you have a problem with how zscale operates, raise an issue on its GitHub project and take it up with sekrit-twc.

The European spec for reference monitors lists multiple minimum peak brightnesses for different locations and uses, and only minimums, not fixed values. There’s no one fixed monitor brightness for HLG. https://tech.ebu.ch/docs/tech/tech3320.pdf

The point of HLG is to provide a fallback for SDR displays: the HDR image would look normal on them. In terms of peak brightness, it depends on both your software and hardware specs and configurations.

Not really true: “acceptable,” but not “normal.” It’s kind of like how the FCC in the United States completely botched the definition of “minimum viewable” signal strength for the digital transition. It turned out that what the FCC defined as “minimum” was above what many families (including my own) considered “excellent” signal quality, with the end result being a MASSIVE broadcast area coverage loss during the transition.

HLG in fallback mode looks quite meh, ESPECIALLY if it’s Rec.2020 HLG (because the majority of displays in fallback mode don’t grok that it’s Rec.2020 gamut and assume it’s 709, which desaturates everything). Also, one can’t properly take advantage of an HDR display without the SDR fallback looking washed out.

This appears to be why a large amount of HDR content out there (basically anything that was mastered to be able to ever display on an SDR display, even if the delivery pipelines were separate) appears to use a filmlike S-curve anyway, leading to a lot of HDR content not looking that different from SDR except for “HDR showcase” content where SDR compatibility was never considered at any step of the content production process.

Yes, I should have used scare quotes.

Actually, we don’t need to encode anything. Remember how HLG works? We can emulate it with a Flexible curve with a single point at, for example, input 40 and output 87, and save the image. It will look just fine on your normal SDR monitor, but will also look better on any monitor if you turn its brightness up to 11. Other curves like that work too. Of course we don’t get any metadata or other fancy stuff, but I’d say it still looks pretty good; you can try it out yourself.

HLG is scene referred (at least in the same way that SDR video is). Pedantic, IMHO, is arguing that everything is display referred because everyone makes grading decisions while looking at a display.

Btw, who has a problem with how zscale operates? You can see in your link to gamma.cpp that peak_luminance isn’t used when scene_referred is true. That zscale doesn’t expose the scene referred conversions doesn’t seem like evidence of anything to me.

I actually have discussed exposing it with sekrit-twc. Scene referred HLG is explicitly forbidden right now.

[2018-02-23 11:06:11] <ifb> setting nominal_luminance=203 puts SDR diffuse white at 75% HLG like it should, so I'm pretty sure it's correct
[2018-02-23 11:06:33] <Arwen> nominal_peak_luminance specifies the physical luminance of 100% SDR, which is mapped to the same light level in HDR
[2018-02-23 11:06:47] <Arwen> HLG is always 1000 cd/m^2 in z.lib
[2018-02-23 11:09:13] <Arwen> The linear SDR display light is scaled to ensure that 100% of the SDR signal is mapped to the HLG reference level of 75% HLG.
[2018-02-23 11:09:30] <Arwen> this isn't done unless you manipulate nominal_peak_luminance to make it happen
[2018-02-23 11:11:16] <ifb> so, for scene-referred (figure 42) , it's on me to do whatever OOTF adjustment I want after i've converted to linear?
[2018-02-23 11:11:34] <Arwen> scene-referred conversions are actually implemented internally, but not exposed
[2018-02-23 11:11:39] <Arwen> because nobody's asked for it
[2018-02-23 11:11:53] <ifb> and setting nominal_luminance is equivalent to messing with the "scaling" box in the diagram
[2018-02-23 11:12:03] <Arwen> yes, more or less
[2018-02-23 11:12:24] <Arwen> it is used to multiply the linear light value after applying any transfer functions
[2018-02-23 11:13:58] <ifb> Arwen:  so can I have scene-referred conversions? :)
[2018-02-23 11:14:17] <Arwen> maybe?
[2018-02-23 11:14:23] <Arwen> scene referred is usually a mistake
[2018-02-23 11:14:45] <ifb> not when I'm actually mixing SDR footage with native HLG
[2018-02-23 11:16:29] <Arwen> I can add it

For my use case, I ended up just using a LUT because it matched hardware converters better than commenting out the assert in zscale and doing a mathematically-correct conversion. :man_shrugging:

If anyone is curious, BBC has guidelines covering that 1% of the time when you might need to change between 709 and HLG “looks.” I’m sure sekrit-twc could be convinced to allow scene referred HLG in zscale if that 1% estimate is wrong. :slight_smile: