The point of HLG is to provide a fallback for SDR displays: the HDR image would still look normal on them. In terms of peak brightness, it depends on both your software and your hardware specs and configuration.
Not really true - "acceptable" but not "normal". It's kind of like how the FCC in the United States completely botched the definition of "minimum viewable" signal strength for the digital transition. It turned out that what the FCC defined as "minimum" was above what many families (including my own) considered "excellent" signal quality, with the end result being a MASSIVE loss of broadcast coverage area during the transition.
HLG in fallback mode looks quite meh, ESPECIALLY if it's Rec.2020 HLG (because the majority of displays in fallback mode don't grok that it's Rec.2020 gamut and assume it's 709, which desaturates everything). Also, one can't properly take advantage of an HDR display without the SDR fallback ending up looking washed out.
This appears to be why so much HDR content out there (basically anything that was mastered to ever be shown on an SDR display, even if the delivery pipelines were separate) uses a film-like S-curve anyway, so a lot of HDR content doesn't look that different from SDR, except for "HDR showcase" content where SDR compatibility was never considered at any step of the production process.
Yes, I should have used scare quotes.
Actually, we don't need to encode anything. Remember how HLG works? We can emulate it with a Flexible curve with a single point at, for example, input 40 and output 87, and then save the image. It will look just fine on your normal SDR monitor, but it will also look better on any monitor if you turn its brightness up to 11. Other curves like that one work too. Of course we don't get any metadata or other fancy stuff, but I'd say it still looks pretty good; you can try it out yourself.
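If anyone wants to see roughly what that single-point curve does numerically, here is a minimal sketch of the idea (my own, not the editor's exact spline; the file names, the 0-100 scale, and the choice of a monotone PCHIP curve are all assumptions):

```python
# Rough emulation of a one-point "flexible" tone curve (40 -> 87 on a 0-100 scale).
# Illustration only; a real editor may use a different spline and working space.
import numpy as np
from scipy.interpolate import PchipInterpolator
from PIL import Image

# Monotone curve through the endpoints plus the single user point.
curve = PchipInterpolator([0.0, 40.0, 100.0], [0.0, 87.0, 100.0])

img = np.asarray(Image.open("input.jpg"), dtype=np.float64)   # 8-bit RGB source
out = curve(img / 255.0 * 100.0) / 100.0 * 255.0              # apply curve on 0-100 scale
Image.fromarray(np.clip(out, 0, 255).astype(np.uint8)).save("hlg_like.jpg")
```

The effect is the same kind of shadow/midtone lift HLG relies on: shadows get pushed up hard while highlights stay put, so the image survives a display running at well above SDR reference brightness.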
HLG is scene referred (at least in the same way that SDR video is). Pedantic, IMHO, is arguing that everything is display referred because everyone makes grading decisions while looking at a display.
Btw, who has a problem with how zscale operates? You can see in your link to gamma.cpp that peak_luminance isn't used when scene_referred is true. That zscale doesn't expose the scene referred conversions doesn't seem like evidence of anything to me.
I actually have discussed exposing it with sekrit-twc. Scene referred HLG is explicitly forbidden right now.
[2018-02-23 11:06:11] <ifb> setting nominal_luminance=203 puts SDR diffuse white at 75% HLG like it should, so I'm pretty sure it's correct
[2018-02-23 11:06:33] <Arwen> nominal_peak_luminance specifies the physical luminance of 100% SDR, which is mapped to the same light level in HDR
[2018-02-23 11:06:47] <Arwen> HLG is always 1000 cd/m^2 in z.lib
[2018-02-23 11:09:13] <Arwen> The linear SDR display light is scaled to ensure that 100% of the SDR signal is mapped to the HLG reference level 75 %HLG.
[2018-02-23 11:09:30] <Arwen> this isn't done unless you manipulate nominal_peak_luminance to make it happen
[2018-02-23 11:11:16] <ifb> so, for scene-referred (figure 42) , it's on me to do whatever OOTF adjustment I want after i've converted to linear?
[2018-02-23 11:11:34] <Arwen> scene-referred conversions are actually implemented internally, but not exposed
[2018-02-23 11:11:39] <Arwen> because nobody's asked for it
[2018-02-23 11:11:53] <ifb> and setting nominal_luminance is equivalent to messing with the "scaling" box in the diagram
[2018-02-23 11:12:03] <Arwen> yes, more or less
[2018-02-23 11:12:24] <Arwen> it is used to multiply the linear light value after applying any transfer functions
[2018-02-23 11:13:58] <ifb> Arwen: so can I have scene-referred conversions? :)
[2018-02-23 11:14:17] <Arwen> maybe?
[2018-02-23 11:14:23] <Arwen> scene referred is usually a mistake
[2018-02-23 11:14:45] <ifb> not when I'm actually mixing SDR footage with native HLG
[2018-02-23 11:16:29] <Arwen> I can add it
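For anyone following along, the 203 nit / 75 %HLG claim in that log is easy to sanity-check from the BT.2100 formulas. A quick sketch of the arithmetic (mine, not z.lib code), assuming a 1000 cd/m² nominal peak and the 1.2 reference system gamma:

```python
# Sanity check: on a 1000 cd/m^2 HLG display (system gamma 1.2), diffuse white
# at 203 cd/m^2 should land at roughly 75% of the HLG signal (BT.2100 / BT.2408).
import math

a, b, c = 0.17883277, 0.28466892, 0.55991073   # HLG OETF constants (BT.2100)

def hlg_oetf(e):
    """Scene linear (0-1) -> HLG signal (0-1)."""
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

peak = 1000.0          # nominal peak luminance in cd/m^2
gamma = 1.2            # HLG reference system gamma at 1000 cd/m^2
display_nits = 203.0   # BT.2408 reference (diffuse) white

# Inverse OOTF: display light -> scene light, then OETF: scene light -> signal.
scene = (display_nits / peak) ** (1 / gamma)
print(hlg_oetf(scene))   # prints ~0.75
```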
For my use case, I ended up just using a LUT because it matched hardware converters better than commenting out the assert in zscale and doing a mathematically-correct conversion.
If anyone is curious, BBC has guidelines covering that 1% of the time when you might need to change between 709 and HLG "looks." I'm sure sekrit-twc could be convinced to allow scene referred HLG in zscale if that 1% estimate is wrong.
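For context, the scene-referred conversion I wanted is roughly the "direct mapping" BT.2408 describes: undo the 709 OETF, scale scene light so 100% SDR lands on HLG reference white, then apply the HLG OETF. A minimal per-component sketch of that idea (mine; it ignores the 709-to-2020 gamut conversion a real pipeline needs and does not claim to match zscale or the BBC LUTs exactly):

```python
# Scene-light ("direct") SDR -> HLG signal mapping, in the spirit of BT.2408.
# Per-component, no gamut conversion, no sub-black/super-white handling.
import math

def bt709_inverse_oetf(v):
    """BT.709 OETF inverse: camera signal (0-1) -> scene linear (0-1)."""
    return v / 4.5 if v < 0.081 else ((v + 0.099) / 1.099) ** (1 / 0.45)

def hlg_oetf(e):
    """BT.2100 HLG OETF: scene linear (0-1) -> HLG signal (0-1)."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

# Scale scene light so that 100% SDR ends up at HLG reference white (~75 %HLG),
# i.e. the scene-linear level a 1000 nit HLG display would show at 203 nits.
SCALE = (203.0 / 1000.0) ** (1 / 1.2)

def sdr_to_hlg(v):
    return hlg_oetf(bt709_inverse_oetf(v) * SCALE)

print(sdr_to_hlg(1.0))   # ~0.75: SDR white sits at HLG reference white
```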
So the TL;DR is - your pedantry regarding IT HAS TO BE SCENE REFERRED broke your workflow, while my non-pedantic workflow works just fine.
zimg has not had any commits to master since Jan 11, and the command I posted was a month after that and it works just fine.
I needed scene referred conversions which weren't available (still aren't). You don't, so it works fine. I'm not arguing anything is broken or that what you're doing is wrong. It's simply a different use case. That's not pedantry. The output/result is different. Me punting and using a LUT has more to do with having to match a hardware device than any shortcomings with zscale. It still irks me some, but it's out of my control.
I'm simply confused by your apparent assertion that zscale uses a 1000 nit display for HLG operations and this means that HLG is display referred in the real world. I brought up the scene referred stuff just as a counter example. zscale supports both conversion approaches and there are use cases for both in the real world. Sorry if I'm putting words in your mouth.
TL;DR, everything @st599 has posted is correct.
For reference, a nominal peak luminance of 1000 nits for HLG is valid per the ITU standards:
A few discussions of the real-world reasons why specifying a nominal peak luminance is necessary for HLG can be found in Re: HDR in 2023 Photography: Photographic Science and Technology Forum: Digital Photography Review (which references BT.2408).