Experimenting with HLG and HDR10 for still images

Thanks, good to know. If you don’t have an HDR monitor, you can tell mpv what your display’s maximum brightness is (--target-peak=) and then play the video with the monitor’s brightness set to the highest level.
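For example (a sketch; the file name is made up and 250 nits is just a placeholder for whatever your panel actually reaches):

mpv --target-peak=250 --tone-mapping=hable hdr_video.mp4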

The problem is generating specific metadata so that you can view this file on any HDR monitor or just by using mpv (see above)

Yes, hardware HDR is possible with many file formats, but for stills you don’t have much choice right now: AV1 is still in development, HEIC is not free, etc. It would be cool if you could export HDR-tagged photos from your favorite raw converter in one click. If you know exactly how to export the photo from Darktable and get all the necessary tags, please tell us.

That is a system-wide solution: they can show HDR even in windowed mode. It seems we will only get this on Wayland, so there is also a lot of work to do.

Edit: formatting
Edit 2: tried HLG and PQ in Darktable. There is only a minor difference with HLG, and PQ is too dark.

1 Like

In theory yes, but in practice, very few content delivery pipelines allow you to actually get this to a display and have it properly interpreted by the display as HDR data.

It’s a no-go on Linux (or at least it was a year or two ago): the ability to set the appropriate HDMI metadata to trigger a display to enter HLG or HDR10 mode is a heavy work in progress at Intel and not supported by any graphics driver stack.

Most smart TVs have zero support for HEIF stills, and I have yet to see any solution for displaying these on an HDR-capable display properly.

So far encoding a still to H.265 video with appropriate metadata is the only solution that will work for more than a tiny fraction of a percentage of users.

I’ll try and dig up the commands I used 1-2 years ago, but I started with a 16-bit linear Rec.2020 TIFF and used ffmpeg to do the conversion to an HLG transfer curve and set the ATC SEI metadata properly.

Also, at least a year ago, there was no way to inject ATC SEI metadata for HLG with any hardware encoder without unmerged patches. The method for including HDR10 metadata with vaapi is to use the sei option with the ‘hdr’ flag, but this just tells the encoder to embed metadata that is somehow magically associated with the video already. From looking at the source code a year or two ago, there was NO way to set this metadata unless it happened to already be present in an MKV input stream.

Things might have improved since then.
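For comparison, the software-encoder path for HDR10 static metadata is more straightforward: libx265 will emit the mastering-display and content-light-level SEIs if you pass them in -x265-params. A rough sketch only (the file names, the 1000 nit peak and the MaxCLL/MaxFALL numbers are placeholders; the G/B/R/WP values are the commonly used P3-D65 mastering-display primaries in units of 0.00002, and the L values are in units of 0.0001 nits):

ffmpeg -loop 1 -r 24 -t 30 -i linear_rec2020.tif -vf zscale=tin=linear:npl=1000:t=smpte2084:m=bt2020nc,format=yuv420p10le -c:v libx265 -preset medium -crf 26 -x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):max-cll=1000,400" hdr10_still.mp4

Note this only writes the static metadata; the pixel data still has to actually be converted to the PQ curve, which is what the zscale step is doing here.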

Also the OP says they used webp/vp9, but it’s clear from their command lines that they’re encoding to H.265 using x265

Yes, I use x265 for the final output, but convert to webp before that. You can also try this command, but it doesn’t solve any problems:
ffmpeg -i 1.tif -pix_fmt yuv420p10le -color_primaries 9 -color_trc 16 -colorspace 9 -color_range 1 -profile:v 2 -vcodec libvpx-vp9 output.webm

I’ll dig up my old conversion scripts (hopefully tonight), but using webp as an intermediary was definitely not necessary last time I did this. I was able to go directly from linear TIFF to H.265 with appropriate metadata.

1 Like

The reason for my naive example going from *.hdr to *.jpg is to show a workflow that uses out-of-the-box tools (GIMP + G’MIC) and a common format (JPG). As noted in this thread, we haven’t settled on a universal or multi-platform way to view HDR files.

Yes, intermediaries are undesirable. We want it to be as fast and lossless as possible.

1 Like

I have no way of testing these properly on an HDR-capable display, but if someone is willing to experiment further, below are some AVIF HLG samples exported from dt.

lilienstein_1k_HLG.zip (1.6 MB)

They look ok to me in Windows and paint.NET…

I only did (XMP attached):

  • pull exposure by -1.36 EV (this is to match HLG 0.5 reference white to sRGB 0.5 reference white; see the worked numbers after this list)
  • change output profile to HLG Rec2020
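One way to see where the -1.36 figure comes from (my reconstruction, assuming the standard sRGB EOTF and the square-root segment of the HLG OETF):

$\left(\frac{0.5 + 0.055}{1.055}\right)^{2.4} \approx 0.214$ (linear light of sRGB signal 0.5)
$\frac{0.5^2}{3} = \frac{1}{12} \approx 0.083$ (scene-linear light of HLG signal 0.5)
$\log_2(0.083 / 0.214) \approx -1.36\ \text{EV}$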

lilienstein_1k.hdr.xmp (4.0 KB)

Then I did two exports:

  1. default where profile is left at “image settings” (this doesn’t put the color info into CICP, but inserts an ICC profile instead; note also that CICP should be “unspecified” 2/2/2 in this case instead of 0/0/2 that dt 3.4.1 currently sets - this is already fixed in master for the next release and I edited the bits manually in the zip above)
  2. set explicit export profile to HLG Rec2020 (this sets the CICP flags directly, no ICC profile included; see the note on CICP values after this list)
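A note on those CICP triples (my addition, not something from dt’s docs): for HLG Rec2020 content the expected values are presumably 9/18/9, i.e. BT.2020 primaries, ARIB STD-B67 (HLG) transfer and the BT.2020 non-constant-luminance matrix. If you encode an AVIF yourself with libavif’s avifenc, that triple can be set explicitly, e.g.:

avifenc --cicp 9/18/9 -d 10 -y 420 lilienstein_1k_hlg.png lilienstein_1k_hlg.avif

(avifenc, the --cicp option - called --nclx in older builds - and the PNG file name are assumptions here; darktable’s own exporter handles this internally as described above.)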


Btw, the forum SW doesn’t allow AVIF (nor HEIF?) upload…

1 Like

Maybe I did something wrong, but the webm was always incorrect when compared to the AVIF output. The _icc file from your zip doesn’t work, and the other file clips highlights on the SDR monitor, so that may be what we want.
Edit: I think Darktable just processes the color but doesn’t set any tags, or only does so for AVIF. That may explain why PQ was so dark.
Anyway, it is still not a method for today because AVIF is not mainstream yet. Good to know Darktable already implements it, however.
Edit 2: setting HLG output manually while exporting clips the highlights; exporting to webp doesn’t do that

Please elaborate on how you’re testing, for everyone’s benefit, and how you know your image viewer and OS do the “right thing”?

For example, both the ICC and CICP AVIF variants look identical to me in Windows’ paint.NET 4.2.15 on an SDR display, but I’m still not 100% confident whether I’m seeing the “correct” rendering, or what I should indeed be expecting to begin with. The midtones look almost the same as in the attached sRGB JPEG below (history discarded in dt, just a straight import -> export), but the highlights are not clipped as much and have a softer roll-off (I gave them the extra 1.36 EV after all), and I think that’s right; they should just look nicer and brighter on a real HDR display?

OTOH, in Chrome they look different to each other, and both “incorrect” to me (either too dark or too bright & clipped) when compared to the sRGB JPEG.

Edit: Here’s a screenshot of the Windows Explorer thumbnails to show what I mean above; I’m guessing the Windows AV1 plugin from the store behaves the same as paint.NET:

Yes, I’ve used Chromium to compare since Google is a part of AOM. Sorry if I caused some confusion

Yep, the same sequence (JPEG sRGB / AVIF HLG + CICP / AVIF HLG + ICC) in Windows Chrome:

Definitely some way to go for the AVIF ecosystem :frowning:

1 Like

The commands in the OP seem to just tag files as being HDR10 or HLG. No conversion is being done to actually take the input data, make it mean something, and then map it to an HDR-like signal.

Normally, a raw file contains ‘data’ which you give meaning during raw conversion: what is black, what is white, and how values are distributed between them, to put it very simply.

At the end, this is ‘tone mapped’ to display space, to the old ‘low dynamic range’. Which had a problem: the minimum (see it as the 0 value) had a clear meaning - it was ‘nothing’. But the other side (the maximum: 1.0, 255, however you want to see it) had no clear meaning. It just meant ‘as bright as your viewing device can go’, but what that means is going to be way different between devices or viewing methods.
If you had to assign a meaning, 100 nits for the maximum is sometimes named.

Now, an HDR standard basically gives more meaning to the maximum - how many nits it really means - plus a curve describing how the values between minimum and maximum are distributed, like ST 2084 or HLG.
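For concreteness, HLG’s curve (the OETF from ARIB STD-B67 / BT.2100) maps normalized scene-linear light $E \in [0, 1]$ to a signal $E'$ as

$E' = \sqrt{3E}$ for $0 \le E \le \tfrac{1}{12}$
$E' = a \ln(12E - b) + c$ for $\tfrac{1}{12} < E \le 1$

with $a = 0.17883277$, $b = 1 - 4a = 0.28466892$ and $c = 0.5 - a\ln(4a) \approx 0.55991073$; ST 2084 (PQ) instead encodes absolute luminance directly, up to 10000 nits.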

So you still have to do ‘raw editing’, but the range available in your final output is greatly increased. And you need to think about how bright things were in the scene, because they now need to be aligned to more clearly defined values.

Rec2020 is just a color space here. Writing things as Rec2020 doesn’t make it HDR… at all.

You could write the file as neutrally as you can to a 16-bit linear-gamma TIFF. Then, with ffmpeg’s zscale filter, interpret it as having a linear transfer function, but use the npl parameter to specify how many nits are assigned to your maximum value (you need to figure this out or experiment), then change the transfer characteristics to something HLG- or HDR10-compatible. Then encode it to an output format of your liking, preferably one that has HDR tagging.

Good reference HDR monitors are expensive and there are not many different types around.

If you are just experimenting, you could also do it the other way around: find some HLG or HDR10 images, and with ffmpeg or another tool convert them as if they were NOT HDR. Look at the pixel data, and compare it to the image viewed in a proper HDR viewer. That way you get some sort of idea of how it’s made, and you can see what the transfer function does compared to good old sRGB.
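If you want to try that reverse direction with ffmpeg, a commonly cited chain for tone-mapping a PQ/HLG file down to SDR BT.709 looks roughly like this (file names are placeholders, and this is a quick-look conversion rather than anything calibrated):

ffmpeg -i hdr_sample.mp4 -vf zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p sdr_preview.mp4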

Maybe a simpler way to put it: an ordinary image is most often ‘sRGB colors with the sRGB gamma curve’ (or Rec.709 in the video world).

An HDR10 image is an image ‘in Rec2020 color space with the ST 2084 gamma curve’. An HLG image is an image ‘in Rec2020 color space with the HLG gamma curve’.

By saving a file as Rec2020 and tagging it as ST 2084 or HLG, you are not changing the gamma curve, you are just saying ‘act like it is’.

If you do convert the gamma curve, you’ll notice that one of two things can happen. Either your source is interpreted as having a maximum of 100 nits or thereabouts, and your created HDR file uses only a fraction of the range available; or no assumption is made, your created file uses the whole range available, and everything becomes way too bright. You’ll have to do some tweaking here.

1 Like

Regarding the previous post’s mention of zscale for transfer curve transforms - I dug up my old ffmpeg commands and that’s exactly it.

ffmpeg -loop 1 -r 24 -t 30 -i DSC02708_03.tif -vf pad=1438:2160,zscale=tin=linear:t=arib-std-b67:m=bt2020nc,format=yuv420p10le,pad=3840:2160:1200:0:black -c:v libx265 -preset medium -crf 26 -x265-params "colorprim=bt2020:atc-sei=18:colormatrix=bt2020nc" antelope.mp4

Many TVs become unhappy at unusual/unexpected aspect ratios, which was the reason for the padding tricks. Note that the example above doesn’t use npl. I’m 90% certain you do need to tweak the npl option to get the desired results, as I believe the default is 100 - i.e. it maps the peak brightness in the TIF to 100 nits, which is NOT what you want. I haven’t had time to revisit these efforts beyond pulling up some really old command examples so far.

The input TIF was, at the time, a linear TIFF saved from darktable in the Rec.2020 colorspace. I’ll be revisiting this using RT instead sometime soon (maybe this weekend?). I can’t remember if I scaled the image to be 2160 pixels high prior to saving it out or not.

Also note that ffmpeg in most distributions is not built with zscale enabled, as the zimg library is not packaged in many distributions; you’ll need to build from source.
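If you do build it yourself, the relevant configure switch (assuming the zimg library is already installed) is --enable-libzimg, e.g.:

./configure --enable-gpl --enable-libx265 --enable-libzimg
make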

OK, here’s the TIF and resulting video from a more recent attempt.

Command line was:
ffmpeg -loop 1 -r 24 -t 30 -i DSC02236.tif -vf pad=3840:2160:302:0:black,zscale=tin=linear:t=arib-std-b67:npl=800:m=bt2020nc,format=yuv420p10le -c:v libx265 -preset medium -crf 26 -x265-params "colorprim=bt2020:atc-sei=18:colormatrix=bt2020nc" naturalbridge.mp4

DSC02236.tif (33.2 MB)

Pixls mangled the MP4, reuploading in a zipfile…

naturalbridge.zip (2.8 MB)

This MP4 file triggers HLG mode when cast to a Vizio P65-F1 using the LocalCast app on Android. Playback via USB should work too?

4 Likes

Yes, that was the point

That’s exactly what we are doing here

Thank you very much, it seems to work on my TV too. Now let me do some testing :slight_smile:

To further help with experimentation, there’s an interesting report on the ITU website: https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2408-3-2019-PDF-E.pdf

Tables 1 and 2 show the expected signal levels for reference (18% greycard, greyscale charts etc.) and non-reference items (various skin tones, grass etc.)

They should allow you to adjust the levels in your image to the levels expected in the standard.

2 Likes

Yep. My -1.36 EV compensation was just one “dumb” way to target 21.4% (0.5 of sRGB) for a quick demo; there are many other options depending on one’s needs and preferences.

Ok, it definitely works. @Entropy512 I’m not really familiar with ffmpeg, so maybe you can help here

  1. If we downscale the images in a raw converter rather than when running this command, we can remove pad=3840:2160:302:0:black, right?
  2. Do we need to specify npl for HLG?
  3. What should we change if we want lossless encode? --lossless for x265?

Thanks again

Also, if someone wants to try this, keep in mind that ART’s and RT’s linear curves are the gamma-corrected curves, so we may need the gamma 2.2 .rtc curve from this post (Linear gamma preview in RawTherapee - #7 by Morgan_Hardwood) or use Darktable.

Only if you crop to a 16:9 aspect ratio. If your camera shoots a typical 3:2 aspect ratio, you need to pad out the sides. In this case, I had already downscaled to 2160 pixels high in my converter, but that means the input image was only 3236 pixels wide, not 3840. So I pad it to 3840 wide with a black background, offsetting it by 302 pixels from the left to center it (302 = (3840-3236)/2).

zscale defaults to npl=100 - which maps the brightest parts of the image to only 100 nits and makes the whole image dark.

Don’t know, my concern is that lossless H.265 might make some display devices very unhappy.
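(For anyone who wants to try it regardless: with libx265 via ffmpeg, lossless mode is requested through the encoder parameters rather than a --lossless flag, e.g. by dropping -crf and adding lossless=1 to the -x265-params string of the command above. A sketch based on that command, with the output file name made up and the same caveat about device compatibility:)

ffmpeg -loop 1 -r 24 -t 30 -i DSC02236.tif -vf pad=3840:2160:302:0:black,zscale=tin=linear:t=arib-std-b67:npl=800:m=bt2020nc,format=yuv420p10le -c:v libx265 -preset medium -x265-params "lossless=1:colorprim=bt2020:atc-sei=18:colormatrix=bt2020nc" naturalbridge_lossless.mp4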

Or, to ensure that the exported data is linear, create a linear output profile with the Rec.2020 color space. Since RT doesn’t have support for HDR displays (and neither does Linux, last time I checked), HDR preview is a lost cause: your preview just plain will NOT be accurate compared to what is shown on your TV via the ffmpeg export process, since RT’s preview output will be SDR.

RTv4_Rec2020_Linear_g=1.0.icc (732 Bytes)

1 Like

Hi,

I’m not sure what you are referring to, can you elaborate? Thanks!