Experimenting with HLG and HDR10 for still images

You can easily find HDR movies (here I mean display/hardware HDR, i.e. HDR10/HDR10+, HLG or Dolby Vision), yet it is still unclear how to do the same for still images. Some cameras can output HDR JPEGs, but I wanted to start from a raw file.
Currently this is my workflow:

  1. Export the photo from ART with no curve and a Rec. 2020 output profile, as a 16-bit TIFF

  2. Run cwebp photo.tif -q 100 -o photo2.webp
    Without this the final image will be green

  3. Then encode to HLG or HDR10:

HLG:
ffmpeg -i photo2.webp -c:v libx265 -x265-params "hdr-opt=1:repeat-headers=1:pic-struct=0:colorprim=bt2020:colormatrix=bt2020nc:transfer=bt2020-10:atc-sei=18:lossless=1:crf=20" -t 10 result.mkv

HDR10:
ffmpeg -i photo2.webp -c:v libx265 -x265-params "hdr-opt=1:repeat-headers=1:pic-struct=0:colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:max-cll=1000:lossless=1:crf=20" -t 10 result.mkv

Can this be done with just one command? If so, is it possible to use lossless or near-lossless settings for it? Can the final file be packaged as an image rather than a video, and which format should I choose then (here I used webp/vp9 because that’s what YouTube uses)? Thanks


Hi,

I have no experience with HDR content (I do not have the hardware to display it, alas), but since you are using ART, you can set up a custom output format script to be able to generate your mkv files directly from the ART GUI.
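
For example, something along these lines (a rough, untested sketch — I’m assuming here that the script receives the saved TIFF as $1 and the requested output name as $2; adjust to the actual interface), wrapping the ffmpeg call from step 3 of the workflow above:

#!/bin/sh
# hypothetical ART custom output format script
# assumption: $1 = 16-bit TIFF written by ART, $2 = requested output file name
in_tif="$1"
out_mkv="${2%.*}.mkv"
ffmpeg -y -loop 1 -i "$in_tif" -t 10 \
  -c:v libx265 -pix_fmt yuv420p10le \
  -x265-params "colorprim=bt2020:colormatrix=bt2020nc:transfer=bt2020-10:atc-sei=18:repeat-headers=1:lossless=1" \
  "$out_mkv"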

HTH

@anon66851888 Welcome to the forum! After a few posts, you should be able to drag and drop files. It would be great if we could see your workflow in action.

HLG and PQ are simply transfer functions that convert your image.

Take this 22 EV image (at least according to the website), e.g., from https://hdrihaven.com/hdri/?h=lilienstein. Applying HLG to the [0,255] range gives us this output. (I had to do an extreme clip of the sun prior to applying HLG because it is so much brighter than the rest of the image. If I boost it without clipping, the rest of the image would still be very dark. If I push the curve to the extreme, everything would end up extremely flat.)


Edit: Oops, I forgot to answer your question. I basically used GIMP to convert the *.hdr source file to *.tif and then did the clipping and HLG curve application (I hid the actual math in the hlg_ command) on the image with one command as follows:

gmic lilienstein_1k.tif to_rgb cut 0,.1% normalize 0,255 hlg_ {1<<8} round o _out.jpg
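
For reference, the HLG curve being applied here is (some form of) the BT.2100 HLG OETF, which maps normalized linear scene light $E \in [0,1]$ to the encoded value $E'$:

$$
E' =
\begin{cases}
\sqrt{3E}, & 0 \le E \le \tfrac{1}{12},\\
a\,\ln(12E - b) + c, & \tfrac{1}{12} < E \le 1,
\end{cases}
\qquad a = 0.17883277,\ b = 1 - 4a,\ c = 0.5 - a\,\ln(4a).
$$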

I believe that more modern image formats such as HEIF can carry the correct signaling for HLG images.

A bit of googling also shows that both the ISO (who standardise JPEG)[1] and the ICC (who standardise ICC Profiles which can be included in image files)[2] have groups working on adding HDR signaling.

Would be good, once this is standardised, to get it working in software. Is there a way of adding it to a requested features list or similar for popular tools?

[1] ISO - ISO/AWI TS 22028-5 - Photography and graphic technology — Extended colour encodings for digital image storage, manipulation and interchange — Part 5: High dynamic range and wide colour gamut image encoding for still images (HDR/WCG)
[2] http://www.color.org/groups/hdr/index.xalter


AVIF also supports this, and darktable can already export directly to AVIF with these output profiles.


There are still some use cases where the end-user device may require a more traditional file format such as TIFF or PNG, so support is still very useful.

The ICC draft uses a similar approach to HEIF and AVIF - copying the signaling standardised at the ITU for baseband video - so hopefully it will be a fairly easy requirement to meet.

Those are then most likely still stuck with only 8-bit implementations of those formats, so not very conducive to distributing HDR content…

They would need to implement 10-bit or higher, but they’re still useful, as I know many people who use open-source toolchains to generate video inserts, for instance.

Another approach that seems to be working in the Apple ecosystem (although I don’t know how many devices this spans) is outlined in this post: EXR and MacOS
Yes, this is not encoding with an HLG or HDR10 curve, but rather using a flexible file format from the compositing and HDRI world. I don’t know if there are any image viewers on Windows 10 which can switch on the HDR display mode for this.

Thanks, good to know. If you don’t have an HDR monitor, you can tell mpv what its maximum brightness is (--target-peak=<nits>) and then play the video with the monitor’s brightness set to its highest level.
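
For example, something like this (300 nits is just an illustrative value — use whatever peak your panel can actually reach):

mpv --target-peak=300 result.mkv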

The problem is generating the right metadata so that you can view this file on any HDR monitor, or just with mpv (see above).

Yes, hardware HDR is possible with many file formats, but for stills you don’t have much choice right now. AV1 is still in development, HEIC is not free, etc. It would be cool if you could export HDR-tagged photos from your favorite raw converter in one click. If you know exactly how to export the photo from darktable and get all the necessary tags, please tell us.

That is a system-wide solution; they can show HDR even in windowed mode. It seems like we will only get this on Wayland, so there is also a lot of work to do there.

Edit: formatting
Edit 2: tried HLG and PQ in darktable. Only a minor difference with HLG, and PQ is too dark


In theory yes, but in practice, very few content delivery pipelines allow you to actually get this to a display and have it properly interpreted by the display as HDR data.

It’s a no-go on Linux (or at least it was a year or two ago): the ability to set the appropriate HDMI metadata that triggers a display to enter HLG or HDR10 mode was heavy work in progress at Intel and not supported by any graphics driver stack.

Most smart TVs have zero support for HEIF stills, and I have yet to see any solution for displaying these on an HDR-capable display properly.

So far encoding a still to H.265 video with appropriate metadata is the only solution that will work for more than a tiny fraction of a percentage of users.

I’ll try and dig up the commands I used 1-2 years ago, but I started with a 16 bit linear Rec 2020 TIFF and used ffmpeg to do the conversion to an HLG transfer curve and set the ATC SEI metadata properly.

Also, at least a year ago, there was no way to inject ATC SEI metadata for HLG with any hardware encoder without unmerged patches. The method for including HDR10 metadata with vaapi is to use the sei option with the ‘hdr’ flag/option, but this just tells the encoder to embed metadata that is somehow magically associated with the video already - from looking at the source code a year or two ago, there was NO way to set this metadata unless it happened to already be present in an MKV input stream.

Things might have improved since then.

Also the OP says they used webp/vp9, but it’s clear from their command lines that they’re encoding to H.265 using x265

Yes, I use x265 for the last output, but convert to webp before that. Also you can try this command, but it doesn’t solve any problems:
ffmpeg -i 1.tif -pix_fmt yuv420p10le -color_primaries 9 -color_trc 16 -colorspace 9 -color_range 1 -profile:v 2 -vcodec libvpx-vp9 output.webm

I’ll dig up my old conversion scripts (hopefully tonight), but using webp as an intermediary was definitely not necessary last time I did this. I was able to go directly from linear TIFF to H.265 with appropriate metadata.


The reason for my naive example going from *.hdr to *.jpg is to show a workflow that uses out-of-the-box tools (GIMP + G’MIC) and a common format (JPG). As noted in this thread, we haven’t settled on a universal or multiplatform way to view HDR files.

Yes, intermediaries are undesirable. We want it to be as fast and lossless as possible.


I have no way of testing these properly on an HDR capable display, but if someone is willing to experiment further, below are some AVIF HLG samples exported from dt.

lilienstein_1k_HLG.zip (1.6 MB)

They look ok to me in Windows and paint.NET…

I only did (XMP attached):

  • pull exposure by -1.36 EV (this is to match HLG 0.5 reference white to sRGB 0.5 reference white)
  • change output profile to HLG Rec2020

lilienstein_1k.hdr.xmp (4.0 KB)

Then I did two exports:

  1. default where profile is left at “image settings” (this doesn’t put the color info into CICP, but inserts an ICC profile instead; note also that CICP should be “unspecified” 2/2/2 in this case instead of 0/0/2 that dt 3.4.1 currently sets - this is already fixed in master for the next release and I edited the bits manually in the zip above)
  2. set explicit export profile to HLG Rec2020 (this sets the CICP flags directly, no ICC profile included)
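
If anyone wants to verify what the files actually signal, one quick check (assuming your ffprobe build can decode AVIF; the file name is a placeholder) would be something like:

ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_primaries,color_transfer,color_space sample_hlg.avif

The CICP variant should report bt2020 / arib-std-b67 / bt2020nc there, while the ICC-only variant will likely show these fields as unknown.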


Btw, the forum SW doesn’t allow AVIF (nor HEIF?) upload…


Maybe I did something wrong, but the webm was always incorrect compared to the AVIF output. The _icc file from your zip doesn’t work, and the other file clips the highlights on the SDR monitor, so that may be what we want.
Edit: I think darktable just processes the color but doesn’t set any tags, or does so only for AVIF. That may explain why PQ was so dark.
Anyway, it is still not a method for today because AVIF is not mainstream yet. Good to know darktable already implements it, however.
Edit 2: setting the HLG output profile manually while exporting clips the highlights; setting webp doesn’t do that

Please elaborate on how you’re testing, for everyone’s benefit, and how you know your image viewer and OS do the “right thing”?

For example, both the ICC and CICP AVIF variants look identical to me in Windows’ paint.NET 4.2.15 on an SDR display, but I’m still not 100% confident that I’m seeing the “correct” rendering, or what I should even be expecting to begin with. The midtones look almost the same as in the attached sRGB JPEG below (history discarded in dt, just straight import->export), but the highlights are not clipped as much and have a softer roll-off (I gave them the extra 1.36 EV after all), and I think that’s right: they should just look nicer & brighter on a real HDR display?

OTOH, in Chrome they look different to each other, and both “incorrect” to me (either too dark or too bright & clipped) when compared to the sRGB JPEG.

Edit: Here’s a screenshot of the Windows Explorer thumbnails to show what I mean above; I’m guessing the Windows AV1 plugin from the store behaves the same as paint.NET:

Yes, I’ve used Chromium to compare since Google is a part of AOM. Sorry if I caused some confusion

Yep, the same sequence (JPEG sRGB / AVIF HLG + CICP / AVIF HLG + ICC) in Windows Chrome:

Definitely some way to go for the AVIF ecosystem :frowning:


The commands in the OP seem to just tag files as being HDR10 or HLG. No conversion is being done to actually take the input data, make it mean something, and then map it to an HDR-like signal.

Normally, a raw file contains ‘data’ which you give meaning to during raw conversion: what is black, what is white, and how everything is distributed between them, to put it very simply.

At the end, this is ‘tone mapped’ to display space, to the old ‘low dynamic range’. Which had a problem: the minimum (see it as the 0 value) had a clear meaning, it was ‘nothing’, but the other side (the maximum, 1.0, 255, however you want to see it) had no clear meaning. It just meant ‘as bright as your viewing device can do’, and what that means is going to be very different between devices or viewing methods.
If you had to assign a meaning, 100 nits for the maximum is sometimes quoted.

Now, an HDR standard basically defines what the maximum really means in nits, plus a curve for how the values between minimum and maximum are distributed, like ST 2084 or HLG.

So you still have to do ‘raw editing’, but the range available in your final output is greatly increased. And you need to think about how bright things were in the scene, because they now need to be aligned to better-defined real-world values.

Rec. 2020 is just a color space here. Writing things as Rec. 2020 doesn’t make them HDR… at all.

You could write the file as neutrally as you can to a 16-bit linear-gamma TIFF. Then, with ffmpeg’s zscale filter, interpret it as linear transfer, using the npl parameter to specify how many nits are assigned to your maximum value (you need to figure this out or experiment), and change the transfer characteristics to something HLG- or HDR10-compatible. Then encode it to an output format of your liking, preferably one that has HDR tagging.
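
As a rough, untested sketch of that idea (file names and the npl value are placeholders to experiment with; you could also keep the transfer=bt2020-10 plus atc-sei=18 signalling from the first post instead of transfer=arib-std-b67):

ffmpeg -i photo_linear_rec2020.tif \
  -vf "zscale=transferin=linear:primariesin=bt2020:npl=1000:transfer=arib-std-b67:primaries=bt2020:matrix=bt2020nc,format=yuv420p10le" \
  -c:v libx265 \
  -x265-params "colorprim=bt2020:colormatrix=bt2020nc:transfer=arib-std-b67:repeat-headers=1:lossless=1" \
  result_hlg.mkv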

Good reference hdr monitors are expensive and not many different types around.

If you are just experimenting, you could also do it the other way around: find some HLG or HDR10 images, and with ffmpeg or another tool convert them as if they were NOT HDR. Look at the pixel data and compare it to the image viewed in a proper HDR viewer. That way you get some sort of idea of how it’s made, and you see what the transfer function does compared to good old sRGB.
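
For example (untested sketch, with hdr_sample.mkv as a placeholder for whatever HDR10 clip you find), compare a naive decode that ignores the transfer function with a properly tone-mapped conversion:

ffmpeg -i hdr_sample.mkv -frames:v 1 naive.png
ffmpeg -i hdr_sample.mkv -frames:v 1 -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" tonemapped.png

The naive version will typically look flat and washed out, because the PQ code values are shown as if they were ordinary gamma-encoded SDR.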

Maybe a simpler way to put it: an ordinary image is most often ‘sRGB colors with the sRGB gamma curve’ (or Rec. 709 in the video world).

An HDR10 image is an image ‘in Rec. 2020 color space with the ST 2084 gamma curve’. An HLG image is an image ‘in Rec. 2020 color space with the HLG gamma curve’.

By saving a file as Rec. 2020 and tagging it as ST 2084 or HLG, you are not changing the gamma curve, just saying ‘act like it is’.

If you do convert the gamma curve, you’ll notice that one of two things can happen. Either your source is interpreted as having a maximum of 100 nits or thereabouts, and your created HDR file uses only a fraction of the range available; or no assumption is made, and your created file uses the whole range available, making everything way too bright. You’ll have to do some tweaking here.
