Mastering workflow for linear images for HDR displays

I am a bit baffled by this. This is easy to measure and is measured by people testing displays. For the subjective quality…aka what you think of this, any electronics store can be enlightening.

True, but again, this can be measured: static on/off contrast, simultaneous contrast (ANSI contrast), and I wouldn't be surprised if people now also measure short-term peak contrast for HDR displays. It exists for a reason, not as a marketing gimmick.

You lost me there a bit. :slight_smile: All those things happen already. It seems photographers just aren't aware of this. HDR10 displays, the bog-standard ones, not the ones with the fancy dynamic metadata for real-time gamut remapping(!) (a.k.a. HDR10+ and DolbyVision), look mindblowingly good. I actually want to know how much better an HDR10 mastering display looks…but even a plain consumer HDR10 display is simply amazing.

That is probably not the full story, since AFAIK people also prefer pictures with higher contrast even when shown on the same display (i.e. when the contrast slider is pushed just a bit further; of course it can be pushed too far), so it is not only due to higher dynamic range.

But you won't see the impact of dynamic range, since the source image can't have a wider dynamic range; you just stretch its dynamic range over a larger contrast area, probably ruining the artist's intent. Sure, this might look good to some people, but it isn't what you should be looking for when color accuracy is important.

Without knowing exactly what the consumer screens do (they are still a black box, color-management-wise), I wouldn't be surprised to learn that those screens actually increase contrast, and thus that a good mastering screen will actually look "worse" because it is much more color accurate. Also see the Twitter thread I linked to earlier.

Honestly, me too… I didn't think that having more contrast (all else being equal) could be considered anything but a good thing. But maybe I misunderstood the other posts?


Yes. Let's say, for example, that I have a poor-contrast (SDR) screen on which I edit my pictures while you have a nice high-contrast (SDR) screen. Any pictures I send you will look different and in some cases can be totally ruined (vice versa is also true, and might even be worse). This is because in the SDR world there is no encoding for how bright something should actually be. This is one theoretical benefit of HDR encoding, since the Electro-Optical Transfer Function (EOTF for short) actually maps digital code values to absolute light output (e.g. with the PQ EOTF, 1023 (10-bit full range), 1.0 (float) or 940 (10-bit legal range) all mean 10,000 nits by definition).
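To make that concrete, here's a minimal sketch of the ST 2084 (PQ) EOTF - Python is just my choice of illustration language here, not something pulled from any particular tool - showing how a normalized code value maps to an absolute luminance:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF: normalized signal -> nits.
# Constants are the ones defined in ST 2084 / BT.2100.
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ signal (0.0 .. 1.0) to absolute luminance in cd/m^2."""
    e = signal ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

print(pq_eotf(1.0))         # 10000.0 nits: the top of the PQ range, by definition
print(pq_eotf(769 / 1023))  # ~1000 nits, roughly where good consumer peaks sit today
```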


This is a huge, not-so-theoretical benefit over e.g. sRGB, actually. It allows vastly different display tech to be much, much closer together in appearance than in sRGB, even when the display has to do internal black-box shenanigans once out-of-gamut or out-of-luminance content is reached. So right there, 'HDR', to throw that term around as loosely as possible, is better.

(I hope I am not being misunderstood: out-of-gamut and out-of-luminance content is still a problem then, as it could look horrible if the implementation is wrong. Everything else is better defined from the get-go, though.)

I agree it is an amazing benefit. I call it theoretical since it assumes that people will actually follow the spec, and we all know how companies (especially in the consumer space) like to skirt those lines a bit (understatement). There are a lot of in-theory-amazing specs out there that, due to being completely abused, are not worth the paper they are written on; only time will tell whether the HDR specs will be among them or not.


But that contrast slider manipulates screen space luminance. I am not sure how this is not creating a higher dynamic range presentation…it is.

Well, as a photographer I sometimes have an intent for how things should look. If my display is too low contrast in comparison to whatever standard, color accuracy is not helping me. I want to know, and decide, that the luminances I assign to certain elements in a scene look how I want them to look. The contrast slider is a poor man's mastering tool for that. My camera easily shoots 3 or 4 stops more than I can display, so I have to employ DR compression to get everything into what I want to show. Now there is better display tech solving this to a certain extent. Gaming and movies are already using that tech; photographers are very late to the game.

I get the point that you're making. But as you wrote above, the standard already allows for much less of this black-box fudging in implementations. I read Dan Olson's thread and I partly agree. What his thread boils down to is this wonderfully snarky comment:

So standards are being developed for a display tech that isn’t ready, and thus compensation for the not-ready display tech is being baked into the standards, and I’m sure that will have zero downsides in the future

The HDR standards (unfortunately there's more than one) are the first display standards that allow display tech to improve without giving us garbage on the way to the 'perfect' display. Yes, there will be bogus implementations from companies slapping badges on stuff. This was way worse with SDR displays though: orange reds, yellow greens, 6-bit display drivers… If the standard forces everyone to up their game, I am all for it.

I have the feeling that once smartphone photography implements HDR editing of photos, real photographers will quickly want this too. Another death blow for traditional camera makers is waiting.

(Also I think it’s obvious that I want this technology to work, maybe a bit too much actually :smile: )

Not talking about the contrast slider on the screen, talking about the contrast slider in an image editing application.

Gaming is full CG, and movies have been moving to a fully scene-referred workflow for a while now (mostly to make it easier to composite CG elements); both of these things make it much easier to take advantage of HDR tech. Photography - at least as we know it now - having its roots in desktop publishing, has a much harder time taking advantage of this tech. On top of that, big game studios and movie studios can easily pay for the equipment needed to master properly, which at current price ranges (and the skill level needed to use it properly) is not readily available to most photographers.

Are you sure? Because as far as I am aware, the in-screen black box that has to map Rec.2020 PQ/HLG to whatever the screen can actually do has a lot of leeway to "make it look good" to the viewer. So this

Will definitely still be a thing

And this is, I think, the real difference between us: I also want the tech to work, I am just a lot more pessimistic about the prospects of it ever working as intended when it comes to creating our own HDR content on consumer devices.


@PhotoPhysicsGuy Sorry for the hyperbole earlier. It can be hard to detect. :stuck_out_tongue:


Same here. The contrast slider in an image editing app ultimately decides what luminances (luminance differences anchored around whatever middle grey is) will actually be displayed. If implemented crudely it does not process in linear light but in something else, and is thus potentially wrong, but that is another discussion.
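(As a toy illustration of the difference, and only that - my own sketch, not any particular app's code - a contrast pivot applied in linear light around middle grey could look like this:)

```python
import numpy as np

def linear_contrast(rgb_linear: np.ndarray, amount: float, grey: float = 0.18) -> np.ndarray:
    """Toy contrast slider in linear light: expand or compress each value's
    ratio to middle grey; grey itself stays put, highlights and shadows move."""
    return grey * (rgb_linear / grey) ** amount  # amount > 1 raises contrast

# scene-referred linear samples (values above 1.0 are allowed)
samples = np.array([0.02, 0.18, 0.90, 2.50])
print(linear_contrast(samples, 1.2))
```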

Since I started dabbling in RAW photography (ca. 2007), manipulating linear-light RAW files has been around. darktable has made large pushes towards an all-out linear-light pipeline (scene-referred rather than display-referred, if I am not mistaken). How is it easier for games and movies to take advantage of HDR tech? I am not clear on what you mean here. Photographers regularly use exposure bracketing to make *.exr's to capture high-dynamic-range scenes, which they then have to master on SDR displays or for print. Clearly the only part missing from the photographer's pipeline is displaying HDR content without luminance mapping down to SDR. I admit that with competing HDR standards, the lack of good-enough mastering displays, and photographers not even being clear about the advantages of an HDR-displayed image, this will still take some time. Again, it's a mystery to me why Krita is further along in this respect.
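(For anyone who hasn't done it, the bracket-merge step is conceptually simple; here's a hand-rolled toy sketch - not what any specific tool does - of fusing linear frames into one high-dynamic-range image:)

```python
import numpy as np

def merge_brackets(frames, exposure_factors):
    """Toy merge of bracketed exposures into one linear HDR image:
    scale each linear frame back to a common exposure, then average with
    weights that distrust near-clipped and near-black pixels."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for frame, factor in zip(frames, exposure_factors):
        weight = np.clip(1.0 - np.abs(frame - 0.5) * 2.0, 0.02, 1.0)
        acc += weight * (frame / factor)   # bring each frame to a common exposure
        weight_sum += weight
    return acc / weight_sum                # ready to write out as e.g. an .exr
```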

The only thing needed to check this is a color probe, to see how accurate a display is for color and brightness. That should give a nice idea about how out-of-display-gamut colors are mapped back into the display gamut. Same with luminance. Whatever you need to assess an sRGB display calibration mostly works for HDR displays too. On top of that, luminance relations are much better defined for HDR displays. There is some more to it, as temporal and local peak brightnesses can be higher, but that's not voodoo. So yeah, I can buy a completely uncalibrated sRGB display with the wrong gamma, 6-bit dithering and a smaller-than-sRGB gamut. With an HDR display, some of those things are a no-go from the start and can then easily be tested (although probably not corrected). Again, I am sure there will be crap gamut remapping algorithms and crap peak-brightness clipping and such. But I would already be quite surprised if deltaE numbers for in-display-gamut colors were as far off as they are for some cheap sRGB monitors nowadays.
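(For reference, "deltaE" here is just the distance between the color you asked the display for and what the probe measures; in its simplest CIE76 form it's a one-liner - sketch below, with made-up patch numbers:)

```python
def delta_e_76(lab_reference, lab_measured):
    """CIE76 delta-E: Euclidean distance in CIELAB between the patch you
    sent to the display and what the probe actually read back."""
    return sum((r - m) ** 2 for r, m in zip(lab_reference, lab_measured)) ** 0.5

# hypothetical patch: around 1 is barely visible, above ~3 is clearly off
print(delta_e_76((50.0, 20.0, -10.0), (50.5, 22.0, -9.0)))   # ~2.3
```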

There is a nice comparison by a colorist who did some screen tests with the Apple super-duper display vs. a Sony HDR10 mastering OLED, I think (spoiler: of course it cannot compete). I'll see if I can find it. The pessimist in you will love it :smile:

EDIT: there you go! :smile:
https://youtu.be/rtd7UzLJHrU

According to this reviewer, Apple's new monitor is for lawyers and YouTubers, so my hyperbolic statements about HDR marketing weren't so dramatic after all. :stuck_out_tongue_closed_eyes:


In the most-used commercial application (Lightroom), linear light is still not a thing; in this regard the OSS applications are actually somewhat ahead.

Not only have they been using scene-linear rendering for some time now (games for physics-based lighting, movies for the compositing benefits), they are also mainly output to screens. In contrast, photography has focused (and for some people still focuses) on making prints (either dedicated photo prints or in books and magazines), which is a notoriously low-dynamic-range medium (lower than even a lot of SDR screens!). So until fairly recently there was no interest in actually outputting HDR, and even now it is mostly a curiosity (especially for people interested in selling prints).

Except that an HDR screen is allowed to use different algorithms depending on the metadata (sure, HDR10 doesn't have dynamic metadata, but it still has some), so for the above to be valid that metadata must always be set the same, for one. Secondly, I think you are way too optimistic about these things: if there are cheap shortcuts that still look good to 90% of the people, those will be taken, and the only way to be sure of what you get would be to bypass the black box entirely.
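(To be fair about what "some" metadata means for HDR10: it's the static mastering-display and content-light-level values, nothing per-scene. A rough sketch of the fields, with illustrative numbers of my own choosing:)

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Static metadata carried by an HDR10 stream: SMPTE ST 2086 mastering
    display color volume plus the CTA-861.3 content light levels."""
    red_xy: tuple         # mastering display primaries as CIE xy
    green_xy: tuple
    blue_xy: tuple
    white_xy: tuple
    min_luminance_nits: float
    max_luminance_nits: float
    max_cll_nits: float   # brightest single pixel anywhere in the content
    max_fall_nits: float  # brightest frame-average light level

# e.g. content graded on a 1000-nit P3-D65 mastering display
meta = HDR10StaticMetadata(
    red_xy=(0.680, 0.320), green_xy=(0.265, 0.690), blue_xy=(0.150, 0.060),
    white_xy=(0.3127, 0.3290),
    min_luminance_nits=0.0001, max_luminance_nits=1000.0,
    max_cll_nits=1000.0, max_fall_nits=400.0,
)
```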

Oof, I hadn't realized a 5k screen would be this bad. Sure, it has nice sustained peak brightness, but the blotchiness of the dark scenes was just awful (ok, that is in comparison to the reference monitor; it would probably do pretty well in comparison to other (consumer) HDR screens).


Without getting into the in-depth discussion, here’s also another recent useful HDR display overview for interested photographers, with the same conclusion that we’re sadly not there yet for stills…


The following ITU HDR production report is also very insightful from a technical perspective. For example, one take-away for me was that the content creator is no longer in control of (or shouldn't worry about) the EETF (tone mapping), since each HDR display should implement its own according to its actual output range capabilities (this of course assumes the HDR display vendor "does the right thing")…

A very different paradigm from today's SDR displays, where tone mapping is in the hands of the creator and the SDR display is assumed to only do "gamma". It's almost like having to render your content for an imaginary target (e.g. an idealized PQ space), and hope the display vendor does a decent job of reproducing it. If your mastering HDR display happens not to be very capable, so you can't visualize that imaginary HDR space well enough, then you seem to run a higher risk than when rendering for SDR, where the display ecosystem is settled… HLG in that case seems safer (the workflow is more similar to SDR), but provides less of a "wow" effect compared to PQ.
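(To illustrate what that display-side EETF conceptually does - this is a toy roll-off of my own making, not the actual BT.2390 spline - imagine content graded to 1000 nits landing on a 600-nit panel:)

```python
def toy_eetf(nits: float, display_peak: float = 600.0, knee: float = 0.75) -> float:
    """Toy stand-in for a display-side EETF: leave everything below a knee
    untouched, then roll the highlights smoothly into the panel's peak."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    headroom = display_peak - knee_nits
    excess = nits - knee_nits
    return knee_nits + headroom * (1.0 - 1.0 / (1.0 + excess / headroom))

for n in (100, 450, 600, 1000, 4000):
    print(f"{n:5d} nits in the grade -> {toy_eetf(n):6.1f} nits on this panel")
```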


Yes, this is very much a difference indeed. Viewing habits are not ‘locked’ to low DR prints.

In the analog days there were slideshows to be had though…something that really would benefit from 4k+ resolution and HDR. Positives never had the DR that negatives could capture. I’ve had the luck to see a 6x6 Velvia 50 projection…Ooof.

At the same time, photographers know and use tonemapping on a daily basis…and the hassle that it brings to the table.

Having a display that does less tone mapping than your average sRGB display seems SO worthwhile to me.

Am I though? The shortcuts would still exceed anything you know from your current SDR screen. (The 'you' is directed more at the reader than at you personally.)

That's the Apple markup and their shady, shady marketing (from my POV). That channel is good though, worthwhile watching for the state of display affairs.

Exactly! Almost like today, where photographers regularly complain that no one will be able to appreciate their nuanced color work because no one has a calibrated or decent enough screen, or can't see the thirteen revisions that went into making the print 'pop'. :smiley: The difference, though, is that up until the EETF starts its black-box magic, there is a higher chance (I know I am too hung up on this) of having correct colors and luminance relations.

I hope I am not dragging this discussion too much. It is a lot of fun to engage and exchange viewpoints on this matter. So thanks everyone for engaging.

I think you have the right idea but I don’t use DarkTable.

This past week, I've been looking at the RIMM working space and found both a gamma-curved ISO2014 profile and a linear RIMM profile on the ICC website.

In RawTherapee, one can set the camera input profile (even a .icc) then the working color space and finally the output color profile.

I’m not into HDR, so don’t know if this post helps or not …

Sorry for the slow response, I’ve been kind of busy with non-photography projects lately.

The requirements to trigger "HDR Mode" vary depending on the TV - what HDR video formats it supports, and whether the TV/display enforces any "additional" requirements.

For example, Sony TVs will enter HLG HDR mode if the input HLG H.265 or H.264 video has the appropriate Alternative Transfer Characteristics (ATC) SEI metadata. They don't care about input bit depth (this makes displaying 8-bit H.264 HLG video from Sony's cameras much easier…).

A Vizio TV requires that the input video be 10 bits/color to enter HLG mode.

As I’ve mentioned elsewhere (including a post that was quoted) - right now the only reliable way to get an HDR display to display a still image in HDR mode is to encode that image to video…
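In case it helps anyone, this is roughly what that looks like in practice - a sketch (Python driving ffmpeg; the ffmpeg/x265 flag names are my best understanding of current builds, so double-check against yours) that wraps a still that is already encoded for HLG into a short 10-bit H.265 clip:

```python
import subprocess

cmd = [
    "ffmpeg",
    "-loop", "1", "-i", "still_hlg.png",   # hypothetical input, already HLG-encoded
    "-t", "10",                            # ten seconds is plenty for a slideshow frame
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",             # 10 bits/channel (some TVs insist on this)
    "-color_primaries", "bt2020",
    "-color_trc", "arib-std-b67",          # tag the HLG transfer characteristic
    "-colorspace", "bt2020nc",
    "-x265-params", "atc-sei=18",          # the ATC SEI some TVs key on (18 = HLG)
    "-tag:v", "hvc1",
    "still_hlg.mp4",
]
subprocess.run(cmd, check=True)
```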

On Win10 there might be some alternative solutions for stills. For Linux, the entire graphics pipeline had massive holes related to HDR displays last year. Among other things, there was no way to set the appropriate HDMI metadata to tell a display to enter HDR mode (as opposed to interpreting the video data as Rec.709). I saw that Intel was working on upstreaming various patches related to HDR display support a year ago, but I don't know where those efforts stand.
