Mastering workflow for linear images for HDR displays

Excellent. Ironically, this means I don’t have to buy $1000 FALD HDR monitors; I just need the stuff that offers DCI-P3 coverage :smiley: (for other people reading the thread: DisplayHDR 500 and above, instead of DisplayHDR 1000, essentially).

I’d like to have both, actually: a calibrated sRGB or AdobeRGB display AND the manufacturer’s best attempt at an HDR Rec.2020 PQ/HLG rendition for the wow factor. Of course a calibrated HDR display would be better, no doubt about that, but until then give me the other one.
The HDR video demos at electronics retailers look mind-blowingly better than sRGB displays. Prices are dropping, and photographers can’t even display a static image? Aren’t games already starting to ship in Rec.2020 PQ?

Shotcut can talk to Blackmagic SDI/HDMI cards… since it is based on the MLT framework it probably only writes 8-bit data into some memory which gets displayed. If color management in Wayland is so broken, or so much of a chicken-and-egg situation, how about talking to a DeckLink for displaying static images?

a) What is the difference in that Windows case between consumption (displaying?) and creation? As I alluded to above, displaying is not the same as color-critical editing/decision-making.
b) Do you mean the multitude of ‘formats’: VESA DisplayHDR vs. HDR10 vs. Dolby Vision vs. HDR10+?

Yes, and apparently not, and yes. There are plenty of games that support HDR. Destiny 2 is free to play and supports HDR, and ROTR looks great in HDR.

The problem is that implementing this requires cooperation from so many parties it’s not even funny. Viewing HDR images needs to work on Android and iOS to actually get the ball rolling. What’s the point in making HDR images (which at the moment you can’t even do) if nobody can see them because the software doesn’t support it?


a) Note this is from my understanding and from reading some reviews, but it seems that Windows HDR is focused on full-screen applications (games and videos mostly), sometimes apparently doesn’t turn on properly, and non-HDR content apparently looks awful (again, I got this mostly from reviews, and most of those were gaming-focused). Also, though I’m not sure if it is still true, it needed to be turned on in 2 or even 3 places to work (IIRC).
b) That is part of it, yes. Also the VESA standards that require a table to figure out which is which, unclear marketing, and content that is supposedly HDR but when actually analyzed is anything but (and the content is hard to analyze since most of it is behind DRM).

When it works it is amazing (though personally I think most of that comes from the wider gamut, not from the wider dynamic range), but there are currently so many hoops to jump through that I am still in wait-and-see mode myself.


So Windows 10 is not properly color-managing HDR then. Interesting that Krita seems to have implemented something nonetheless.

I strongly disagree that this is only down to gamut. If you can, try to find two different sRGB displays with different on/off contrast ratios (say 700:1 and 1100:1) and calibrate them. Then go through a few hundred photos on Flickr. If you can’t get the delta E close between both displays, just look at black-and-white pictures. Even with just 8-bit JPEGs, the clear winner will be the display with more contrast.

We have a long way to go. :nerd_face:

I have never had a fancy TV or monitor. I have always considered contrast ratios, wide gamut and HDR-speak to be mostly marketing material. E.g. claimed contrast ratios are often dynamic, not absolute, and certainly not per pixel. When you add gamut into the equation: if you wanted to display a series of HaldCLUTs at varying brightnesses spread throughout the screen, how many displays could actually reproduce a fraction of that? Now add motion and transform them into ever-changing fractals, etc. Then add windows of apps with different colour management settings on that one display. Change the lighting conditions of your room, turn off the AC. :stuck_out_tongue:

Maybe, if you had one of these…

That is because we humans generally like a more contrasty look, but most of those images won’t have been mastered for the high-contrast display, so they might look good, but not as the artist intended.


I suspect that is because reality has more than 9.5 stops of dynamic range.
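For scale: the on/off contrast ratios mentioned earlier in the thread translate directly into stops via log2, which is roughly where that 9.5 comes from. A minimal back-of-the-envelope sketch, nothing more:

```python
import math

def stops(contrast_ratio: float) -> float:
    """On/off contrast ratio -> dynamic range in photographic stops (log base 2)."""
    return math.log2(contrast_ratio)

print(stops(700))   # ~9.5 stops, about what a middling SDR panel manages
print(stops(1100))  # ~10.1 stops
```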

My argument was along the lines of: if you don’t have an HDR display to see the difference to an SDR display, take two SDR displays to see the different impact of dynamic range within the same small gamut. It seems you agree that contrast/DR is a key component in making something ‘look better’, no?

I am a bit baffled by this. This is easy to measure, and it is measured by people who test displays. As for the subjective quality, i.e. what you think of it, any electronics store can be enlightening.

True, but again, this can be measured: static on/off contrast, simultaneous contrast (ANSI contrast), and I wouldn’t be surprised if people now measure short-term peak contrast for HDR displays too. It exists for a reason, not as a marketing gimmick.

You lost me there a bit. :slight_smile: All those things happen already. It seems photographers just aren’t aware of it. HDR10 displays, the bog-standard ones, not the ones with the fancy dynamic metadata for real-time gamut remapping(!) (a.k.a. HDR10+ and Dolby Vision), look mind-blowingly good. I actually want to know how much better an HDR10 mastering display looks… but even a consumer HDR10 display is simply amazing.

That is probably not the full story, since AFAIK people also prefer pictures with higher contrast even when shown on the same display (i.e. when the contrast slider is pushed just a bit further; of course it can be pushed too far), so it is not only due to higher dynamic range.

But you won’t see the impact of dynamic range, since the source image can’t have a wider dynamic range; you just stretch that dynamic range over a larger contrast area, probably ruining the artist’s intent. Sure, this might look good to some people, but it isn’t what you should be looking for when color accuracy is important.

Without knowing exactly what the consumer screens do (they are still a black box color-management-wise), I wouldn’t be surprised to learn that those screens actually increase contrast, and thus a good mastering screen will actually look “worse” because it is much more color-accurate. Also see the Twitter thread I linked to earlier.

Honestly, me too… I didn’t think that having more contrast (all else being equal) could be considered anything but a good thing. But maybe I misunderstood the other posts?


Yes. Let’s say, for example, that I edit my pictures on a poor-contrast (SDR) screen while you have a nice high-contrast (SDR) screen: any pictures I send you will look different and in some cases can be totally ruined (the reverse is also true and might even be worse). This is because in the SDR world there is no encoding for how bright something actually should be. This is one theoretical benefit of HDR encoding, since the Electro-Optical Transfer Function (EOTF for short) maps digital input to absolute light output (e.g. with the PQ EOTF the maximum code value, 1.0 as a float, 1023 in 10-bit full range or 940 in 10-bit legal range, is 10,000 nits by definition).
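To make the “absolute” part concrete, here is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF, mapping a normalized code value to an absolute luminance; the constants are the published ones, the function name is just for illustration:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalized signal in [0, 1] -> luminance in nits (cd/m^2)."""
    m1 = 2610 / 16384         # 0.1593017578125
    m2 = 2523 / 4096 * 128    # 78.84375
    c1 = 3424 / 4096          # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    e = signal ** (1.0 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1.0 / m1)

print(pq_eotf(1.0))   # 10000.0 nits: the absolute ceiling of the format
print(pq_eotf(0.5))   # ~92 nits: half the code range sits well below half the light
```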


This is actually a huge, not-so-theoretical benefit over e.g. sRGB. It allows vastly different display tech to end up much, much closer together in appearance than under sRGB, even in cases where the display has to do internal black-box shenanigans when out-of-gamut or out-of-luminance content is reached. So right there, ‘HDR’, to throw that term around as loosely as possible, is better.

(I hope I am not being misunderstood: out-of-gamut and out-of-luminance content is still a problem then, as it could look horrible if the implementation is wrong. Everything else is better defined from the get-go, though.)

I agree it is an amazing benefit. I call it theoretical since it assumes that people will actually follow the spec, and we all know how companies (especially in the consumer space) like to skirt those lines a bit (understatement). There are a lot of in-theory-amazing specs out there that, due to being completely abused, are not worth the paper they are written on; only time will tell whether the HDR specs will be among those or not.


But that contrast slider manipulates screen-space luminance. I am not sure how this is not creating a higher-dynamic-range presentation… it is.

Well, as a photographer I sometimes have an intent for how things should look. If my display has too little contrast compared to whatever standard, color accuracy is not helping me. I want to know, and decide, that the luminances I assign to certain elements in a scene look the way I want them to look. The contrast slider is a poor man’s mastering tool for that. My camera easily captures 3 or 4 stops more than I can display. I have to employ dynamic-range compression to get everything into what I want to show. Now there is better display tech solving this to a certain extent. Gaming and movies are already using that tech; photographers are very late to that game.

I get the point that you’re making. But as you wrote above, the standard already allows for much less of this fudging black-box implementation. I read Dan Olson’s thread and I partly agree. What his thread boils down to is this wonderfully snarky comment:

So standards are being developed for a display tech that isn’t ready, and thus compensation for the not-ready display tech is being baked into the standards, and I’m sure that will have zero downsides in the future

The HDR standards (unfortunately there’s more than one) are the first display standards that allow display tech to improve while not producing garbage on the way to the ‘perfect’ display. Yes, there will be bogus implementations from companies slapping badges on stuff. This was way worse with SDR displays, though: orange reds, yellow greens, 6-bit display drivers… If the standard forces everyone to up their game, I am all for it.

I have the feeling that when smartphone photography implements HDR editing of its photos, real photographers will quickly want this too. Another death blow for traditional camera makers is waiting.

(Also I think it’s obvious that I want this technology to work, maybe a bit too much actually :smile: )

I’m not talking about the contrast slider on the screen; I’m talking about the contrast slider in an image editing application.

Gaming is full CG, and movies have been moving to a fully scene-referred workflow for a while now (mostly to make it easier to composite CG elements); both of these things make it much easier to take advantage of HDR tech. Photography, at least as we know it now, having its roots in desktop publishing, has a much harder time taking advantage of this tech. On top of that, big game and movie studios can easily pay for the equipment needed to master properly, which at current prices (and the skill level needed to use it properly) is not readily available to most photographers.

Are you sure? Because as far as I am aware, the in-screen black box that has to take Rec.2020 PQ/HLG in and map it to whatever the screen can do has a lot of leeway in “making it look good” for the viewer. So this

Will definitely still be a thing

And this, I think, is the real difference between us: I also want the tech to work, I am just a lot more pessimistic about the prospects of it ever working as intended when it comes to creating our own HDR content on consumer devices.


@PhotoPhysicsGuy Sorry for the hyperbole earlier. It can be hard to detect. :stuck_out_tongue:


Same here. The contrast slider in an image editing app ultimately decides what luminances (luminance differences anchored around whatever middle grey is) will actually be displayed. If implemented crudely it does not process in linear light but in something else, and is thus potentially wrong, but that is another discussion.
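As an illustration of the anchored-around-middle-grey idea, here is a sketch of one possible scene-linear contrast operator, not any particular application’s implementation; 0.18 is the usual scene-referred middle-grey assumption:

```python
import numpy as np

def linear_contrast(rgb: np.ndarray, amount: float, grey: float = 0.18) -> np.ndarray:
    """Contrast in scene-linear RGB as a power curve pivoted at middle grey:
    values at `grey` stay put, everything else is pushed away from (amount > 1)
    or pulled towards (amount < 1) the anchor."""
    rgb = np.maximum(rgb, 1e-6)          # avoid zeros before the power
    return grey * (rgb / grey) ** amount

pixels = np.array([0.02, 0.18, 1.2])     # shadow, middle grey, highlight above 1.0
print(linear_contrast(pixels, 1.3))      # shadow gets darker, highlight brighter, grey unchanged
```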

I have dabbled in raw photography since ca. 2007, and manipulating linear-light raw files has been around that long. darktable has made large pushes towards an all-out linear-light pipeline (scene-referred rather than display-referred, if I am not mistaken). How is it easier for games and movies to take advantage of HDR tech? I am not clear what you mean here. Photographers regularly use exposure bracketing to make *.exr files to capture high-dynamic-range scenes, which they then have to master for SDR displays or for print. Clearly the only part missing in the photographer’s pipeline is displaying HDR content without luminance-mapping it to SDR. I admit that with competing HDR standards, a lack of good-enough mastering displays, and photographers not even being clear about the advantages of an HDR-displayed image, this will still take some time. Again, it’s a mystery to me why Krita is further along in this respect.
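For what that bracketing-to-scene-referred step roughly looks like, a minimal sketch assuming the frames are already linear, demosaiced and aligned; the weighting scheme and the synthetic data are just for illustration, and writing the result out to an EXR is left out:

```python
import numpy as np

def merge_brackets(frames: list[np.ndarray], exposure_times: list[float]) -> np.ndarray:
    """Merge linear, aligned exposure brackets into one scene-referred radiance map.
    Each frame is divided by its exposure time, then averaged with a weight that
    favours mid-tones and ignores clipped pixels."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    wsum = np.zeros_like(acc)
    for frame, t in zip(frames, exposure_times):
        w = np.clip(1.0 - np.abs(frame - 0.5) * 2.0, 0.0, 1.0)   # hat weight: 0 at 0.0 and 1.0
        w = np.where(frame >= 0.99, 0.0, w)                      # drop clipped highlights entirely
        acc += w * frame / t
        wsum += w
    return (acc / np.maximum(wsum, 1e-9)).astype(np.float32)

# synthetic example: three brackets of the same scene at 1/100, 1/25 and 1/6 s
radiance = np.random.rand(4, 4, 3) * 50.0                  # the "true" scene radiance
times = [1 / 100, 1 / 25, 1 / 6]
frames = [np.clip(radiance * t, 0.0, 1.0) for t in times]  # what the sensor would record
print(merge_brackets(frames, times)[0, 0])                 # recovers radiance[0, 0] (approximately)
```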

The only thing needed for checking is a color probe, to see how accurate a display is for color and brightness. That should give a good idea of how out-of-display-gamut colors are mapped back into the display gamut; same with luminance. Whatever you need to assess an sRGB display calibration mostly works for HDR displays too. On top of that, luminance relations are much better defined for HDR displays. There is a bit more to it, since temporal and local peak brightness can be higher, but that’s not voodoo. So yes, I can buy a completely uncalibrated sRGB display with the wrong gamma, 6-bit dithering and a smaller-than-sRGB gamut. With an HDR display, some of those things are a no-go from the start and can then easily be tested (although probably not corrected). Again, I am sure there will be crappy gamut-remapping algorithms, crappy peak-brightness clipping and such. But I would already be quite surprised if the delta E numbers for in-display-gamut colors were as far off as they are for some cheap sRGB monitors nowadays.
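For the probe part, the basic check really is just measured-vs-target patches. A minimal sketch using the simple CIE76 delta E (the patch values are made up; newer formulas like ΔE2000 weight the differences more perceptually):

```python
import numpy as np

def delta_e_76(lab_measured: np.ndarray, lab_target: np.ndarray) -> np.ndarray:
    """CIE76 colour difference: Euclidean distance in CIELAB.
    Values around 1 or below are generally considered barely visible."""
    return np.linalg.norm(lab_measured - lab_target, axis=-1)

# made-up example: target patch values vs. what the probe reported on the display
target   = np.array([[50.0, 0.0,  0.0], [60.0, 40.0, 30.0]])
measured = np.array([[51.0, 1.5, -0.5], [58.0, 44.0, 28.0]])
print(delta_e_76(measured, target))   # one delta E per patch
```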

There is a nice comparison by a colorist who did some screen tests of the Apple super-duper display vs. a Sony HDR10 mastering OLED, I think (spoiler: of course it cannot compete). I’ll see if I can find it. The pessimist in you will love it :smile:

EDIT: there you go! :smile:
https://youtu.be/rtd7UzLJHrU

According to this reviewer, Apple’s new monitor is for lawyers and YouTubers, so my hyperbolic statements about HDR marketing weren’t so dramatic after all. :stuck_out_tongue_closed_eyes:


In the most-used commercial application (Lightroom), linear light is still not a thing; in this regard, OSS applications are actually somewhat ahead.

Not only have they been using scene-linear rendering for some time now (games with physically based lighting, movies for the compositing benefits), they are also mainly output on screens. In contrast, photography has focused (and for some people still focuses) on making prints (either dedicated photo prints or books and magazines), which is a notoriously low-dynamic-range medium (lower than even a lot of SDR screens!). So until fairly recently there was no interest in actually outputting HDR, and even now it is mostly a curiosity (especially for people interested in selling prints).

Except that an HDR screen is allowed to use different algorithms depending on the metadata (sure, HDR10 doesn’t have dynamic metadata, but it still has some), so for the above to be valid that metadata must always be set the same, for one. Secondly, I think you are way too optimistic about these things: if there are cheap shortcuts that still look good to 90% of people, those will be taken, and the only way to be sure of what you get would be to bypass the black box entirely.
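(For reference, the static metadata HDR10 does carry is basically the mastering-display description plus two content values, MaxCLL and MaxFALL. A minimal sketch of how those two are derived, with made-up frame data in absolute nits:)

```python
import numpy as np

def hdr10_static_metadata(frames_nits: list[np.ndarray]) -> tuple[float, float]:
    """Content-dependent HDR10 static metadata from frames given in absolute nits:
    MaxCLL  = brightest single pixel over the whole clip, using max(R, G, B),
    MaxFALL = highest per-frame average of max(R, G, B)."""
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames_nits:                      # frame shape: (H, W, 3)
        max_rgb = frame.max(axis=-1)               # per-pixel max of R, G, B
        max_cll = max(max_cll, float(max_rgb.max()))
        max_fall = max(max_fall, float(max_rgb.mean()))
    return max_cll, max_fall

# made-up two-"frame" clip, one dimmer and one brighter frame
frames = [np.random.rand(8, 8, 3) * 600.0, np.random.rand(8, 8, 3) * 1200.0]
print(hdr10_static_metadata(frames))
```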

Oof, I hadn’t realized a 5k screen would be this bad. Sure, it has nice sustained peak brightness, but the blotchiness in the dark scenes was just awful (OK, that is in comparison to the reference monitor; it would probably do pretty well against other (consumer) HDR screens).
