Mastering workflow for linear images for HDR displays

This GitHub repository gave me some insight into that world.

All in all, the thrust of my posts has been to promote the idea that, while we wait for HDR to become common, we can develop good habits in photo processing and management.

PS: Krita, not a raw processor but a raster editor, does support OCIO, so it isn’t a stretch that stills editors could adopt things from the video world.

Speaking of ACES: HDR, ACES and the Digital Photographer 2.0

And yes, Krita is about the only thing that supports “real” scene-referred pixels well. Unfortunately I think using it for photography is not a great idea, just because the tools aren’t there.

GitHub - ampas/rawtoaces: RAW to ACES Utility. I wonder what this is?

Scene Linear Painting — Krita Manual 4.4.0 documentation Here’s some Krita documentation on scene-referred (they call it scene-linear, and used to call it HDR, lol) painting.
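To see why scene-linear working matters, here is a minimal Python sketch (function names are my own) comparing a 50/50 blend of black and white done directly on gamma-encoded sRGB values versus in linear light, the way a scene-linear editor like Krita would do it:

```python
def srgb_to_linear(v):
    """Decode an sRGB-encoded value (0..1) to linear light (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Encode a linear-light value (0..1) back to sRGB."""
    return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

# Naive blend on the encoded values (what a display-referred editor does):
naive = (black + white) / 2

# Scene-linear blend: decode, average the actual light, re-encode:
linear = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

print(round(naive, 3), round(linear, 3))   # 0.5 vs ~0.735
```

The two results differ by a huge margin: averaging real light gives a visibly brighter, physically plausible mix, which is exactly what the scene-linear workflow buys you.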

Indeed, though I need to stress that while the process is simple (for video at least), the hardware requirements to pull it off are anything but. Remember that 5k screen from Apple? That is cheap for a proper reference monitor; usually those start at 10k. Sure, not everyone in the production will need one of those, only the colorist for the final grade, but the other people are probably still using expensive calibrated monitors (in the 1k to 2k range per monitor).

I fully agree with this, which is why I have been hammering that the color management in Wayland needs to be top notch, and preferably designed from the ground up to handle HDR imagery.

Krita is the furthest along here, and on Windows it does support HDR output, but then you run into the fact that a) Windows and HDR is still a bit of a crapshoot (it is designed for consumption, not creation, for one thing) and b) consumer HDR is a bit of a mess right now.


Yeah, I am actually thinking of getting that if I can find a good discount for it, haha.
I wanted to try one out at an Apple store, but well, the world’s gone to shit at the moment.

But one can get decent HDR monitors for <1k with great HDR, not to mention a LOT of people have devices with HDR functionality (like, pretty sure all the new iPhones, Galaxy S9(?) and above, iPads, MacBooks, LG OLED TVs, Samsung QLEDs, …) that I kind of wanted to tap into, just to see how things are. And according to rumors, the new MacBook might have a microLED display, which will definitely hit crazy nits and have a ridiculous color gamut, possibly wider than the Pro Display XDR…

Not saying I’m gonna edit in darktable on my phone or something, but still, it would be awesome to see crazy cool photos on these devices.

And from what I hear, upstream X.org and Wayland apparently have no idea how color management works, and it’s an absolute shitshow right now. Not a good sign :frowning:


Yes, but almost all of those are consumer focused and do internal tone- and gamut-mapping, which for creation is something you don’t want since it means loss of control. (Current exceptions in the above list are probably the Apple displays, though we don’t know exactly what tech Apple is using, and any screen that supports FreeSync 2, but only in specific display modes.)

For modern display techniques X.org is a lost cause. We are trying to work with Wayland, but run into a chicken-and-egg problem there: no apps are interested in moving to Wayland yet since there is no color management (in X it worked because we could sort of bypass X, which is not possible in Wayland), and there is no interest in implementing a color management protocol since there are no applications.

For HDR we also run into the problem that all current HDR standards are consumer ones that assume you just push rec2020-pq/hlg to the screen and let the screen handle the rest (as a black box), so there is currently no way to properly profile or calibrate such screens (even if there were a protocol), with the exception of, again, the high-end monitors (which often use built-in LUTs or LUT boxes, but do have support for changing those LUTs, making calibration possible).
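For reference, the "pq" part of rec2020-pq is the SMPTE ST 2084 transfer function, which maps absolute luminance (0 to 10,000 nits) onto a 0..1 signal. A minimal Python sketch of the encoding side (inverse EOTF), using the constants from the spec (the function name is my own):

```python
# SMPTE ST 2084 (PQ) constants, expressed as the exact rationals from the spec.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Absolute luminance in nits -> 0..1 PQ signal (inverse EOTF)."""
    y = max(0.0, min(nits / 10000.0, 1.0))  # normalize to the 10,000-nit peak
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# Diffuse SDR white (~100 nits) lands at roughly half signal; peak maps to 1.0:
print(round(pq_encode(100), 3), pq_encode(10000))   # ~0.508 and 1.0
```

This is why the black-box behavior matters: the signal describes absolute nits, so a screen that silently tone-maps it to its own peak brightness is rewriting the image in ways you cannot profile from outside.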


Lovely, looks like I just need to check out and see what’s happening 5 years later…
Hopefully Apple pushes the envelope by being first to market with a non-stupid microLED display then…

Indeed, there doesn’t seem to be an easy way to switch the HDR TV to the HDR mode when watching stills. E.g. the new R5 captures HEIF that can only be displayed from the camera over HDMI: https://www.google.com/amp/s/www.dpreview.com/reviews/canon-eos-r5-initial-review.amp

The stills ecosystem is unfortunately behind the video workflow when it comes to HDR; hopefully it’ll catch up soon, with more and more cameras capturing HDR directly and a push from Apple and Adobe…


That’s an endeavor to develop camera characterizations for ACES Input Device Transforms (IDTs). In this thread, you’re picking at the other end, Output Device Transforms (ODTs), in ACES terms…

I’m not so sure HDR for video is even well-corralled. It has the same fundamental problem as still media: the media creator cannot control how a file will be rendered on all the possible screens. To avoid all that, still photography came to rely on sRGB as a device “assumption”; now that we have better displays, we don’t have the intermediate exchange of information needed to handle both them and the old technology.

In FOSS software, there’s movement toward keeping data scene-referred until the act of display or export, darktable being at the forefront (rawproc is further along, but nobody but me uses that… :smiley: ). It’s that “export” thing that’ll vex taking advantage of better displays, until such time as display color management becomes transparent to all, including my mom…

Not sRGB exactly, though that is the right output for the internet currently (hopefully that will change in the future).

History lens time:
Digital photography as we currently know it has its origin in desktop publishing, which invented ICC profiles to deal with a wide variety of (potentially cheap!) input/output devices (printers (including offset), scanners, displays, etc.). In contrast, the film industry, having only a few input/output devices to deal with, and those being a drop in the bucket compared to the real costs of making a movie, just went for the expensive option: set them up once and call it a day. Later, with digital compositing/editing and VFX, they realized they needed a standardized workflow, so they developed ACES, which still assumes you are using a screen that is not only properly set up but can display the selected output gamut 100% or nearly so (though that will probably change when OCIO v2 comes out, hopefully later this year, since that should have ICC support for the output display).

The biggest problem with HDR production on the cheap currently is that there is no standard way to describe the output (like an ICC profile), though there are some potential candidates for this (like iccMAX). (The other big problem is that consumer screens have carte blanche in converting the input rec2020-pq/hlg to the actual display, with a mandate to “make it look good”.)


The so-called HDR devices have a good wow factor but I would rather have a boring yet cooperative and accurate display. (Coming from someone who doesn’t do CMS because the screens are ugly and defective anyway. :roll_of_toilet_paper::roll_eyes:)

PS That is how sales works. Every TV and monitor has a showroom mode where everything is turned to max and then some, and the space and lighting around the product is fabulous too. :sparkles:

All of this is bad news, but at least using a wider color space will actually work decently well… right?
As opposed to triggering the wide dynamic range mode in monitors.

Yes, from what I have seen many HDR monitors (at least the ones for PCs) also offer a wide-gamut mode (non-HDR). Do note that in that mode anything not color corrected (e.g. user interface elements) will look off (at least on Windows and Linux; Mac is a lot better in this regard) since there is no full-screen color management. That said, all serious image editors/viewers (Krita, GIMP, darktable, RawTherapee, etc.) are color managed, so the photos/images you work on should look correct!
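As an illustration of why unmanaged content looks off in wide-gamut mode: a color-managed app remaps sRGB/Rec.709 primaries into the panel's wider space, while the plain UI gets fed straight through. A rough Python sketch using the BT.2087 Rec.709-to-Rec.2020 matrix (applied to linear-light RGB; function name is my own):

```python
# BT.2087 linear-light conversion matrix, Rec.709 -> Rec.2020 primaries.
M709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Convert a linear-light Rec.709 triple into Rec.2020 coordinates."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in M709_TO_2020)

# A fully saturated Rec.709 red sits well inside the Rec.2020 gamut:
print(rec709_to_rec2020((1.0, 0.0, 0.0)))   # (0.6274, 0.0691, 0.0164)
# Feeding (1, 0, 0) straight to the panel instead hits the far more saturated
# Rec.2020 primary - the oversaturated look of unmanaged UI elements.
```

White is preserved by the matrix (each row sums to 1), which is why unmanaged greys look fine while saturated UI colors visibly shift.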

Excellent. Ironically this means I don’t have to buy 1000-dollar FALD HDR monitors, I just need something that offers DCI-P3 coverage :smiley: (for other people reading the thread: DisplayHDR 500 and above, instead of DisplayHDR 1000, essentially)

I’d like to have both actually. A calibrated sRGB or AdobeRGB display AND the best attempt of the manufacturer of a HDR rec2020-pq/hlg rendition for the wow factor. Of course a calibrated HDR display would be better, no doubt about that, but until then give me the other one.
The video HDR demos at the electronics retailers look mind-blowingly better than sRGB displays. The prices are dropping, and photographers can’t even display a static image? Aren’t even games starting to ship in rec2020-pq?

Shotcut can talk to Blackmagic SDI/HDMI cards… Since it is based on the MLT framework, it probably just writes 8-bit data into some memory which gets displayed. If color management in Wayland is so broken, or so much of a chicken-and-egg situation, how about talking to a DeckLink for displaying static images?

a) What is the difference in that Windows case between consumption (displaying?) and creation? As I alluded to above, displaying is not the same as color-critical editing/decision-making.
b) Do you mean the VESA HDR vs. HDR10 vs. Dolby Vision vs. HDR10+ multitude of ‘formats’?

Yes, and apparently not, and yes. There are plenty of games that support HDR. Destiny 2 is free to play and supports HDR, and ROTR looks great in HDR.

The problem is that implementing this requires cooperation from so many parties it’s not even funny. The viewing of HDR images needs to work on Android and iOS to actually get the ball rolling. What’s the point in making HDR images (which ATM you can’t even do) if nobody can see them because the software doesn’t support it?


a) Note this is from my understanding and from reading some reviews, but it seems that Windows HDR is focused on full-screen applications (games and videos mostly), sometimes apparently doesn’t turn on properly, and non-HDR content apparently looks awful (again, I got this mostly from reading reviews, and most of those were gaming focused). Also, though I’m not sure if it is still true, it needed to be turned on in 2 or even 3 places to work (IIRC).
b) That is part of it, yes; also the VESA standards that require a table to figure out which is which, unclear marketing, and content that is supposedly HDR but when actually analyzed is anything but (and the content is hard to analyze since most of it is behind DRM).

When it works it is amazing (though personally I think most of that comes from the wider gamut, not from the wider dynamic range), but there are currently so many hoops to jump through that I am still in wait-and-see mode myself.


So Windows 10 is not properly color-managing HDR then. Interesting that Krita seems to have implemented something nonetheless.

I strongly disagree that this is only down to gamut. If you can, try to find two different sRGB displays with different on/off contrast ratios (say 700:1 and 1100:1) and calibrate them. Then go through a few hundred photos on Flickr. If you can’t get the deltaE close between the two displays, just look at black-and-white pictures. Even with just 8-bit JPEGs, the clear winner will be the display with more contrast.
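The link between on/off contrast ratio and dynamic range is just a base-2 log; a tiny Python sketch using the two hypothetical ratios from the example above:

```python
import math

def contrast_to_stops(ratio):
    """On/off contrast ratio -> usable dynamic range in photographic stops."""
    return math.log2(ratio)

# The two example displays above:
print(round(contrast_to_stops(700), 2), round(contrast_to_stops(1100), 2))
# ~9.45 vs ~10.1 stops - roughly two thirds of a stop more shadow range.
```

That two-thirds of a stop lives entirely in the shadows, which is why the difference is most obvious in black-and-white pictures.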

We have a long way to go. :nerd_face:

Never had a fancy TV or monitor. I have always considered contrast ratios, wide gamut, and HDR speak to be mostly marketing material. E.g. claimed contrast ratios are often dynamic, not absolute, and certainly not per pixel. When you add gamut into the equation: if you wanted to display a series of HaldCLUTs at varying brightnesses spread throughout the screen, how many displays could actually reproduce a fraction of that? Now add motion and transform them into ever-changing fractals, etc. Then add windows of apps with different colour management settings on that one display. Change the lighting conditions of your room, turn off the AC. :stuck_out_tongue:

Maybe, if you had one of these…

That is because we humans generally like a more contrasty look, but most of those images won’t have been mastered for the high-contrast display, so they might look good, but not as the artist intended.


I suspect that is because reality has more than 9.5 stops of dynamic range.

My argument was along the lines of: if you don’t have an HDR display to see the difference from an SDR display, compare SDR displays to see the differing impact of dynamic range within the same small gamut. It seems you agree that contrast/DR is a key component in making something ‘look better’, no?