Thanks so much for sharing all those details. This sounds interesting and the results look good. I hope you continue to share so we can follow your progress. Likely this work deserves its own thread.
There was a recent Play Raw with lots of noise… it might be a good test image for your approach…
I don’t think a lot of software uses those RGB manipulations you mention anymore. Just saying.
“Tone curves” in all raw converters are literally an example though. Pulling the “whites” slider upwards in Lightroom creates RGB skewing, as does pulling the “blacks” slider down. RawTherapee uses an S curve for the base look. It is definitely everywhere.
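Here’s a quick toy example of what I mean (a made-up smoothstep contrast curve and made-up pixel values, not any converter’s actual curve): apply the same curve to R, G and B independently and the channel ratios change, so hue drifts and saturation climbs even though you only asked for contrast.

```python
# Toy sketch: a per-channel contrast curve skews hue and boosts saturation.
# The curve and the sample color are arbitrary, just for illustration.
import colorsys

def s_curve(x):
    # simple smoothstep-style contrast curve on [0, 1]
    return x * x * (3.0 - 2.0 * x)

rgb_in = (0.90, 0.55, 0.20)            # a saturated orange
rgb_out = tuple(s_curve(c) for c in rgb_in)

h_in, s_in, _ = colorsys.rgb_to_hsv(*rgb_in)
h_out, s_out, _ = colorsys.rgb_to_hsv(*rgb_out)

print("in :", rgb_in, "hue =", round(h_in * 360, 1), "sat =", round(s_in, 3))
print("out:", tuple(round(c, 3) for c in rgb_out),
      "hue =", round(h_out * 360, 1), "sat =", round(s_out, 3))
# The hue shifts (~30° -> ~32.5°) and saturation rises (~0.78 -> ~0.89),
# because channels far apart on the curve get pushed even further apart.
```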
That’s because they create most of the problem. The default behaviour of per-channel RGB processing is that high-chroma, high-luminance colors clip to yellow, and people just got used to it.
The reason is that the only high-chroma color in the sRGB gamut at high luminance is yellow. Example here with a chroma/hue slice at constant lightness:
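If you’d rather check the numbers than a picture, here is a small sketch using the standard sRGB → CIELAB conversion (D65 white point) over the six fully saturated sRGB corners:

```python
# Quick numeric check: among the saturated sRGB corners, only yellow
# combines very high lightness with high chroma.
from math import sqrt

def srgb_to_lab(r, g, b):
    def lin(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # linear sRGB -> XYZ (D65)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    xn, yn, zn = 0.95047, 1.0, 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    bb = 200 * (fy - fz)
    return L, sqrt(a * a + bb * bb)

corners = {"red": (1, 0, 0), "green": (0, 1, 0), "blue": (0, 0, 1),
           "yellow": (1, 1, 0), "cyan": (0, 1, 1), "magenta": (1, 0, 1)}
for name, rgb in corners.items():
    L, C = srgb_to_lab(*rgb)
    print(f"{name:8s} L* = {L:5.1f}  C*ab = {C:5.1f}")
# Only yellow sits around L* ~ 97 while keeping chroma ~ 97; anything else
# that bright in sRGB is nearly neutral. So bright, saturated highlights
# squeezed into the gamut end up yellow.
```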
Oh, for sure there is software out there doing it wrong. But ‘using better models’ has been in changelogs ever since the end of the nineties, so there is certainly also software trying to do it differently.
Most of the curves in RawTherapee, for instance, have (the option for) some sort of ‘preserve chromaticity’, and I haven’t touched the curves in RawTherapee in a long time, to be honest.
I believe the last (few) process versions in Lightroom also don’t use the same models anymore for basic white/black/exposure modifications (the curves dialog in ACR still does, for ‘compatibility’s sake’, I think). ColorPerfect has been claiming for years that it does a better job of preserving true colors (but the plugin is a turd to use for most people :)).
And I mean nothing against your attempt to do it differently. Nothing but respect for people finding different ways of doing things. Like you say, they have to start getting used to more proper ways somehow.
But, with all the background info Aurelien gave on Color Calibration and Filmic in darktable, there is a clear difference between ‘attempting’ and ‘doing it right / achieving it’. The whole ‘phase’ when a lot of programs used Lab space for adjustments is a good example. His opinions on CIECAM are another good one :).
There have been attempts to model color behavior, but - especially now in the HDR era - I’m not knowledgeable enough to say whether we have actually managed to do it.
See it more as a sign of “You’re not alone out there trying to do it right”, so keep it up :).
(PS, yellow skies are also a problem of people using lots of saturation and pulling the highlights down ‘because clipping is bad’… thinking the yellow color is supposed to be there all the time. Yes, what you like is subjective… but also, people get used to certain looks that aren’t always wanted.)
Do you have an idea how many pliers a goldsmith uses? And why?
Highlight reconstruction in the iop early in the pipe and highlight reconstruction in the filmic iop at the end of the pipe operate on different input. So: two tools for different use cases.
For the last time (I hope), filmic highlights “reconstruction” is first and foremost aimed at ensuring a smooth transition between areas that will clip at filmic’s output and non-clipped areas.
If you set filmic’s white clipping bound to the same clipping bound as the sensor, then both features may be equivalent (although they certainly don’t work the same).
But nobody says that filmic’s white exposure should always match the clipping threshold of the sensor “white” (first and foremost because sensors are RGB and know no white). So this highlight handling is completely contextual to what filmic does, regardless of the sensor dynamic range.
The usual highlight reconstruction happens much earlier in the pipe, on non-demosaiced data, and cares only about sensor bounds. Working on non-demosaiced data means it doesn’t see color, just arbitrary RGB planes with holes in them.
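To illustrate that (a simplified sketch, not darktable’s actual code): before demosaicing, “clipping” is just a per-CFA-channel comparison of each photosite against that channel’s white level. No white point and no notion of color are involved.

```python
# Simplified sketch of clipping detection on non-demosaiced data.
# Each photosite belongs to one CFA channel and is compared against
# that channel's white level; the pattern name and white levels below
# are made-up example values.
import numpy as np

def clipped_mask(bayer, white_levels, pattern="RGGB"):
    """bayer: 2D raw mosaic; white_levels: dict like {'R': .., 'G': .., 'B': ..}."""
    cfa = np.empty(bayer.shape, dtype="<U1")
    for dy in range(2):
        for dx in range(2):
            cfa[dy::2, dx::2] = pattern[2 * dy + dx]
    mask = np.zeros(bayer.shape, dtype=bool)
    for ch, wl in white_levels.items():
        mask |= (cfa == ch) & (bayer >= wl)
    return mask

# toy 4x4 mosaic with a couple of saturated photosites in one corner
raw = np.array([[16000, 16383, 900, 800],
                [16383, 15900, 850, 700],
                [  500,   450, 400, 300],
                [  480,   430, 380, 290]])
print(clipped_mask(raw, {"R": 16383, "G": 16383, "B": 16383}))
```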
Thus these are not the same feature. They become equivalent, feature-wise, only if you deliberately set filmic’s white exposure to the sensor’s scene-referred clipping value, which is a special case.
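To make the distinction concrete, here is a minimal sketch of the idea behind filmic’s highlight handling, not its actual implementation (the smoothstep blend, the feathering range and the 18.45% middle-grey default are my assumptions): everything approaching the user-chosen white exposure is feathered toward flat white, independently of where the sensor actually clips.

```python
# Minimal sketch: feather scene-referred values toward display white near
# the user-chosen filmic white exposure, so areas that will clip at
# filmic's output blend smoothly into their surroundings.
import numpy as np

def feather_to_white(rgb, white_exposure_ev, feather_ev=1.0, grey=0.1845):
    """rgb: scene-referred linear values; white_exposure_ev: filmic white bound in EV."""
    white = grey * 2.0 ** white_exposure_ev            # scene value mapped to display white
    start = grey * 2.0 ** (white_exposure_ev - feather_ev)
    lum = rgb.mean(axis=-1, keepdims=True)             # crude norm, for illustration only
    t = np.clip((lum - start) / (white - start), 0.0, 1.0)
    t = t * t * (3.0 - 2.0 * t)                        # smoothstep feathering
    return rgb * (1.0 - t) + white * t                 # blend toward flat white

pixels = np.array([[0.20, 0.18, 0.15],                 # mid-tone: untouched
                   [2.50, 2.20, 1.50],                 # near the white bound: partly blended
                   [6.00, 5.00, 3.00]])                # above it: pushed to flat white
print(feather_to_white(pixels, white_exposure_ev=4.0))
```

Note that nothing in this depends on the raw white level; move the white exposure slider and the whole transition zone moves with it, which is exactly why it is contextual to filmic and not a substitute for the early highlight reconstruction.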
I finally realized this the other day! “Highlight smoothing” might be a less misleading term for what it actually does?
Still longing for better true reconstruction in darktable. When “reconstruct color” in the “highlight reconstruction” module works (not often), I’m happy, but it mostly leads to crazy color shifts. Surely it can’t be as easy as borrowing some reconstruction code from RawTherapee or from here?
Sometimes when reconstructing highlights (highlight reconstruction + reconstruction in filmic) I get strange horizontal lines in the sky, probably in burnt-out parts. Did anybody experience something similar? I can add some screenshots later on if needed.
Yes, I get that sometimes with colour mode; there doesn’t seem to be much logic to when it appears. LCh is safer if colour mode generates these strange lines.
Sometimes? I’d estimate 40% of the images with clipped highlights for my α7 III. Not sure we’re talking about the same artifacts, but I’m getting very unflattering and strange purple patterns, not only horizontal. I should probably dig out an example.
You are right. I looked again at what each module does for my photos and can now see that it is highlight reconstruction in colour mode causing the trouble. Still, when there are no artefacts this mode produces much better results for me than LCh. I try to remove the patterns generated by colour mode using filmic’s highlight reconstruction, and sometimes I succeed.
I wonder if the colour mode in Highlights reconstruction could be improved so there would be no such artefacts.