OK, so I get that the ASC CDL is not only a color-balance thing. The saturation and especially the fulcrum contrast seem very useful for recovering a nice-looking image after the log profile.
In which order is it better to arrange the algorithms?
In terms of implementation, the official order is as I provided above. Saturation exists separately.
Regarding linear versus nonlinear: the meaning of the CDL parameters shifts depending on the transfer function the data is encoded with.
For example, the luma average used by the saturation step is only an approximation when applied to nonlinearly encoded material, compared with linearly encoded values.
If the CDL is applied on scene-referred data, slope is exposure and power is contrast. If it is applied on properly encoded log material (with no hacky bits), offset is exposure and slope is the contrast control. Etc.
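For reference, the published ASC CDL math behind this order is simple. Here is a minimal sketch in Python (the function names are mine, not darktable's): slope/offset/power per channel, then a separate saturation step around a Rec.709 luma average.

```python
def cdl_sop(rgb, slope, offset, power):
    """Slope, offset, power applied per channel, in the official order."""
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        x = v * s + o
        out.append(max(x, 0.0) ** p)   # clamp negatives before the power
    return out

def cdl_saturation(rgb, sat):
    """Separate saturation step: blend each channel with Rec.709 luma."""
    luma = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    return [luma + sat * (c - luma) for c in rgb]

# A neutral grey is left untouched by the saturation step.
grey = cdl_saturation([0.18, 0.18, 0.18], 0.5)
```

This also makes the point about encoding concrete: the same luma-weighted average means something different depending on whether the RGB values fed in are linear or log encoded.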
I have added color pickers to neutralize casts. Basically, once you pick an area, it computes the average color and inverts its hue, so you can revert color casts more efficiently. It doesn't always get the saturation right, though.
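The hue-inversion idea described above can be sketched with the standard-library `colorsys` module (this is only an illustration of the principle, not darktable's code): rotate the hue of the picked average by 180° while keeping saturation and value, which is also why the saturation of the correction can be off.

```python
import colorsys

def cast_complement(avg_rgb):
    """Return the hue-inverted (complementary) tint of an average color.

    Sketch of the picker idea: the picked area's mean color has its hue
    rotated by half a turn, saturation and value kept as-is.
    """
    h, s, v = colorsys.rgb_to_hsv(*avg_rgb)
    return colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)

# A bluish cast averages to a bluish mean; its complement is warm.
tint = cast_complement((0.4, 0.45, 0.6))
```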
I have improved the accuracy of the color neutralization and added an optimizer. When you select 3 samples (black, grey, white), the optimizer tries to fit the CDL curves in order to neutralize the color of the sample patches.
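To make the fitting idea concrete, here is a closed-form-plus-bisection sketch for one channel. This is only an illustration of the principle, not darktable's actual optimizer, and the function name and bracket values are mine:

```python
def fit_cdl_channel(samples, targets, lo=0.2, hi=5.0):
    """Fit slope, offset and power for one channel so the three picked
    patches (black, grey, white) land on neutral target values.

    For a trial power p, requiring (x*s + o)**p to hit the black and
    white targets fixes s and o linearly; the grey patch then leaves one
    residual in p, which a bisection drives to zero.
    """
    b, g, w = samples        # measured channel values of the patches
    tb, tg, tw = targets     # neutral values the patches should reach

    def solve_so(p):
        s = (tw ** (1 / p) - tb ** (1 / p)) / (w - b)
        return s, tb ** (1 / p) - s * b

    def grey_residual(p):
        s, o = solve_so(p)
        return (g * s + o) - tg ** (1 / p)

    for _ in range(60):                    # bisection on the power
        mid = 0.5 * (lo + hi)
        if grey_residual(lo) * grey_residual(mid) <= 0:
            hi = mid
        else:
            lo = mid
    p = 0.5 * (lo + hi)
    s, o = solve_so(p)
    return s, o, p
```

With per-channel fits like this, a cast shows up as three different (slope, offset, power) triplets, and neutralizing the three patches pulls them back together.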
Here is the result with no manual tuning (except for the sample patch selection):
Hi Aurelien,
having tried your workflow exposure → unbreak input profile → color balance (as explained in your tutorial on darktable.fr), I can say I'm very impressed by its performance and reliability.
Even if the automatic mode sometimes fails, the manual mode does the job very well.
However, I would like to integrate the camera calibration into that process.
I’ve created a camera profile following the method explained here.
It provides better results than the camera base curve.
But with your process, adding the color lookup table preset tends to jeopardize the work already done…
What are your thoughts about this? Do you have any advice?
Thanks for your feedback! Today we added generic presets to the unbreak profile to fall back on when the auto mode fails. They are provided for cameras with 8, 10, 12, 14 and 16 EV of dynamic range.
I have seen the same issue with the color lookup table. I don't know yet whether the inaccuracies of the LUT are simply amplified (it's really hard to get an evenly lit chart), or whether we need to build the LUT from a log-encoded chart picture (which defeats the purpose of the color balance afterwards, since the LUT will then linearize the colors). For now, you could try another LUT based on a log image of your chart (tune the settings of the unbreak profile with the mid-grey, white and black patches).
Also, I have just discovered that, after log + color balance, the global tonemapping module with the drago method gives much better results, better than shadows and highlights.
The LUT obtained with this method seems to work well (maybe by chance; to be double-checked).
Not a manual, but here is a straightforward way to use Aurélien's tools that I'm happy with:
color balance (when possible)
exposure. Center the histogram (set to linear) visually, keeping room on both sides to avoid clipping (the automatic clipping detection sometimes fails).
unbreak. Same idea. Keeping grey at 18%, play with the black point and dynamic range against the histogram, still keeping room on both sides to avoid clipping. Relying only on the color picker may be misleading: the max Lab value can read 96 while there are still saturated zones in the image. The histogram is more reliable (and faster).
lookup table with camera profile preset (optional).
color balance (slope, offset, power). Center the histogram with slope, stretch it with contrast until just before clipping, and balance the image with power. Then tweak contrast and SOP if necessary.
No unexpected or weird effects as far as I can see. A beautiful (well… it depends!) and natural-looking image.
Bravo!
Unbreak works on the log curve for data sanity, not for visual accuracy.
The grey is a real one and the internal data corresponds to a real grey, so the color picker should have a mode (Lab Log?) showing the actual data (50) instead of the displayed data (75).
Does that make sense?
No. The displayed data is the actual data, provided darktable does not apply a silly gamma somewhere (which I suspect, but still no answer on the mailing list).
Log is a way to salvage the dynamic range and squeeze it into what the display can show. What we need is a later step to remap the grey and manage the contrast at the extreme values. I'm currently working on that.
… On the colorchecker view, the middle grey patch really is displayed lighter than a real middle grey (just compare it with the darktable background middle grey). The color picker is not far off saying 75 for the displayed color.
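The numbers in this exchange check out if the picker reads the displayed value as if it were linear luminance. A quick worked sketch (my own helper, using the standard CIE L* formula):

```python
def lab_lightness(y):
    """CIE L* from linear luminance Y (white at Y = 1.0)."""
    eps = (6 / 29) ** 3
    f = y ** (1 / 3) if y > eps else (841 / 108) * y + 4 / 29
    return 116 * f - 16

# 18% scene grey really is the "real" middle grey of the thread:
mid_grey = lab_lightness(0.18)   # ~49.5, i.e. the "actual data (50)"
# A log curve that parks grey at 0.5 display makes a picker that treats
# the displayed value as linear read much lighter:
as_shown = lab_lightness(0.5)    # ~76, consistent with the 75 reported
```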
It's not always clear what the float space should be. As a former 8-bitter, it took me a bit of digging to find out that the imaging convention is 0.0–1.0; I think the practitioners just assumed we would all know it. G'MIC confuses things further, as it does use floats, but in the 0.0–255.0 range.
8-bit value / 255 = normalized float, for this discussion, so 75 / 255 ≈ 0.29, or thereabouts…
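In code, the full-range convention divides by 255 so that the maximum 8-bit code lands exactly on 1.0 (the helper names below are mine, purely for illustration):

```python
def to_float(v8):
    """Normalize a full-range 8-bit code value to the 0.0-1.0 convention."""
    return v8 / 255.0

def to_gmic(f):
    """G'MIC keeps its floats in the 0.0-255.0 range instead."""
    return f * 255.0
```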