We still need to deal with those out-of-gamut blues. There is a trick I took from ACES, which involves applying a custom RGB matrix to force those blues back. Unfortunately, the ACES coefficients expect the ACES AP1 colour space, and the darktable pipeline works in Rec 2020 by default, so the coefficients need to be adjusted.
Right. I know AP1 is very close to Rec2020, but I never found the actual primaries (and the white point is D60, I think). If anyone can find me the RGB ↔ XYZ primaries, then it’s just a matter of adapting the coefficients with a bit of linear algebra.
We need to unbound the channel mixer first; clipping to [0; 1] there is very bad.
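A toy illustration of why the unbounded channel mixer matters (the pixel and matrix values here are made up, just to show the principle): once an out-of-range channel value is clipped, a later correction matrix has nothing left to act on.

```python
import numpy as np

# Hypothetical out-of-gamut pixel: negative blue channel in the working space
pixel = np.array([0.2, 0.1, -0.05])

# Hypothetical correction matrix that mixes a bit of blue into the other
# channels (values are invented purely for illustration)
M = np.array([
    [1.0, 0.0, 0.1],
    [0.0, 1.0, 0.1],
    [0.0, 0.0, 1.0],
])

unbounded = M @ pixel                 # operates on the true, unclipped value
clipped   = M @ np.clip(pixel, 0, 1)  # the negative blue is already gone

print(unbounded)  # [0.195, 0.095, -0.05]
print(clipped)    # [0.2, 0.1, 0.0] -- the correction had nothing to pull back
```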
I just pulled DT today and your “for Dummies” posts are great - you get great results and you’re able to explain your process. QUESTION: Is filmic another download that I’m going to want?
Thank you @anon41087856 for posting the article. I got pretty good results following your recommendations on challenging images without humans in them. However, I find it hard to keep skin tones from ‘blending in’ when there are people in such pictures. I will keep practicing though. Thanks again.
@anon41087856 I didn’t quite follow how you derived those adjusted color mixer coefficients. You say the correction matrix assumes an AP1 colour space. So, to derive a set of coefficients suitable for the REC2020 colour space, we need to transform from REC2020-to-AP1, apply the correction matrix, then transform from AP1-to-REC2020 again, right?
In that ACES code, they supply a matrix to go from AP1-to-XYZ:
 0.6624541811   0.1340042065   0.156187687
 0.2722287168   0.6740817658   0.0536895174
-0.0055746495   0.0040607335   1.0103391003
and a matrix to go from XYZ to REC2020:
 1.716651188   -0.3556707838  -0.2533662814
-0.6666843518   1.6164812366   0.0157685458
 0.0176398574  -0.0427706133   0.9421031212
So, to go from AP1-to-REC2020 we multiply [XYZ-to-REC2020] x [AP1-to-XYZ]:
AP1-to-REC2020:
 1.04179138411768    -0.010741562648803  -0.006961875092012
-0.001683127668918    1.00036605066315   -0.001408211010916
-0.005209686580316   -0.022641445739242   0.952302414802367
and we invert that matrix to go from REC2020-to-AP1:
REC2020-to-AP1:
0.959937154908178   0.01046663483592    0.00703316687532
0.001622552328432   0.999685232225389   0.001490139829589
0.005290030316074   0.023825253906876   1.05016049940683
So, if the blue-correction matrix is:
correct-blue (AP1 space):
 0.9404372683   0.0083786969   0.0005471261
-0.0183068787   0.8286599939  -0.0008833746
 0.0778696104   0.1629613092   1.0003362486
then the matrix to correct blue in REC2020 space should be [AP1-to-REC2020] x [correct-blue-AP1-space] x [REC2020-to-AP1]:
correct-blue (REC2020 space):
 0.94    0.009   0
-0.018   0.828  -0.001
 0.072   0.16    1.001
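For anyone who wants to check the arithmetic, here is a quick NumPy sketch of the same derivation (variable names are mine; the matrix values are copied from the transforms quoted above):

```python
import numpy as np

# AP1 -> XYZ, as published in the ACES code
AP1_TO_XYZ = np.array([
    [ 0.6624541811, 0.1340042065, 0.1561876870],
    [ 0.2722287168, 0.6740817658, 0.0536895174],
    [-0.0055746495, 0.0040607335, 1.0103391003],
])

# XYZ -> REC2020
XYZ_TO_REC2020 = np.array([
    [ 1.7166511880, -0.3556707838, -0.2533662814],
    [-0.6666843518,  1.6164812366,  0.0157685458],
    [ 0.0176398574, -0.0427706133,  0.9421031212],
])

# Blue-correction matrix, defined in AP1 space
CORRECT_BLUE_AP1 = np.array([
    [ 0.9404372683,  0.0083786969,  0.0005471261],
    [-0.0183068787,  0.8286599939, -0.0008833746],
    [ 0.0778696104,  0.1629613092,  1.0003362486],
])

# Change of basis: express the AP1 correction in REC2020 space
AP1_TO_REC2020 = XYZ_TO_REC2020 @ AP1_TO_XYZ
REC2020_TO_AP1 = np.linalg.inv(AP1_TO_REC2020)
CORRECT_BLUE_REC2020 = AP1_TO_REC2020 @ CORRECT_BLUE_AP1 @ REC2020_TO_AP1

print(np.round(CORRECT_BLUE_REC2020, 3))
```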
But this is significantly different to the coefficients you gave:
Lol, I suspected there may have been some “perceptual optimisation” procedure, based on the “convenient” values in the magic matrix, but I was curious to see what the raw matrix looked like using the published transforms. As you say, the result doesn’t look too good based on the raw calculation, and I wasn’t sure if I had misunderstood something and/or made some bad assumptions.
@anon41087856 it took me quite some time to fully understand how the things in your steps 5.1 to 5.3 work.
After I got it, I kinda like this function in normal daylight too (though not always).
By the nature of the beast, those colour pickers/eyedroppers interact with each other, so one ends up clicking them several times after the zones/boxes are marked (until the sliders stop moving).
Would it be a good idea to have an “iterate hue eyedropper” button which iterates as often as the sliders would adjust (limited to ‘n’ iterations for safety)?
To do an automatic iteration, you need a fixed goal. But in Aurelien’s step 5 you refer to (the colour balance module), the iterative part is “until it looks good”, so that’s going to be very hard to translate into a fixed target, valid for every image you want to use it on.
From later posts, I get that the same thing holds for the channel mixer: iterate until “it looks good”.
(Note: I do not want to imply that “adjusting until it looks good” is bad practice somehow, just that it’s an impossible target for automated iteration)
@revietor
you are mixing things up. When Aurelien talks in the comments about “iterate until it looks better”, he is referring to Step 6 (gamut / ACES / channel mixer), not 5.1 to 5.3 (color balance, eyedropper).
In a daylight scene, I sometimes use the eyedropper (of the color balance hue): I point the highlights picker at a highlight in the scene and the power picker at a mid-tone.
Imagine the highlights picker catches a blue tone: it will give you orange as compensation, plus some saturation value.
Imagine further (and I have such scenarios) that medium/power catches some green (pointing at something greyish): it will give you red + saturation.
Now after that step (for simplicity we skip shadows), click the eyedropper again and you will see it react to the adjustments that came from the mid-tones, and vice versa.
Here I click the eyedroppers back and forth several times until they stop moving, and finally have a nicely balanced colour…
In my opinion this could be automated, since I do it manually anyway (and it becomes quite a job once you also do it for the shadows).
From what I saw in Bruce’s video about colour balance, I didn’t even think that could be it, as in his video it did not revert his manual settings for the blue & orange look.