darktable 3.0 for dummies: hardcore edition

Ok, the blue light channel mixer thing is amazing. I have a concert photo (that I will probably make a PlayRaw of at some point) with very bright blue light that Lightroom just, I don’t know, dealt with… It made it work, and I haven’t been able to do the same with darktable.
I have toyed around with putting the channel mixer in the pipeline before the input profile. It felt like a very ugly hack, but it did work a bit. And now with these values it’s perfect!

Of course, Lightroom has sliders in the ‘camera calibration’ tool that allow you to adjust the hue/saturation of the primaries. I think this is more or less the dt equivalent of that.

(on a related note: they should make LED theatre lighting illegal.)

I have also experienced problems with blue spotlights in darktable. I have found that in RawTherapee, if you use the DCP profile for the camera, the picture ‘just works’ as in Lightroom (although I don’t have a copy of Lightroom to compare against), but if you use the standard camera profile, you get the same kind of strange colour problems as in darktable. I am guessing that if the colour profile is slightly off, then colour extremes go out of gamut and so the channel mixer needs tinkering with. Just my guess; somebody may have a better explanation.

What camera is that? Maybe RT just lacks a proper DCP profile for it.

Panasonic LX100. There isn’t a DCP profile listed in RT, so I stole a DCP profile from the Adobe DNG Converter to get colours that look like the OOC JPEG. The RT standard profiles don’t look too bad, just slightly duller; sometimes I prefer them.

We still need to deal with those out-of-gamut blues. There is a trick I took from ACES, which involves applying a custom RGB matrix to force those blues back in. Unfortunately, the ACES coefficients expect the ACES AP1 colour space, and the darktable pipeline works in Rec 2020 by default, so the coefficients need to be adjusted.
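
For anyone wondering what that means in practice: roughly speaking, the channel mixer computes each output channel as a weighted sum of the input R, G and B of the (linear, working-space) pixel, which is exactly a 3×3 matrix multiplication. A quick numpy sketch of that idea (the identity matrix here is only a placeholder, not the actual correction):

    import numpy as np

    def channel_mix(rgb, mix):
        # each output channel is a weighted sum of the input R, G, B
        return rgb @ np.asarray(mix).T

    mix = np.eye(3)                        # identity = "do nothing"; a blue fix bleeds some B back into R/G
    pixel = np.array([0.02, 0.05, 1.40])   # made-up "stage blue" value pushed past the gamut
    print(channel_mix(pixel, mix))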

The ACES Blue Light LMT is meant to be applied to AP0, not AP1. It comes from this thread: Colour artefacts or breakup using ACES - Post (DI, Edit, Mastering) - Community - ACESCentral

Should we add this as a preset?

I think so (but anyway, I made my own preset already).

Right. I know AP1 is very close to Rec 2020, but I never found the actual primaries (and the white point is D60, I think). If anyone can find me the RGB ↔ XYZ primaries, then it’s just a matter of adapting the coefficients with a bit of linear algebra.
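
(Once those matrices are in hand, the adaptation is a straightforward change of basis: with T = (XYZ → Rec 2020) · (AP1 → XYZ), the corrected matrix in Rec 2020 should be roughly M_Rec2020 = T · M_AP1 · T⁻¹, ignoring the D60/D65 white point difference.)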

We need to make the channel mixer unbounded first. It’s very bad to clip to [0; 1].

rtengine/iccmatrices.h in RT’s source code

I just pulled DT today and your “for Dummies” posts are great - you get great results and you’re able to explain your process. QUESTION: Is filmic another download that I’m going to want?

All modules are included within darktable itself. Check your version - if you have 3.0, you should have a module named “filmic rgb”.

Thank you @anon41087856 for posting the article. I got pretty good results following your recommendations on challenging images without humans in them. However, I find it hard not to let skin tones ‘blend in’ when there are people in such pictures. I will keep practicing, though. Thanks again. :+1:

Don’t hesitate to use masks to isolate people in difficult situations.

Thank you @anon41087856. Is it possible to have a link to the raw file to practice on? Very nice job. Thanks a lot.

Thanks a lot. Tried it and got great results instantly. Learned more about dt in a few minutes than in weeks of reading through the tutorials.

@anon41087856 I didn’t quite follow how you derived those adjusted channel mixer coefficients. You say the correction matrix assumes an AP1 colour space. So, to derive a set of coefficients suitable for the REC2020 colour space, we need to transform from REC2020 to AP1, apply the correction matrix, then transform from AP1 back to REC2020, right?

In that ACES code, they supply a matrix to go from AP1-to-XYZ:

0.6624541811 0.1340042065 0.156187687
0.2722287168 0.6740817658 0.0536895174
-0.0055746495 0.0040607335 1.0103391003

and a matrix to go from XYZ to REC2020:

1.716651188 -0.3556707838 -0.2533662814
-0.6666843518 1.6164812366 0.0157685458
0.0176398574 -0.0427706133 0.9421031212

So, to go from AP1-to-REC2020 we multiply [XYZ-to-REC2020] x [AP1-to-XYZ]:

AP1 to REC2020
1.04179138411768 -0.010741562648803 -0.006961875092012
-0.001683127668918 1.00036605066315 -0.001408211010916
-0.005209686580316 -0.022641445739242 0.952302414802367

and we invert that matrix to go from REC2020-to-AP1:

REC2020 to AP1
0.959937154908178 0.01046663483592 0.00703316687532
0.001622552328432 0.999685232225389 0.001490139829589
0.005290030316074 0.023825253906876 1.05016049940683

So, if the blue-correction matrix is:

correct blue AP1 space
0.9404372683 0.0083786969 0.0005471261
-0.0183068787 0.8286599939 -0.0008833746
0.0778696104 0.1629613092 1.0003362486

then the matrix to correct blue in REC2020 space should be [AP1-to-REC2020] x [correct-blue-AP1-space] x [REC2020-to-AP1]:

correct blue REC2020
0.94 0.009 0
-0.018 0.828 -0.001
0.072 0.16 1.001

But this is significantly different to the coefficients you gave:

Aurélien’s magic matrix
1.00 -0.18 0.18
-0.20 1.00 0.20
0.05 -0.05 1.00

So where did I mess up?
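
For reference, here is a small numpy sketch that reproduces the calculation above, using exactly the matrices quoted in this post, so anyone can check the arithmetic:

    import numpy as np

    # ACES AP1 -> XYZ (as quoted above)
    AP1_TO_XYZ = np.array([
        [ 0.6624541811, 0.1340042065, 0.1561876870],
        [ 0.2722287168, 0.6740817658, 0.0536895174],
        [-0.0055746495, 0.0040607335, 1.0103391003],
    ])

    # XYZ -> REC2020 (as quoted above)
    XYZ_TO_REC2020 = np.array([
        [ 1.7166511880, -0.3556707838, -0.2533662814],
        [-0.6666843518,  1.6164812366,  0.0157685458],
        [ 0.0176398574, -0.0427706133,  0.9421031212],
    ])

    # blue-correction matrix defined in AP1 space (as quoted above)
    BLUE_FIX_AP1 = np.array([
        [ 0.9404372683,  0.0083786969,  0.0005471261],
        [-0.0183068787,  0.8286599939, -0.0008833746],
        [ 0.0778696104,  0.1629613092,  1.0003362486],
    ])

    # change of basis: REC2020 -> AP1, apply the fix, go back to REC2020
    AP1_TO_REC2020 = XYZ_TO_REC2020 @ AP1_TO_XYZ
    REC2020_TO_AP1 = np.linalg.inv(AP1_TO_REC2020)
    BLUE_FIX_REC2020 = AP1_TO_REC2020 @ BLUE_FIX_AP1 @ REC2020_TO_AP1

    np.set_printoptions(precision=3, suppress=True)
    print(BLUE_FIX_REC2020)   # should match the rounded "correct blue REC2020" matrix above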

The method is quite simple:

  1. input the ACES coeffs,
  2. notice they don’t look good,
  3. tweak them a little bit,
  4. iterate until it looks better,
  5. write them down with some technical words so everybody thinks you are a great color scientist, when in fact you are just a decent retoucher,
  6. wait for someone to fact-check and confess.

:slight_smile:

Lol, I suspected there may have been some “perceptual optimisation” procedure, given the “convenient” values in the magic matrix, but I was curious to see what the raw matrix looked like using the published transforms. As you say, the result doesn’t look too good based on the raw calculation, and I wasn’t sure whether I had misunderstood something and/or made some bad assumptions :slight_smile:

@anon41087856 it took me quite some time to fully understand how the things in your steps 5.1 to 5.3 work.

After I got it, I rather like this function in normal daylight as well (though not always).

By the nature of the beast, those colour pickers/eyedroppers interact with each other, so one ends up clicking them several times after the zones/boxes are marked (until the sliders stop moving).

Would it be a good idea to have an “iterate hue eyedropper” button that iterates as long as the sliders keep adjusting (limited to ‘n’ iterations for safety)?
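
Purely as a sketch of the idea (the pick_and_adjust() callback and the threshold are hypothetical here, not anything that exists in darktable):

    MAX_ITERATIONS = 10    # the 'n' safety limit mentioned above
    EPSILON = 1e-4         # arbitrary "sliders stopped moving" threshold

    def iterate_eyedropper(pick_and_adjust):
        # pick_and_adjust() would re-run the eyedroppers and return how much the sliders moved
        for _ in range(MAX_ITERATIONS):
            if pick_and_adjust() < EPSILON:
                break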

To do an automatic iteration, you need a fixed goal. But in Aurélien’s step 5 that you refer to (the colour balance module), the iterative part is “until it looks good”, so that’s going to be very hard to translate into a fixed target valid for every image you want to use it on.

From later posts, I get that the same thing holds for the channel mixer: iterate until “it looks good”.

(Note: I do not want to imply that “adjusting until it looks good” is bad practice somehow, just that it’s an impossible target for automated iteration)
