Color calibration for dummies (without the maths)

Hello Aurelien,

My point is general, not specific to a particular piece of software (plus I am not a coder): long ago I took a raw capture of a CC24, white balanced the raw data by multiplying it with diag(WB_mult), converted to RGB by keeping the R and B channels as-is and averaging the greens of each quartet, then fed them to the optimization routine to obtain CCM(1); then I repeated the process but without the white balancing step, to obtain CCM(2). For practical purposes CCM(2) was equal to CCM(1)*diag(WB_mult), as theory suggests.
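The claimed relationship is easy to check numerically. Below is a minimal sketch (my own, not the original Matlab code): it fits both matrices with plain least squares on noise-free synthetic data rather than a CIEDE2000 optimizer, and `fit_ccm`, the patch values, and the multipliers are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "ground truth": 24 reference patch colors and a hidden camera matrix.
ref = rng.uniform(0.05, 0.95, size=(24, 3))          # target RGB values, one row per patch
true_cam = np.array([[ 1.8, -0.5, -0.1],
                     [-0.2,  1.4, -0.2],
                     [ 0.0, -0.6,  1.9]])
raw = ref @ np.linalg.inv(true_cam).T                # simulated raw sensor RGB

wb_mult = np.array([2.1, 1.0, 1.6])                  # per-channel WB multipliers

def fit_ccm(samples, targets):
    # Least-squares fit of a 3x3 matrix M such that targets ~ samples @ M.T
    M, *_ = np.linalg.lstsq(samples, targets, rcond=None)
    return M.T

ccm1 = fit_ccm(raw * wb_mult, ref)   # fit after white balancing the raw data
ccm2 = fit_ccm(raw, ref)             # fit without the white balancing step

# The two solutions differ exactly by the diagonal WB matrix, as theory suggests:
assert np.allclose(ccm2, ccm1 @ np.diag(wb_mult))
```

With noise-free data the identity is exact; with real chart captures and a perceptual error metric it holds only approximately, which matches the "for practical purposes" wording above.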

I don’t remember if I used Matlab’s fminsearch (Nelder-Mead) or lsqnonlin (trust-region-reflective) for a solver, with CIEDE2000 as the minimization criterion.

When theory meets C programming practice, practice wins :wink:

2 Likes

plus ça change, plus c'est la même chose (the more things change, the more they stay the same) :slight_smile:

Would it be possible to add a mixed white balance, 50% CAT and 50% native RGB?
This should give a more robust white balance.

That’s possible - color calibration allows masking.

1 Like

Thanks aurelienpierre for this new feature. I used it together with a Spydercheckr to replicate some paintings that I needed matching colours for. And thanks johnny-bit for recommending the Spydercheckr to me.

Anyone tried this since 4.0.0? Normalization values don’t move anymore, even if I increase the slider in the exposure module.

If you’re saying what I think you’re saying: it changed at some point, and the values given are the values that you should set for exposure and black level to be accurate. It doesn’t change like before, i.e. it’s not an offset anymore but the actual value for exposure.

No difference between the 3.8 manual and the 4.0 manual. Has this been mentioned on GitHub?

Yeah, I will try to dig it up. I think it is mentioned in one of AP’s videos,
maybe the one called something like “getting the most out of color calibration”.


Here you go
took me a moment


Thanks!

1 Like

This is an old thread. I am curious how much of it holds true today.

  • As per AP, calibration was originally done in 2 steps: the old WB (white balance) module and the new color calibration module. I am trying to understand the darktable 4.4 user manual (color calibration), but as I read it, the WB module is no longer used as such (i.e. not as a color picker) and is simply left on the reference setting. Am I understanding it correctly?

  • My understanding is that the “normalization values” are guidance for the exposure and black level corrections, and the user is guided on how to change them. With a few trials I was able to bring the “black offset” to zero, but even with big offsets I was unable to make the “exposure compensation” zero. Is the user simply expected to enter the number shown (and not expect it to change to zero when the profile is recalculated)?

  • To have a fairly universal profile, the suggestion is to create a preset based on “as shot in camera” after applying the profile, so the “matrix adaptation space” is applied on top of it.

On first thought I understand why this would be the case. But then this is going to take the WB as recorded by the camera, and that can vary: auto WB, or fixed (by measurement with a gray card, or by estimate). Suppose the user used auto WB - is the user then expected to further estimate the WB by measuring the scene?

Also, the universal profile is based on natural light (good quality). Does it make a difference whether the test shot for the profile is taken in direct sunlight, cloudy, or overcast conditions?

What is the approach if the shot is taken under artificial light (fluorescent / energy savers etc., not photo grade)? Is the user expected to change back to “as shot in camera”, or is that step not needed?

There are 3 icons at the bottom right of the color calibration module.
image
The “recalculate” and “apply” icons are self-explanatory, but what is “check output delta E” used for?

Very quick replies


Point 1: yes, leave WB on reference if using the CC module for WB/illuminant selection.

Point 2: initially those values were offsets to add to the existing exposure etc.;
this was later changed.
 Now they are the values at which the profile is accurate as reported, so you would enter them.


Point 3:
“As shot” is just whatever value the camera used for the shot, and it is passed to DT. Your matrix will be added on top of that.
 The profile will be off by a varying amount under any lighting other than when the shot was taken;
 the closest thing to general use is a daylight-shot profile saved and applied on top of the as-shot camera WB.
 It’s set to “as shot” in the preset so that it only applies the matrix to your image and its current WB, and not some totally wrong value encoded at the time the preset was created.


That button, I believe, is a check so you can see your current profile’s delta E when you start, and then see where you move to from there.


I believe this is accurate.
 There is a tweak in recent code allowing some modules access to the WB coefficients, as those are a better set of reference values for them, and the D65 part is now handled a little differently, but that is all in the background.


1 Like

My experience is that these values (the normalization values) do move.
However, the black offset can be driven to zero while the exposure compensation cannot.

Unless I am misunderstanding, and I should simply apply the values initially shown in the exposure module and then either not recalculate, or recalculate but ignore the second set of values (because recalculation makes the values move).

I am finding that it is possible to create a style based on “as shot” and further refine it based on the white card. So if the user has a style, they can use the white card after applying the style. This can be useful, as the style can take a while to prepare.

https://www.datacolor.com/spyder/products/spyder-checkr-photo-sck310/
I was surprised to learn that the 18% card is for film exposure (per the info section).

Another interesting detail is that these patches are expected to last about two years. My guess is that this would be of bigger value for somebody who uses it commercially.

It is a pleasant surprise that the tool is listed in the module, so all the patches can be used together.
And about price: it is cheaper than X-Rite/Calibrite.

I believe it’s using the tonal patches to determine the exposure value it gives. I can look up the reference, but it basically means this is what you should set your exposure to in order to be correct. The compensation one always jumps around, and overall I think AP said not to keep trying to zero it. Of course the exposure changes from scene to scene as well, so it’s just another variable to manage if you are hoping for a “universal” profile. It may be useful to try this approach, but really I think at times you might just introduce a further change to the scene. I think it’s a great feature to use on a shot where you have a color checker in the frame, and then edit a set of images taken in similar lighting.
 As for a global or universal profile, I’m less confident of the overall value.

1 Like

I might try playing with this a bit more.

It is a bit of a catch-up game. I am quite behind on some editing, so having a universal profile helps (with the older photos). I don’t think it is the solution for everything, but in my case I can use some nudge in the right direction.

I am finding that in some cases everything works well to my taste without too much fiddling with the tools: just an overall assessment with the CC module and it is good. In other cases it just doesn’t work well for me, and the lack of neutral colors in the shot doesn’t help.

It is a brand new tool for me - I think there will be some learning curve. Time will tell if it was money well spent or maybe too much for a hobbyist. At least it was on sale :slight_smile:

1 Like

You will never have a “universal” correction profile, for the following reason: every pixel recorded by your camera sensor has a value

v_C = \int f(\lambda)\, s_C(\lambda)\, \mathrm{d}\lambda

where f(\lambda) is the spectral radiance at wavelength \lambda and s_C(\lambda) is the sensitivity of your camera’s sensor for color channel C; we can think of C = R, G, B. It is important to note that f is an infinite-dimensional object (a spectrum) and your camera compresses it to 3 scalars (actually, 1 scalar at each point unless you have a Foveon sensor, so on a Bayer sensor it is interpolated).

Information is lost, you will never get it back. This would not matter if the s_C matched the sensitivity of the cells in your eye (the Luther-Maxwell-Ives condition), but they don’t. All the color in an image is essentially guesswork.
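This information loss has a concrete name: metamerism. Two different spectra produce exactly the same three sensor values whenever their difference lies in the null space of the sensitivities. Here is a toy sketch with made-up Gaussian sensitivities (my own illustrative numbers, not real camera data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 31                                   # spectrum sampled at 31 wavelengths

# Made-up smooth "sensor sensitivities" s_C(lambda), one row per channel.
lam = np.linspace(400, 700, n)
S = np.vstack([np.exp(-((lam - mu) / 40.0) ** 2) for mu in (600, 540, 460)])

f1 = rng.uniform(0.2, 1.0, n)            # some spectral radiance f(lambda)

# Any vector in the null space of S can be added without changing v_C.
_, _, vt = np.linalg.svd(S)
null_vec = vt[-1]                        # orthogonal to all three sensor rows
f2 = f1 + 0.1 * null_vec                 # a genuinely different spectrum...

v1, v2 = S @ f1, S @ f2                  # ...with identical sensor responses
assert not np.allclose(f1, f2)
assert np.allclose(v1, v2)
```

With 31 samples and only 3 sensitivity curves the null space is 28-dimensional, so there is an enormous family of spectra the sensor cannot tell apart; that is the sense in which the compression to 3 scalars is irreversible.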

One approach you could take is a linear correction with a matrix. Specifically, take a bunch of colors i = 1, \dots, N from a test chart, record their reference values, e.g. in sRGB space, then try to find a matrix A so that

\begin{bmatrix} r_i \\ g_i \\ b_i \end{bmatrix} \approx A \begin{bmatrix} \tilde{r}_i \\ \tilde{g}_i \\ \tilde{b}_i \end{bmatrix}

holds as closely as possible, where r_i etc. are the “known” color values and \tilde{r}_i etc. are the sensor recordings. This you can do with an iterative method, or least squares. But it is important to understand that this is an approximate correction and comes with no guarantees. Again, information has been lost, and there is no way to recover it.
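As a sketch of that fitting step: the snippet below minimizes plain Euclidean RGB error with `numpy.linalg.lstsq` (a perceptual criterion like CIEDE2000 would need an iterative solver instead), and the chart values, hidden matrix, and noise level are all synthetic. The nonzero residual illustrates the “approximate, no guarantees” point.

```python
import numpy as np

rng = np.random.default_rng(2)

known = rng.uniform(0, 1, size=(24, 3))              # reference chart colors r_i, g_i, b_i
true_A = np.array([[ 1.6, -0.4, -0.2],
                   [-0.1,  1.3, -0.2],
                   [ 0.1, -0.5,  1.4]])
# Sensor recordings: inverse mapping plus noise the matrix cannot explain.
recorded = known @ np.linalg.inv(true_A).T + rng.normal(0, 0.005, (24, 3))

# Least squares: find A minimizing the sum of ||known_i - A @ recorded_i||^2
A, *_ = np.linalg.lstsq(recorded, known, rcond=None)
A = A.T

residual = np.abs(known - recorded @ A.T).max()
assert residual > 0          # approximate only: the noise is unrecoverable
assert residual < 0.1
```

With noisy (or, in reality, metameric) data no 3x3 matrix maps the recordings onto the references exactly; the fit only spreads the error as evenly as the criterion allows.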

It is tempting to imagine that you can decompose A = L B into a part B that is independent of white balance and a diagonal L that stands for the white balance correction. But unfortunately it does not work that way, as the illuminant factors into the f above.

So, what do you do in practice, especially if you have access to the wonderful Darktable 4.6?

  1. Make sure that your monitor has decent colors, ideally calibrated, but a lot of them are OK out of the box these days. Set up your OS accordingly.
  2. Forget the color checker unless you are especially required to match exact colors (eg product photography or similar). It is impractical and unnecessary for a hobbyist, and most color checkers have colors well inside the gamut.
  3. Learn to use color calibration and its cousin rgb primaries, and fine-tune with color balance rgb. Try one instance for practice, but accept that in the general case you will not be able to correct the colors with just one instance of any of these modules (you need to explicitly or implicitly mask areas, e.g. using color balance rgb). Practice. It may take hours of fiddling on one photo, but the next will be faster, and faster still afterwards.
  4. Learn to use presets, save and reuse various corrections, especially for a batch of photos taken under similar conditions.

Yes, this is daunting, but it is a skill that digital photographers need. It is a huge investment, as the tools and the theory are not trivial, but it pays off. @s7habo has amazing videos starting at

2 Likes

I am not sure I can quite agree with this.
For simplicity I am going to ignore the math - I can’t even pretend to understand it, so commenting on it is not possible for me. I will stick to what I can see.

I agree - the best we can do is calibrate. If we don’t, we are leaving it at the mercy of the default profile and the manufacturer’s settings.

I don’t understand why.

The following is not processed (except the default enabled profiles). The only difference is

  • left: calibrated with the color checker, then left on “as shot in camera”

  • right: no modifications - no color checker used, but again left on “as shot in camera”

It is completely debatable which is better, left or right, but the point I am trying to make is that they are different.

What @s7habo shows is a creative use of color. The camera calibration is just a tool, a starting point. It is never meant to replace creativity.

There was an interesting discussion with examples here

And specifically the challenge that @s7habo had to deal with: a big number of pictures.

I am often facing a similar challenge: a large number of pictures (family photos) that I am trying to process with a minimum amount of masking. If I must use some, I try to stick to parametric masks only.

This is where the “universal” profile comes into play. I find it helpful to have a slightly better starting point and then build on it. Sometimes I build more, because I like the picture I am working on, and other times I build less, when the picture is more of a good memory but not necessarily something that has significance in anybody’s eyes except mine and my family’s.

Finally - can we go without the color checker? Yes - most of us have done so for years.

Should we ignore it / not consider it etc.? It depends on the case and on the photographer. I wouldn’t “ignore” it, but there are times when I would consider it more than others. Frankly, it is a newer addition for me, so my exposure to it is limited. I probably wouldn’t have bought it except that it was on sale.

Is the experience better with it? I think so, I hope so. Ask me again in a few years :slight_smile:

1 Like

Ignore the scary symbols. The idea is like this: I know I have ten numbers that sum to 100; can I recover the original numbers? No, because there are many possibilities for those numbers. That’s what @Tamas_Papp was referring to when he said you lose information: you can’t retrieve the original captured light spectrum from the R, G, B values captured by the camera sensor.

2 Likes