Best practice for "unexpected white balance coefficients" in darktable

I did a little more tweaking. Not that different really. Very nice image!


20221106_0058_01.ARW.xmp (20.1 KB)


It took me some time to accept V6 as some control seemed lost. With most images this is not a problem, but the image we have played with here demonstrates that it can be a problem. I am glad you have inspired me to revisit V5.

Here is my attempt at your second shot. I focused on retrieving the blown highlights. I did this by using multiple instances of exposure and limiting darkening to highlights with drawn and parametric masks. Fun image to play with.


20221106_0058.ARW.xmp (14.6 KB)


Today, we have only ourselves to blame, when our images are underexposed. Correct exposure in these days of filmic/sigmoid is an absolute game changer, at least it is for my craft.

Thank you all for a very insightful thread - with some great images :slight_smile:

I’ve been pulling my hair out trying to get correct colours in Darktable from my Sony A7IV raw files. I recently moved from a Nikon D750 and never noticed any colour problems in DT. I thought I’d made a mistake buying the otherwise brilliant Sony.

My issue was that the raw colours in DT were often too warm, and skin tones a nasty grey / green. I did some colour testing at a Winslow Homer painting exhibition recently. The OOC JPEGs were definitely more colour accurate (whilst looking at the actual paintings) than DT modern workflow when first opening the raw files later.

My issue is not that the raws in DT did not match the JPEG though! It’s that the raws in DT did not match the actual colours of the real pictures as I observed them at the time (not from memory).

I love DT, and appreciate its contributors, but I cannot comprehend why it should be comfortable (if it is) starting with the wrong colours - on the basis that we have the creative power to change them. Sure, I want to be able to tweak the colours in some pictures… but I don’t want to be forced to reconstruct a true colour starting point from memory for every image! I can only assume that colour accuracy must not be so obvious a problem for other brands or cameras.

After too much time researching, I discovered that using:

Color balance RGB / basic colourfulness: standard preset (right click to access)

… gave the best starting point colours when using the modern workflow (i.e. colour calibration module for white balance). Modern is recommended by the DT team for correct scene-referred capability - for reasons few photographers should have to think about!

Thanks to your contributions above I’ve learned that the color calibration module is probably not handling the Sony raw white balance / colours correctly. Turning off modern and using legacy white balance ‘as shot’ does indeed give more accurate starting colours! And using the ‘standard’ colour preset above makes many of my images even better.

To conclude, I think DT needs to get accurate OOC raw colours for the Sony A7IV using the modern workflow if that is the future. If this is problematic, then using the white balance module ‘as shot’ should not be positioned as ‘legacy’ when it works better!

I tried Sigmoid but it looked worse than Filmic RGB - too contrasty, more blown highlights and no colour benefit. I have no idea what image problem Sigmoid is designed to fix or optimise or why it replaces Filmic RGB - if it does.

I’d also love to have Sony lossless compressed recognised by DT but I guess that’s another thread!

Welcome to Pixls :slightly_smiling_face:
I don’t really know about the default Sony colors - dt treats Sony the same as everything else, so I guess there might be a difference on Sony’s end…(always convenient to push a problem elsewhere :wink:). Seriously though, I’m not sure what could be done. I mean that literally, not saying it couldn’t be fixed, just that I personally don’t know how.

I personally don’t find it hard to get a match that I’m happy with, for these new Sonys, but I do have a fair bit of dt experience I suppose, plus maybe my taste is different.

It’s more about workflow really - it’s doing the same job as filmic with a different approach and algorithm. I like it, and find it works very well as do quite a few others, but I also know that many people don’t like/ don’t see any advantage, so it must be very much down to one’s personal taste and workflow. I used it in all the versions I posted above… for better or worse!

Check out the user manual; we are clear about the workflow, including applying this module.

I would not call colors “as you remember them” the same as “accurate.” People’s color memory is horrible. If you want accurate color, you can use a color checker to calibrate your images. Darktable supports this in the color calibration module.

Note that since the white balance is misread, it’s hard for the rest of the modules to get color correct. Once the WB issue is fixed, it should be fairly trivial to get a good white balance and a nice edit.


Several people have problems with Sony and the modern chromatic adaptation. If I remember correctly, the reference white balance values for Sony are wrong.

This seems like a theoretical solution that only works for certain types of photography where conditions are under the control of the photographer — a type of photography that I believe is popular among geeks like us. The true solution is support for dual-illuminant DCP profiles (which RawTherapee supports). I really wish darktable had this so we could borrow some profiles from that converter app from the big A. :slightly_smiling_face:

More importantly, I’d like to have colors that are consistent with cameras that I no longer own (and hence can’t calibrate using the color calibration module).

This seems to be it, Istvan. It might be this thread or another, but the poster did the exercise of shooting the screen and came up with new D65 coefficients; these were lower for red and higher for blue. When I used these values on the provided image, modern was an exact (or very close to exact) match to legacy; by default it was not. The manual does state that the D65 values need to be accurate for the modern WB workflow, so this is just a nuance that maybe needs to be reinforced or made more obvious to new users, if possible.

I’ve learned a huge amount from this thread and still have a lot of information to digest, as well as experiments to run now that I have a spydercheckr in hand and a Calibrite ColorChecker Display on the way. My hope is that it’s possible to create a style that “just works” with the A7IV, at least for when the camera correctly detects the white balance. My fallback hope would be that it’s at least possible to create such a style that works for each lens that I own. I will definitely report back once I’ve reached some conclusions.

In the meantime, one remark and one question…

One remark, based on examining the XMP files people provided (possibly useful for @nugecom): make sure you enable lens correction when working on color. Even supposedly super-high-end lenses that “don’t require much correction”, like the Sony 35mm GMaster, which doesn’t seem to have particularly bad vignetting, do actually seem to improve color when I enable lens correction before color correction. I wonder why lens correction isn’t on by default, and am tempted to enable it by default, particularly now that there’s an “as reported by lens” option that doesn’t even require lensfun support.

Now my question: What, if anything, can I conclude from the custom white balance settings in my camera, which allow me to point the camera at a light source and then capture a color temperature for white balance from the center of the picture? Specifically, I’ve noticed the following:

  • The color temperatures reported by the camera are much warmer than what I set my light sources to, and vary a lot by lens. For example, when pointed at a “5000K” light source, with my 35mm GMaster I might get 4700K, while with my 85mm Sigma DG DN I would get 4400K.

  • The color temperatures reported by the camera, even if not in agreement with my light sources, do seem consistent across light sources. For example, if I point it at an ElGato Keylight or a Luxli Cello (at the same color temperature) with a diffuser, or a gray square on my SpyderCheckr illuminated by my Luxli Cello, I get readings +/- 100K. (Readings on a Neewer 660 video light with diffuser are a bit farther off, but the Neewer is a lower-end product, so I never expected it to be color accurate.)

One possible conclusion is that my camera is correct and my light sources are not. After all, the Sony A7IV is a higher-end product than these cheap lights. On the other hand, the fact that the lights agree with each other, and that the lens makes such a difference, suggests that maybe what I’m seeing is the difference from the lenses, and so the camera and light sources could both be correct. Is there a way to use my ColorChecker Display (once I receive it) to check the color of my video lights?

I guess the next question is how to compensate for lens color when trying to do the color calibration. I’m assuming the lens correction does not handle color cast? But is this something the camera might compensate for in its own white balance reporting, so that an “as shot by camera” white balance in color correction could still work?


That should be entirely possible, and in fact some of the edits in this thread might be close to doing that, but would need some tweaking over a bunch of images to get closer. That would be my (possibly un-scientific) approach anyway…

I don’t think I have any good answers to your other questions, and to be honest I feel like it might be getting into serious color science territory somewhere around here,
which is outside my knowledge entirely. :flushed:.

But it’s all very good reading, and I’m following the thread with interest. :slightly_smiling_face:
I hope others can come up with a few answers.
Keep at it!:+1:

Really, the best way to get a good white balance assertion is to measure it in the scene before you leave. If you’re fortunate to have a camera that’ll do it, you can take a white balance measurement off a neutral target patch and store it in one of the preset slots; my Z 6 does that. If not, taking a picture of a target with a neutral patch in the scene’s lighting will let you adjust it in post with the raw processor’s patch mode for white balance.

This method will also compensate for lens colorations, as the camera is capturing the measured light through the lens. I don’t think lenses are too much of an influence these days; I get color profiles with good deltaE from Colorchecker reference spectra, measuring spectrum through the lens…

So the good news is that the A7IV definitely lets you do that. The even better news is that it seems to give the same result regardless of whether I point the camera at the light source or a neutral patch. In fact, pointing the camera directly at the light source returns a result closer to the gray patch result than even using an expodisc! So this means I don’t even really need to walk around with a gray card or an expodisc.

Now the two pieces of bad news are that:

  1. Even when the camera nails the white balance and the JPEG looks great, it still takes quite a bit of fiddling for me to make the picture look good in darktable. Moreover, I suspect that even when I get the picture to look right, I haven’t properly separated the color manipulation into an objective correction step and a subjective creative step. Since I’d like to have creative styles that are similar to lightroom presets, and I want my chroma and lightness adjustments really to adjust chroma and lightness, I need to nail the objective color correction part for my styles to work the same across different photos.

  2. I’m often taking street photos at night, when there isn’t just one predominant light source and I don’t necessarily have time to photograph my spydercheckr. By the time I get out my spydercheckr, one of my light sources might have driven away, or the illuminated billboards or traffic lights may have changed.

The good news is that my camera’s automatic white balance seems to get it right the vast majority of the time (based on the JPEGs), so it’s clearly possible to do something reasonable. I just need to understand a bit more color theory to see what is going on.

What I understand so far is that light can be described objectively in terms of brightness and saturation, and this is approximately how our brains process things that emit light (modulo second order phenomena such as the Hunt effect). However, for things that aren’t light sources (i.e., most objects), our brains apply “chromatic adaptation” in which we try to figure out how light and colorful something is independent of how it is currently being illuminated–even though of course the lighting changes what photons are actually bouncing off the object. So this is measured in lightness and chroma, and depends on the white balance of lighting conditions. E.g., a white piece of paper will look white to us under the sun or under an incandescent bulb, meaning it has a zero chroma regardless of the saturation of the light it is currently reflecting.

In order for our brains to view objects in terms of lightness and chroma, they need to do their own biological equivalent of auto white balance, based on cues from the environment. When that fails, as infamously happened with “the dress”–a photo of a black and blue dress that half the Internet saw as white and gold–we can perceive photos as completely different from what was actually photographed.

Even though humans usually infer “white balance” from our perception of the lighting and surroundings, our brains act differently when looking at a computer screen. Somehow our brains figure out the white point of the monitor (about 6504K for a D65 monitor) and view everything with respect to that white point regardless of the lighting or appearance of other objects in the room. This means, for example, that you need to translate the objective saturation of a color between different lighting conditions. The color emitted from your screen (with a D65 white point) might need to be translated to D50 to print on a piece of paper while still appearing to have the same chroma. If you printed a perfectly calibrated colored square that seemed to match the exact color of a square on your screen, but then looked at the center of each square in turn through a tiny pinhole that didn’t reveal any surroundings, you would think the colors looked different from each other through the pinhole.
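That translation between white points is exactly what a chromatic adaptation transform (CAT) does. Here is a minimal sketch of a von Kries-style adaptation using the standard published Bradford matrix and reference whites; the function names are mine, not darktable’s, but the constants are the ones you’ll find in color science references:

```python
# Von Kries-style chromatic adaptation via the Bradford transform:
# re-express an XYZ color measured relative to a D65 white so it keeps
# the same appearance relative to a D50 white. Matrix values are the
# standard published Bradford coefficients.

BRADFORD = [
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
]
BRADFORD_INV = [
    [ 0.9869929, -0.1470543,  0.1599627],
    [ 0.4323053,  0.5183603,  0.0492912],
    [-0.0085287,  0.0400428,  0.9684867],
]
D65_WHITE = (0.95047, 1.00000, 1.08883)  # XYZ of D65 white
D50_WHITE = (0.96422, 1.00000, 0.82521)  # XYZ of D50 white

def mat_vec(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def adapt_d65_to_d50(xyz):
    """Bradford chromatic adaptation from a D65 to a D50 white point."""
    lms = mat_vec(BRADFORD, xyz)
    lms_src = mat_vec(BRADFORD, D65_WHITE)
    lms_dst = mat_vec(BRADFORD, D50_WHITE)
    # von Kries step: scale each cone-like channel by the ratio of whites
    scaled = tuple(lms[i] * lms_dst[i] / lms_src[i] for i in range(3))
    return mat_vec(BRADFORD_INV, scaled)
```

A quick sanity check: adapting D65 white itself returns D50 white, i.e., “what looked white before still looks white after”.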

So now we come to the video I mentioned above, that probably contains the answer to nailing the colors in all my photographs, but that I don’t understand yet. The video suggests first setting the white balance module, then setting the color calibration module as follows (even though setting both white balance and color calibration generates a warning):

First, set the white balance module according to the procedure described at the end of the color calibration module description in the darktable manual:

The ability to use standard CIE illuminants and CCT-based interfaces to define the illuminant color depends on sound default values for the standard matrix in the input color profile module as well as reasonable RGB coefficients in the white balance module.

Some cameras, most notably those from Olympus and Sony, have unexpected white balance coefficients that will always make the detected CCT invalid even for legitimate daylight scene illuminants. This error most likely comes from issues with the standard input matrix, which is taken from the Adobe DNG Converter.

It is possible to alleviate this issue, if you have a computer screen calibrated for a D65 illuminant, using the following process:

  1. Display a white surface on your screen, for example by opening a blank canvas in any photo editing software you like
  2. Take a blurry (out of focus) picture of that surface with your camera, ensuring that you don’t have any “parasite” light in the frame, you have no clipping, and are using an aperture between f/5.6 and f/8,
  3. Open the picture in darktable and extract the white balance by using the spot tool in the white balance module on the center area of the image (non-central regions might be subject to chromatic aberrations). This will generate a set of 3 RGB coefficients.
  4. Save a preset for the white balance module with these coefficients and auto-apply it to any color RAW image created by the same camera.

Second, apply this white balance preset when using a color checker chart in color calibration, as also described in the manual. The color checker chart can be photographed under D50 to D65 lighting conditions for this procedure, producing a “generic profile that will be suitable for any daylight illuminant with only a slight adjustment of the white balance.” Save this as a preset, too, and then apply both the white balance and color calibration presets to your photos, ignoring the warning.
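As I understand it, the “spot” step in that procedure essentially averages each raw channel over the neutral area and normalizes to green. A sketch of that arithmetic (the function name and data layout are mine for illustration, not darktable’s internals):

```python
# Derive white balance RGB coefficients from a neutral patch: average
# each raw channel over the patch, then express the multipliers that
# make the patch neutral, anchored so green = 1.0. This mirrors what a
# spot white balance measurement stores in a preset, conceptually.

def wb_coefficients(patch):
    """patch: iterable of (r, g, b) raw values sampled from a neutral area."""
    n = 0
    sums = [0.0, 0.0, 0.0]
    for r, g, b in patch:
        sums[0] += r
        sums[1] += g
        sums[2] += b
        n += 1
    means = [s / n for s in sums]
    # Multipliers that equalize the channels, with green as the anchor
    return tuple(means[1] / m for m in means)
```

For example, a patch averaging (0.4, 0.8, 0.6) yields coefficients of roughly (2.0, 1.0, 1.33): red must be doubled and blue boosted by a third to make that patch gray.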

So the first thing I don’t fully understand is what the white balance module is really doing here. Since color calibration runs after white balance, I guess it should be able to undo whatever you have done in the white balance module. So then why does it matter at all? I guess because of the mysterious input color profile that flips out on “unexpected white balance coefficients,” but then I don’t understand what input color profile does or why color calibration can’t fix it either way.

The second thing I don’t understand is how to make use of my camera’s own pretty good determination of white balance. Is there a way to feed it into this flow? For many photos I’d like to be able to set the color calibration illuminant to “as shot by camera”–but will that work? In particular, will setting the illuminant to “as shot by camera” completely override the color calibration preset I made with the color checker, or will it be layered on top of that color calibration? If layered, why would this work–isn’t the white balance recorded by the camera something that is supposed to be applied directly to the color in the camera, not to the colors after they have already been translated to D65 lighting and then calibrated with a color-checker-derived matrix? In particular the camera’s white balance doesn’t know whether I shot my color checker at D50 or D65 or somewhere in between, so how can this possibly work?

Similarly, would it work to use the dropper in color calibration to set white balance based on a white square in my photo after applying the colorchecker-based color calibration preset?

I’m still so confused by all of this. I did find a web site that explains a lot about color, but it’s a lot to digest and figure out how to apply to darktable…

@Frisco : have you tried disabling color calibration, and switching white balance to simply as shot, instead of camera reference? I think many have reported issues with wrong reference values for Sony cameras, which then breaks colours. With the legacy way, you don’t have such problems. (Sorry if this is not new; I did not read the whole thread, as it’s quite long.)


Going to show you what white balance multipliers are for, in any software…

First, let’s look at a normal raw development. Showing you this in rawproc, my hack software, because the processing pipeline is laid out for you to see and I can arbitrarily turn on/off any operation:

Rather normal color and tone, no? Okay, take a look at the upper left pane, this is where the toolchain is listed and managed. These operations are applied to the raw data in order from top to bottom. Note that the last tool, the tone curve is checked, that’s the tool whose results are displayed in the main pane. The selected tool, the one with the white background, has its parameters displayed in the bottom left pane. I’ve selected white balance for this because I’m now going to turn it off by unchecking the box in the top left of the parameters pane:

Same processing, just without white balance. THAT is what you’re correcting with those multipliers. Note the histograms of the two screenshots; the first one’s right-hand peaks are pretty well aligned, and the second one’s are not. This works for most but not all images: adjusting the white balance multipliers so those peaks line up is one way to get a decent white balance.

I picked my Nikon D7000 for this illustration, because the effect is quite apparent. Each camera has their own characteristic spectral response, and it’s that “un-evenness” that needs correction. Here’s a plot of the D7000 spectral response:

nikon_d7000_ssf

Note that the so-called “red” channel is deficient in energy relative to the others; that’s why the white balance RGB multipliers for this image, 2.097656, 1.000000, 1.316406, have such a heavy hand in the first number.
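For the curious, here is the whole operation in a few lines, using the D7000 multipliers quoted above. The function name and clipping convention are illustrative, not rawproc’s actual code:

```python
# Apply white balance multipliers to raw RGB triples: each channel is
# scaled by its coefficient, then clipped to the valid range. A gray
# patch lit by the scene illuminant ends up with roughly equal channel
# values afterwards. Multipliers are the D7000 values from the post.

WB_MULTS = (2.097656, 1.000000, 1.316406)  # R, G, B

def apply_wb(pixel, mults=WB_MULTS, clip=1.0):
    return tuple(min(c * m, clip) for c, m in zip(pixel, mults))
```

For example, a gray recorded by the sensor as roughly (0.229, 0.480, 0.365) comes out with all three channels near 0.48 after the multipliers, which is exactly the channel-alignment effect visible in the histograms above.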

You asked… :laughing:


Quoting myself from earlier:

Notice that WB comes before demosaic and CC comes after. They essentially have very different data to work with, which changes what they can do and the assumptions they can make.
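To make that concrete: before demosaicing there are no RGB pixels at all, only a single-channel Bayer mosaic, so white balance multiplies each photosite by the coefficient for that site’s CFA color. Here is a tiny sketch assuming an RGGB pattern (the actual CFA layout depends on the camera), with made-up multipliers:

```python
# White balance on an undemosaiced Bayer mosaic: each photosite holds
# one value, and we scale it by the multiplier for its CFA color.
# Color calibration, which runs after demosaic, sees full RGB pixels
# instead. RGGB layout and multipliers here are illustrative.

CFA_RGGB = [["R", "G"], ["G", "B"]]  # 2x2 tile, repeated over the sensor
MULTS = {"R": 2.0, "G": 1.0, "B": 1.5}

def white_balance_mosaic(mosaic, mults=MULTS):
    """mosaic: 2D list of raw photosite values; returns balanced mosaic."""
    out = []
    for y, row in enumerate(mosaic):
        out.append([v * mults[CFA_RGGB[y % 2][x % 2]]
                    for x, v in enumerate(row)])
    return out
```

This is why the two modules “have very different data to work with”: WB is a per-photosite scale that helps demosaicing, while CC can do a full matrix-style chromatic adaptation on reconstructed colors.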

Unless you have some specific issues or needs, that’s probably all you need to know about the WB module. In other words, leave it at the recommended settings and don’t worry any more about it.

Don’t remember which one, but Aurélien explains at least some of it in one of his videos.

And, if you don’t know, here’s what a RAW file looks like before being demosaiced:

Thanks. This makes concrete something that I already sort of understood, that the red, green, and blue sensors need to be normalized, since there’s no reason that they should have the same sensitivity. But what I still don’t understand is:

  • Why normalize them to D65, instead of, say, D50, or even the lighting conditions under which you took the particular photo you are editing?
  • Why doesn’t the camera’s reported white balance already include these coefficients? Is there an unstated standard that the camera’s white balance is reported relative to D65 white?
  • Assuming D65 is arbitrary and the camera’s white balance is not relative to D65, wouldn’t normalizing everything to D65 essentially invalidate the camera’s white balance estimation (which I would like to use in many cases)?
  • In darktable, why isn’t the calibration of coefficients done by the color calibration module matrix? In particular, why isn’t the warning about white balancing twice a legitimate warning that I’m doing something that will produce incorrect results?

DT is simply set to fire an error if you modify the D65 values it provides. When you adjust those to give a more accurate or appropriate set of coefficients to the CC module, you will get a better result, but you see the error simply because DT has noticed you have modified them. It thinks you have done a double white balance, for lack of a better term, so don’t worry about it.

From the manual…

Chromatic adaptation is controlled within the Chromatic Adaptation Transformation (CAT) tab of the color calibration module. When used in this way the white balance module is still required as it needs to perform a basic white balance operation (connected to the input color profile values). This technical white balancing (“camera reference” mode) is a flat setting that makes grays lit by a standard D65 illuminant look achromatic, and makes the demosaicing process more accurate, but does not perform any perceptual adaptation according to the scene. The actual chromatic adaptation is then performed by the color calibration module, on top of those corrections performed by the white balance and input color profile modules. The use of custom matrices in the input color profile module is therefore discouraged. Additionally, the RGB coefficients in the white balance module need to be accurate in order for this module to work in a predictable way.

EDIT

Taken from the link I provided earlier… I think (and I am no expert) that DT is basically taking this approach as being the best way to handle this aspect of digital photography…

This is going to be a bit confusing, just bear with me…

White Balance in post-processing isn’t at all about color temperature. Most software gives a temperature interface, but that has to be translated to multipliers to be applied, and the translation is not very accurate. You left the real color temperature at the scene; once the camera encodes the spectrum into those paltry RGB triples, color temperature becomes a coarse approximation.
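To illustrate how coarse that approximation is: software typically estimates CCT from the image’s xy chromaticity with a closed-form fit such as McCamy’s well-known formula. The constants below are McCamy’s published ones, not darktable’s actual implementation:

```python
# McCamy's approximation for correlated color temperature (CCT) from
# CIE 1931 xy chromaticity. This is the kind of coarse mapping raw
# software hides behind a "temperature" slider; it is only meaningful
# near the Planckian locus, and the real spectrum is long gone.

def mccamy_cct(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33
```

D65’s chromaticity (0.3127, 0.3290) comes out near 6500 K and illuminant A (0.4476, 0.4074) near 2856 K, but nothing in this mapping knows what light was actually at the scene; it’s a fit to three numbers.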

Color temperature is used to construct the camera profile, and thus becomes a factor in the down-transform of the raw in camera space to the working colorspace or rendition colorspace. This transform is about gamut, not white balance. D65 was established as the default by dcraw, where Dave Coffin includes a humungous table of camera color primaries that are calculated as D65 primaries:

https://github.com/ncruces/dcraw/blob/master/dcraw.c#L7114

Adobe has a rather convoluted scheme centered around “dual-illuminant” profiles, where they calculate a set of primaries interpolated to a color temperature between D65 and (usually) StdA, but really, using D65 for most images is not egregious.

How-how-ever, if one is clever, the white balance correction can be incorporated into the gamut transform. I’ve done that by not white-balancing my colorchecker target shot, and holy cow, I get nice colors without any use of the white balance multipliers. I think the recent darktable white balance tool does something similar, which is why the whole business of color temperature seems to be wrapped in white balancing. It’s definitely a better way of doing white balance, rather than slewing the channels all around with the multipliers.