are custom color matrices still relevant?

I noticed that only a small fraction of the supported cameras have a custom matrix, and if I am reading the code correctly the latest nontrivial changes to the codebase happened 7 years ago.

Are custom color matrices irrelevant for contemporary cameras? Or do people just create a preset in color calibration with a color checker and be done with it?


An arbitrarily generated custom color matrix might not be the best solution for your camera, since you don’t know whether the matrix was calculated under proper conditions. So unless you’re generating custom color matrices for each illumination situation yourself, using a color target with color calibration is the far easier approach.


Many of the color matrices are not custom made by users or developers, but are borrowed from the DCPs provided by the Adobe DNG Converter. These are essential for good color rendition and are quite sufficient in most cases.

many of the cameras that have a custom matrix are actually just rebrandings of the same camera.

mostly these were done by pascal de bruijn, who was unsatisfied with the tradeoffs taken by adobe/canon. in those days, the reds especially were often very weak, so his matrices were a real improvement. he also took care of quality control, so “custom” doesn’t mean these were just a bunch of numbers we got from a random dude on the internet.

i don’t think these would do much over a dcp, and i haven’t seen such bad matrices in a while. maybe i haven’t been looking so much, or maybe the process has improved over the years.


I don’t think these matrices were “arbitrarily generated”. Maybe the users who contributed them didn’t have top-notch lab equipment and made some mistakes, so there could be some measurement noise, but they were nevertheless based on data.

I would rephrase my question like this:

  1. is a linear transformation sufficient to capture this correction reasonably well under general circumstances (for a specific sensor), or do you need a specific correction based on a particular photo taken with that sensor? My understanding is that modern sensors are linear (as recorded in the raw file), so while the quality of light etc. may matter when calibrating these values, once they are estimated they apply generally.

  2. if the answer is yes, did people stop contributing these because they require an investment of time and money (color chart), or because another tool superseded this?
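To make question 1 concrete: the linear transformation in question is just a 3×3 matrix applied per pixel to the linear camera RGB. A minimal sketch (the matrix values below are made up for illustration, not from any real camera):

```python
import numpy as np

# Hypothetical camera-to-XYZ matrix -- illustrative numbers only,
# not measured from any real camera.
cam_to_xyz = np.array([
    [0.61, 0.23, 0.11],
    [0.28, 0.69, 0.03],
    [0.02, 0.08, 0.75],
])

def apply_matrix(image, matrix):
    """Apply a 3x3 color matrix to an (H, W, 3) linear image."""
    return np.einsum('ij,hwj->hwi', matrix, image)

pixel = np.array([[[0.5, 0.4, 0.3]]])   # one linear camera-RGB pixel
xyz = apply_matrix(pixel, cam_to_xyz)
```

Because the transform is linear, doubling the exposure doubles the output, which is exactly why a matrix estimated once can apply generally, as long as white balance is handled separately.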

It’s a rather funny business, camera profiling…

The goal of it is to cobble together a set of input conditions that allow the transform of the rich camera data to the oh-so-limited color capabilities of rendition media - your display, the ink/paper in your printer, etc. That transform is an application of some sort of math that grabs a beautiful camera color and plops it inside the closet-sized gamut boundary of the destination. In doing that something is lost: hue fidelity, saturation gradation, or both, depending on the chosen math gonkulator.

All of those gonkulators are anchored at the destination end of the transform journey by a white point. If your raw processor is working with a single matrix for each camera, that matrix is probably anchored at D65, nice bright sunlight in a clear blue sky, thank you Dave Coffin. Great, if all your photography is in daylight, but what happens when you move inside and are saddled with somewhere-around-StdA lighting? Well, such a raw processor is going to use the D65 matrix and expect you to futz with white balance to make nice. Actually, this works okay for a lot of uses; tell me, did you ever notice the difference?
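That single-matrix-plus-white-balance workflow can be sketched like this: the per-channel multipliers do all the adaptation work, while the matrix itself never changes. (All numbers are invented for illustration.)

```python
import numpy as np

# One fixed "D65" matrix; white balance multipliers do the adaptation.
# All values here are illustrative, not from a real camera.
D65_MATRIX = np.array([
    [0.61, 0.23, 0.11],
    [0.28, 0.69, 0.03],
    [0.02, 0.08, 0.75],
])

def develop(raw_rgb, wb_multipliers):
    """White-balance camera RGB per channel, then apply the fixed matrix."""
    balanced = raw_rgb * wb_multipliers
    return balanced @ D65_MATRIX.T

neutral = np.array([0.5, 0.5, 0.5])
daylight = develop(neutral, np.array([2.0, 1.0, 1.5]))  # daylight-ish gains
tungsten = develop(neutral, np.array([1.2, 1.0, 2.6]))  # heavier blue gain indoors
```

The same matrix is used in both cases; only the gains change, which is why the result is usually close enough that most people never notice.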

So, a custom matrix for various illuminants is worth considering. Adobe has tried to capture that fidelity in dual-illuminant DCPs; those contain two ColorMatrix-es, whose sole purpose is to provide the endpoints for estimating the color temperature of the scene, which is then used to interpolate a matrix between the two corresponding ForwardMatrix-es. Geesh…
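A sketch of that interpolation step, assuming (as the DNG approach does) that the blend weight is linear in inverse color temperature (mireds) between the two calibration illuminants, here StdA (~2856 K) and D65 (~6504 K); the matrices are placeholders:

```python
import numpy as np

# Placeholder matrices for the two calibration illuminants.
matrix_std_a = np.eye(3) * np.array([1.3, 1.0, 0.7])
matrix_d65 = np.eye(3)

def interpolate_matrix(cct, m_low, m_high, cct_low=2856.0, cct_high=6504.0):
    """Blend two matrices linearly in inverse correlated color temperature."""
    cct = np.clip(cct, cct_low, cct_high)
    # weight in mired space (1e6 / CCT): 1 at cct_low, 0 at cct_high
    w = (1e6 / cct - 1e6 / cct_high) / (1e6 / cct_low - 1e6 / cct_high)
    return w * m_low + (1 - w) * m_high

m = interpolate_matrix(4000.0, matrix_std_a, matrix_d65)
```

At either calibration temperature the blend returns the corresponding endpoint matrix exactly; in between, warm scenes lean toward the StdA matrix and cool scenes toward the D65 one.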

A careful studio photographer will do something like 1) choose lighting devices with consistent color temperature, and 2) shoot a ColorChecker under that lighting and make a profile specific to it. So, pretty relevant there…

To my myopic thinking, a camera profile should faithfully characterize the colorimetric-ness of a camera as the starting point for further “damage” to an image. But there’s a school of practice that likes to make camera profiles that depart from that colorimetric anchor to inflict a “look” on the image. If you’ve graduated from that school, custom color matrices are very relevant. Me, I’d rather save such damage for infliction later in the pipeline, and not confuse the fundamental intent of the camera profile.

Wow, now that I’m retired, I find I can write drivel all day… :crazy_face:


I don’t think anyone has produced such a camera, one that meets the Luther-Ives condition. Matrix or LUT, getting camera data into the tristimulus-based color structure is a compromise.


That goes to the whole point of general-purpose photography, making pleasing images to support various communication. That they’re not colorimetrically faithful is just a passing note…


Don’t forget the issue with matrices in general:

This is why many Adobe DCPs and, by default, dcamprof DCPs actually use a matrix that effectively “desaturates” the colors (avoiding the problem of negative Y values in the process), and then apply a 2.5D LUT in the ProPhoto HSV space to resaturate the color information

As to what a “2.5D” LUT is - it assumes that a camera’s response to changes in amplitude is linear, so it does not take amplitude (V) as an input to the LUT (because the camera’s response should not be affected by input amplitude) - but it can shift V based on H and S. (The rough equivalent for those familiar with DSP algorithms is an FIR filter: a linear amplitude response but NOT a flat frequency response.)
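A toy illustration of that amplitude-linear property: the table below is indexed only by hue and saturation, yet each entry can still scale V on output. Bin counts and table values are invented, and a real DCP HueSatMap also interpolates between bins, which is omitted here for brevity.

```python
import colorsys
import numpy as np

# "2.5D" LUT: indexed by (H, S) only; V is never an input, but each
# entry outputs a (hue shift, S gain, V gain) triple.
H_BINS, S_BINS = 8, 4
lut = np.zeros((H_BINS, S_BINS, 3))   # entries: (dH, S gain, V gain)
lut[..., 1:] = 1.0                    # identity: no shift, unit gains
lut[2, 3] = (0.0, 1.2, 1.0)           # e.g. resaturate one hue/sat region

def apply_2p5d_lut(r, g, b):
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    hi = min(int(h * H_BINS), H_BINS - 1)   # nearest-bin lookup,
    si = min(int(s * S_BINS), S_BINS - 1)   # no interpolation
    dh, s_gain, v_gain = lut[hi, si]
    return colorsys.hsv_to_rgb((h + dh) % 1.0, min(s * s_gain, 1.0), v * v_gain)

out = apply_2p5d_lut(0.8, 0.4, 0.2)
```

Since V is not a LUT input, scaling the input RGB by a constant scales the output by the same constant - the FIR-like linearity described above.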

As to colorimetric accuracy - that is indeed the goal of reproduction profiles. (Repro profiles are meant for accurate reproduction of a characterized object, and NOT intended to be visually pleasing)


So, accepting that perfect reproduction of color is not possible and everything is an approximation, what do you recommend in practice in Darktable 4.2 and later, other than calibrating every single light/camera combo?

Eg should I

  1. take a color chart and a D65 illuminant, calibrate in color calibration (set to bypass), and place it after white balance?
  2. do this, but for various lighting situations that I expect to occur?
  3. something else?

I would like recommendations on this, too, before I spend time redoing all the custom calibrations I lost when updating to 4.2


After doing all manner of camera profiling, from fighting target shot glare, to making optical equipment to measure my camera’s spectral sensitivity, here’s where I’m at:

For casual photography, the raw-processor-supplied matrices (usually D65) work just fine. If I need to, I adjust the “as-shot” white balance with a variety of methods; I find a simple “gray-world” automatic WB gets close, and from there I scooch it around to taste.

When I shoot a subject with extreme hues, I’ll dig out my spectral profile. But I’m not so sure I couldn’t get decent gradations with the new color tools; I’m already noticing such a difference with vkdt that it might reduce my reliance on my oh-so-hard-to-come-by profiles made by measuring the camera’s spectral sensitivity with a spectroscope.

If I did studio work, I’d probably make a good target-shot profile in that light. Although, having spectral data for the camera eliminates the need for an illuminant-specific target shot; from that data one can make a profile for any color temperature.
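That last step - going from spectral data to a matrix for an arbitrary illuminant - is essentially a least-squares fit. A sketch with entirely synthetic spectra (the Gaussian sensitivity curves, flat illuminant, and random training reflectances are all placeholders for real measurements):

```python
import numpy as np

wl = np.arange(400.0, 701.0, 10.0)        # wavelength samples, nm

def gauss(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Synthetic stand-ins for measured curves.
cam_sens = np.stack([gauss(600, 40), gauss(540, 40), gauss(460, 40)])  # camera R,G,B
cmfs = np.stack([gauss(595, 45), gauss(555, 45), gauss(450, 35)])      # toy x,y,z CMFs
illuminant = np.ones_like(wl)             # flat equal-energy illuminant

# Random training reflectances (a real fit would use e.g. target patch spectra).
reflectances = np.random.default_rng(0).uniform(0.05, 0.95, (50, wl.size))

# Integrate stimulus * sensitivity to get camera RGB and reference XYZ.
stimulus = reflectances * illuminant
rgb = stimulus @ cam_sens.T               # what the camera records
xyz = stimulus @ cmfs.T                   # what the observer sees

# Solve xyz ~= rgb @ M for the best 3x3 matrix in the least-squares sense.
M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
cam_to_xyz = M.T                          # per pixel: xyz = cam_to_xyz @ rgb
```

Swapping in a different illuminant SPD and re-running the fit yields a matrix for that light, which is exactly why spectral data removes the need for an illuminant-specific target shot.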