Colors out of gamut with no extra processing

Hey all -

This has been confusing me for a while. I’ve got a night photo with a lot of contrast and saturated colors that I’ve never been able to clear the gamut checker on, even immediately after importing with no extra modules added (default processing workflow set to “none”). Here is my thinking; let me know if I’m missing anything:

  • Shooting:
    • Nikon D7200, Auto WB (as-shot ended up at 3186K), Adobe RGB in-camera gamut
    • loading raw NEF directly into DT, though this also happens when I use Nikon’s NX Studio to directly export a TIFF with embedded Adobe RGB profile
  • Using DT 4.6.1, Windows 11 v10.0, build 26120
  • When I use the default input matrix, rec2020 working profile, rec2020 soft proof profile, and rec2020 output profile (snapshot of this case at end),
    • I would expect there to be no gamut issues (but there is) because:
      • Adobe RGB is fully contained within Rec 2020’s gamut, as far as I can tell from available 2-D plots (maybe this isn’t the case in the full 3-D gamut?)
      • There are no color space changes in the pipeline, other than at the input
    • Using highlight reconstruction doesn’t have an effect either way
    • Same thing happens with everything ProPhoto, though it is slightly less bad
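The “Adobe RGB is contained in Rec 2020” assumption from the list above is easy to sanity-check numerically: both spaces use the D65 white point, so if the Adobe RGB primaries sit inside the Rec 2020 xy triangle, the linear-light gamut is contained as well. A quick plain-Python sketch using the standard published primary coordinates:

```python
# Point-in-triangle test on the CIE xy plane.  Adobe RGB and Rec.2020
# both use D65, so containment of the primaries' triangle implies
# containment of the (linear-light) gamut.

ADOBE_RGB = {"R": (0.6400, 0.3300), "G": (0.2100, 0.7100), "B": (0.1500, 0.0600)}
REC2020 = [(0.7080, 0.2920), (0.1700, 0.7970), (0.1310, 0.0460)]  # R, G, B

def cross(o, a, b):
    """z-component of the cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside(p, tri):
    """True if point p lies inside (or on an edge of) triangle tri."""
    signs = [cross(tri[i], tri[(i + 1) % 3], p) for i in range(3)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

for name, xy in ADOBE_RGB.items():
    print(name, inside(xy, REC2020))  # each prints True
```

All three primaries test as inside, so on the 2-D chromaticity plane the containment does hold (the Adobe RGB blue primary is fairly close to the Rec 2020 red–blue edge, but inside it).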

Any thoughts on why this might be?

If you can post the original NEF file, people can take a look. Meanwhile, I can guess.

Adobe RGB in-camera gamut

As far as I know, this affects the OOC JPEG, but not the raw NEF file.

I suspect that some pixels are not only outside AdobeRGB, and Rec2020, but even outside the CIE chromaticity diagram “horseshoe” (aka “tongue”). So they are, apparently, not visible colours.

Of course, the photographed scene contains only visible colours. The problem is that the input matrix is a compromise, designed for “typical” colours, not the super-saturated colours in this scene. It isn’t accurate for these colours.
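A toy calculation illustrating this: the 3×3 matrix below is invented for illustration (it is not the D7200’s actual matrix), but like real input matrices it has small negative off-diagonal entries. Applied to a maximally saturated camera-blue pixel, it yields a negative Y (negative luminance), which lands below the xy horseshoe, i.e. an “imaginary” colour:

```python
# Hypothetical camera-RGB -> XYZ matrix (invented numbers for
# illustration; NOT a real D7200 matrix).
M = [[ 0.70, 0.25,  0.05],
     [ 0.25, 0.80, -0.05],
     [-0.10, 0.05,  1.05]]

def apply_matrix(M, rgb):
    return [sum(M[r][c] * rgb[c] for c in range(3)) for r in range(3)]

# A maximally saturated "camera blue" pixel: only the blue channel fired.
X, Y, Z = apply_matrix(M, (0.0, 0.0, 1.0))

# Project to xy chromaticity coordinates.
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(X, Y, Z)  # Y is negative: negative luminance is non-physical
print(x, y)     # y < 0: below the xy horseshoe, an "imaginary" colour
```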

The fix is to push the pixels back into the gamut of your chosen colorspace. I don’t know darktable well enough to advise on that.

Including moi. I would open it in RawDigger and look at the raw histogram … could be a clue or two there …

Here ya go!

_DSC0017.NEF (24.8 MB)

The chromaticity diagram comes out like this:

[chromaticity plot: sxs_xyy]

So there are blue pixels that are apparently outside the horseshoe, hence not observable.

And opening the raw in RawDigger and saving as an RGB TIFF …

much the same thing in ColorThink 2D xyY …

Lots and lots of gamut-clipping in sRGB space, let alone Rec.2020. If someone edited to taste in a wide gamut, then saved as sRGB for the web, that would do it.
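For anyone who wants to see that numerically: converting a fully saturated Rec.2020 green to linear sRGB through XYZ, using the standard published matrices, drives the sRGB red channel well below zero. A colour comfortably inside Rec.2020 can thus be far outside sRGB:

```python
# Standard D65 conversion matrices (linear light).
REC2020_TO_XYZ = [[0.6370, 0.1446, 0.1689],
                  [0.2627, 0.6780, 0.0593],
                  [0.0000, 0.0281, 1.0610]]
XYZ_TO_SRGB = [[ 3.2406, -1.5372, -0.4986],
               [-0.9689,  1.8758,  0.0415],
               [ 0.0557, -0.2040,  1.0570]]

def mul(M, v):
    return [sum(M[r][c] * v[c] for c in range(3)) for r in range(3)]

# Fully saturated Rec.2020 green -> XYZ -> linear sRGB.
srgb = mul(XYZ_TO_SRGB, mul(REC2020_TO_XYZ, [0.0, 1.0, 0.0]))
print(srgb)  # red component is strongly negative: out of sRGB gamut
```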

I wonder if it would ever be possible or desirable to have a raw histogram option incorporated into DT? I for one would find this interesting.


You can also use the “rgb primaries” module in darktable to reel those primaries back into your desired gamut, by reducing the purity of the problematic primaries.

Don’t worry about the gamut check too much, unless you see loss of detail or colour shifts that bother you. As far as I know, darktable’s pipeline usually handles even out-of-gamut colours (those with negative components) well. There are a few places where colours are compressed/gamut-mapped (for example, filmic, color balance rgb, color calibration), and one can also apply hard clipping in color calibration, if needed.
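To make the clipping-vs-compression distinction concrete, here is a toy sketch of the two ideas (simplified illustrations, not darktable’s actual algorithms): hard clipping just zeroes negative channels, while a simple compression blends the pixel toward a gray of its own luminance just far enough to make every channel non-negative, preserving luminance at the cost of some saturation.

```python
def luminance(rgb):
    """Linear-light luminance with Rec.2020 luma weights."""
    r, g, b = rgb
    return 0.2627 * r + 0.6780 * g + 0.0593 * b

def hard_clip(rgb):
    """Zero out negative channels; shifts hue and lightness."""
    return tuple(max(0.0, c) for c in rgb)

def compress(rgb):
    """Blend toward a same-luminance gray just enough to reach gamut
    (assumes the pixel's luminance is positive)."""
    y = luminance(rgb)
    # smallest t in [0, 1] with (1-t)*c + t*y >= 0 for every channel c
    t = max((-c / (y - c) for c in rgb if c < 0.0 < y - c), default=0.0)
    return tuple((1.0 - t) * c + t * y for c in rgb)

oog = (-0.10, 0.50, 0.80)  # out of gamut: negative red channel
print(hard_clip(oog))       # (0.0, 0.5, 0.8) -- luminance changes
print(compress(oog))        # all channels >= 0, luminance preserved
```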

Oh interesting! Thanks everyone this is super helpful!

@snibgo, what did you use to make that plot, out of curiosity? @Terry It would be so cool to see pixels plotted relative to the bounds of various color spaces in DT! More or less like the CIE plots shown here. I’ve been wanting to see something like that in order to wrap my head around it and was thinking about breaking out python or matlab/octave to do it haha.

In the interest of really, actually understanding this, let me try to talk through it from the ground up; please correct me where needed. Also, apologies if this is too deep down the rabbit hole, I’m just very interested to learn :slight_smile:

— Deep Dive —
Ok so first, each sensor photosite collects light within a given band (approximately red, green, or blue) and outputs a number from (for the sake of argument) 0 to 1, proportional to the amount of light it collected. Any combination of pixel values at this stage has real physical meaning, and so can’t really be outside of the CIE 1931 color space (the horseshoe, as @snibgo put it), even though the values aren’t in a well-defined CIE color space yet.

Those pixels are then more or less interpolated (demosaiced) in one of a handful of ways, generating single pixels with three color channels each (values also between 0 and 1). A number of artifacts can creep in at this point (aliasing, zippering, etc.), but these are still values that correspond directly to the amount of light received in a certain band, so there can’t really be a combination that is non-physical.

The trick, however, is that these RGB values refer to the total amount of light within a certain band, as opposed to the combination of a few very specific primary colors that would be required to create the impression of the original color the sensor was exposed to. Translating between the two is what the input color matrix does, and it is not a trivial task: our cones have quite a lot of overlap, so it cannot just be a direct mapping.
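The “total amount of light within a band” point above is exactly metamerism, and a toy model shows it. With box-shaped channel responses (an invented sensor, purely for illustration), two physically different spectra that integrate to the same per-band totals yield identical camera RGB, so no matrix applied afterwards can tell them apart:

```python
# Toy sensor: three box-shaped spectral responses in 10 nm bins
# (invented for illustration; real channel responses overlap).
BANDS = {"B": range(400, 500, 10), "G": range(500, 600, 10), "R": range(600, 700, 10)}

def camera_rgb(spectrum):
    """Per-channel sum of power; spectrum maps wavelength (nm) -> power."""
    return tuple(sum(spectrum.get(w, 0.0) for w in BANDS[ch]) for ch in ("R", "G", "B"))

spike = {450: 10.0}                            # a single narrow spectral line
broad = {w: 1.0 for w in range(400, 500, 10)}  # flat power across the blue band

print(camera_rgb(spike), camera_rgb(broad))  # identical: (0.0, 0.0, 10.0) twice
```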

Next, we want to map this to a device-independent tri-stimulus space like XYZ or LMS, and this is where imaginary colors can get generated. Ideally, this mapping would be contained within CIE 1931, but any map between these two spaces has to be based on real-world measurements and so probably requires some interpolation. If that is done by a fitted matrix rather than a simple LUT, I would guess it’s optimized for more “common” colors and likely has some distortions at the extremes, which could accidentally bleed into imaginary colors?

— TL;DR —
The implication here is that the fix for the imaginary color problem is to get a better input color matrix that is optimized for more saturated colors (and for my specific camera, for that matter). Would calibrating with a color card do the trick you think?

As for just being outside of any usable color space like rec2020, it seems like it is just normal for a camera’s gamut to be way wider than most usable color spaces, so I just need to artificially tweak the colors to get what I want (as @AD4K said), or else just live with the clipping DT does automatically.

— end —

@kofa: yeah, I guess I’ve been worried about it for two reasons: 1) I’m just curious and would like to better understand what’s actually happening, and 2) I’m planning to get a print made of some version of this, and I’d like to know where the limits of the printer’s profile are so I can get the most vibrance out of it that I can. It would be really nice to be able to see where in the pipeline things are being clipped, both for debugging and to get a better handle on #2. I think being able to plot pixels against the color space in question (as @Terry was suggesting) would be super useful for that as well.

I think if your sensor is somewhat sensitive in the range outside of what the ‘horseshoe’ represents (very low red, near infrared and the same situation at the other end of the spectrum), you could have signal corresponding to non-/hardly visible radiation, which can only be represented by ‘imaginary colours’.

Yeah I was thinking that too, but the camera/computer doesn’t have any information about what wavelengths were stimulating, e.g., the red pixels; it only knows how much those pixels were stimulated. It is certainly possible that the input color matrix is somehow able to distinguish this (maybe when the red pixel is really truly the only thing being stimulated it assumes the color is near-infrared, but that could just as easily be a nearly pure tone in the normal range, e.g. from a screen), but I feel like the default color matrix wouldn’t be that extreme? Idk that’s speculation on my part though.

The raw exposure indicator shows some channels blown, so you have missing data to start with. You can use the vectorscope as a guide for saturation and color gamut excursions; it will show them based on what you set for the histogram color space. You also need to check your chosen display profile, as it can clip and thereby mask gamut excursions. Your displayed image will change (don’t worry), but if switching your display profile between linear Rec.2020 and your current one changes the gamut excursions, then your display profile is likely clipping. There is no problem with unbounded profiles, but many can introduce a false reading due to clipping, because the display profile is applied before the histogram profile processing in the pipeline; a bit of a glitch for certain profiles.

Interesting. If I put the working/soft-proof profiles on ProPhoto and swap between the system display profile and ProPhoto, I don’t see much difference in clipping. I’m pretty surprised, though; my display is decent, but I didn’t think it was that good, haha. When I change the display to sRGB, it does change things, so it is doing something at least.

Didn’t know it comes before the histogram! You’d think that would at least be switchable, since I’d want to know what’s going into the final file I’m exporting, regardless of what display I’m viewing it on…

Unfortunately, the vector scope/histogram doesn’t work for the particular printer’s profile I want to use (because that’s just my luck haha). I get a popup that says “unsupported vector scope profile. it will be replaced by linear rec2020”, but the display goes blank.

Sorry, a bit over-simplified, I reckon.

For example, some cameras collect a little IR in the red channel and that really is “outside of the CIE 1931 color space”.

As you know, IR is not “light”.

Fair point. Maybe a more precise way to put it is: they are realizable within CIE 1931, since they cannot have been created by a source outside the curve of pure frequencies, as would be the case for imaginary colors. However, because information is lost by binning large groups of wavelengths into just three channels, it is possible for some sets of pixel values to be metamers of spectra that do contain wavelengths outside the visible range CIE 1931 covers. It could be interesting to find out whether some “colors” in, say, IR are distinguishable from otherwise normal colors, maybe through leakage into neighboring pixels or something, but that sort of falls into the realm of defining the input color matrix, I guess.

This flower is a good image to demonstrate if your profile might clip and hide some out of gamut…

It’s close to, if not already out of, gamut, so a small tweak can send it right over the top. Then you can try your linear profile / display profile swap and see whether you get any marked gamut changes. I found that my calibrated profile did indeed do some clipping; it was a shaper/matrix profile that I made using DisplayCAL and my X-Rite screen calibrator.

Well, when I opened this it seemed to have bugged out because I just get a mess of white pixels. When I go all the way back to white balance in the history it gives me just purple (and the white balance module sliders are set to extremes). However, if I reset the history, everything is fine and I don’t see any gamut clipping with the system display profile, so I think we’re good there.

Duplicated and reset sidecar attached.

2024-06-02-6885_01.ARW.xmp (6.0 KB)

Oh if anyone has any recommendations about debugging the problem with the histogram, here is the offending icc profile.

BayPhoto_MetalPrints.icc (9.1 MB)

That is a printer profile. Can dt open it as RGB and provide a meaningful histogram?

Here’s the CMYK curves: