Color space nightmares: gamut clipping, WTF?

This post is a collective answer to https://discuss.pixls.us/t/color-calibration-test-and-some-thoughts/ and some others on the same topic.

What is a color space?

https://darktable-org.github.io/dtdocs/special-topics/color-management/color-spaces/

TL;DR: a color space is a shortcut to represent a light spectrum as a linear combination of 3 reference lights: the primaries. That representation was chosen because it's what the 3 types of cone cells in the retina do as well. It means we can create color spaces whose primaries match the perceptual ones: the LMS spaces, used for chromatic adaptation.
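For the curious, here is a minimal numpy sketch of von Kries-style chromatic adaptation in such an LMS-like space, using the well-known Bradford matrix and the standard D65/D50 white points (a sketch of the principle, not darktable's exact implementation):

```python
import numpy as np

# Bradford cone response matrix: XYZ -> LMS-like space
M_BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def chromatic_adaptation(xyz, white_src, white_dst):
    """Von Kries adaptation: scale each cone channel by the ratio of the
    destination white to the source white, as seen through Bradford LMS."""
    lms_src = M_BRADFORD @ white_src
    lms_dst = M_BRADFORD @ white_dst
    gain = np.diag(lms_dst / lms_src)
    M = np.linalg.inv(M_BRADFORD) @ gain @ M_BRADFORD
    return M @ xyz

D65 = np.array([0.95047, 1.0, 1.08883])
D50 = np.array([0.96422, 1.0, 0.82521])

# adapt a D65-referred color to D50
print(chromatic_adaptation(np.array([0.3, 0.4, 0.5]), D65, D50))
```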

But not all RGB spaces are created equal: their primaries, once seen by a human observer, may look more or less saturated. Some primaries even allow combinations that don't exist in human vision: we call them imaginary colors, and they are tricky because they are encoded with perfectly valid code values.


This graph shows the visible locus (the colored horseshoe); the triangle overlay is the gamut of the ProPhoto RGB space (each corner of the triangle is a primary). Wherever the triangle escapes the horseshoe, the color space can encode imaginary colors with perfectly valid RGB values.

From the triangular geometry of an RGB gamut, you can deduce that no RGB color space will ever perfectly match the visible locus: you either let imaginary colors in or shut visible colors out. Like Rec2020:

For reference, the sRGB gamut:

And one camera RGB gamut with a standard color profile (from the Adobe DNG converter), later chromatically adapted to D50 using various methods:

[image: camera RGB gamut vs. the visible locus]
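To make "imaginary or out-of-gamut colors with perfectly valid code values" concrete, here is a small numpy sketch (the sample XYZ triplet is made up): convert XYZ to linear sRGB with the standard D65 matrix; a negative channel means the color falls outside the sRGB triangle.

```python
import numpy as np

# standard XYZ (D65) -> linear sRGB matrix
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def out_of_srgb_gamut(xyz):
    """True where the color cannot be reproduced by the sRGB primaries:
    at least one channel would need a negative amount of light."""
    rgb = xyz @ XYZ_TO_SRGB.T
    return np.any(rgb < 0, axis=-1)

# a highly saturated cyan-ish stimulus: valid XYZ, not reachable in sRGB
print(out_of_srgb_gamut(np.array([0.2, 0.5, 0.6])))  # True
```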

The largest is not the bestest

There is this belief that the largest space is the best. Well, that depends: the best for what? For archival purposes (that is, saving end products or sharing renderings between apps), maybe. But for any kind of color grading (that is, artistic and corrective color modifications), any space that allows imaginary colors can create problems by letting users push colors out of the gamut. Which is a bit sad if your original colors were in fact in gamut, and it will also make gamut mapping at export time much more challenging.

Also, the primaries matter for perceptual consistency. For example, here is a hue gradient derived from sRGB through HSL:
[screenshot: sRGB/HSL hue gradient]

And a hue gradient derived from a special RGB space, designed for color grading, through Yuv:
[screenshot: color-grading RGB/Yuv hue gradient]

See how the secondary colors (yellow and magenta) get almost no range in sRGB, while green sucks up more than a quarter of the range for itself? Also, see how blue and red look much more saturated than the rest of the gradient in sRGB, while yellow feels brighter? In the color-grading RGB, the saturation is more even and each of the 3 primary and 3 secondary colors gets roughly 1/6th of the range, which is our goal for a perceptually even space. Grading in such a space will behave much more uniformly. (Notice that in both cases the actual gradient is interpolated in linear Rec 709; only the color steps are defined from HSL or Yuv.)

To close on the largest-space sausage party, remember that the vast majority of reflective colors (meaning material colored surfaces reflecting light produced by something else) lie well within sRGB. The very high saturation colors are usually colored light sources, that is, directly emitted light going through some filter, or even lasers. That's why HDR comes with a large gamut: while SDR aimed at representing reflective materials (the bare minimum to do photography), HDR now tries to represent light sources too.

Gamut clipping is (kind of) a myth

Any properly managed application converts from color space to color space through a Color Management System (CMS), which performs gamut mapping using various strategies: the intents. When converting to a smaller RGB space, you have to make concessions on color accuracy to fit in the destination space. The least bad concession among the computationally reasonable methods, aka the perceptual intent, is to decrease the chroma at the same hue and luminance, meaning we compress all colors toward the white point. This preserves the smoothness of gradients and avoids creating solid color blocks where progressive transitions are expected, because those would look fake and wouldn't match the rest of the image.
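For instance, here is a hedged sketch of delegating such a conversion to Little CMS through Pillow's ImageCms bindings (the file name and source profile path are placeholders):

```python
from PIL import Image, ImageCms

im = Image.open("photo.tif")  # placeholder: an RGB image in the source space

src = ImageCms.getOpenProfile("ProPhoto.icm")  # placeholder profile path
dst = ImageCms.createProfile("sRGB")

# let the CMS gamut-map: perceptual intent + black point compensation
out = ImageCms.profileToProfile(
    im, src, dst,
    renderingIntent=ImageCms.INTENT_PERCEPTUAL,
    flags=ImageCms.FLAGS["BLACKPOINTCOMPENSATION"],
)
out.save("photo_srgb.tif")
```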

Since a gamut is actually a volume (the graphs above are projections onto a plane), here is what it looks like in 3D:

[image: the gamut volume in 3D]

So what we do when gamut-mapping is push colors inside the double cone, toward the vertical grey axis, while staying on a horizontal plane (same luminance) and a vertical plane (same hue). The way the gamut compression in darktable's color calibration does it, all colors get pushed by a step, but the step grows as colors get further away from achromatic. That lets us preserve the saturation gradients while barely affecting the low-chroma colors, which were valid in the first place.
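This is not darktable's exact formula (read the color calibration source for that); here is a numpy sketch of the idea, where the compression step grows with the distance from achromatic so the gradient ordering is preserved:

```python
import numpy as np

def compress_chroma(chroma, limit, threshold=0.8):
    """Push chroma inside `limit` at constant hue and luminance.
    Colors below threshold*limit are left alone; beyond it, the step grows
    with the distance from achromatic, so far-out colors are pushed harder
    while the ordering of the gradient is preserved."""
    knee = threshold * limit
    over = np.maximum(chroma - knee, 0.0)
    # smooth asymptotic squeeze of everything above the knee into [knee, limit]
    compressed = knee + (limit - knee) * np.tanh(over / (limit - knee))
    return np.where(chroma <= knee, chroma, compressed)

print(compress_chroma(np.array([0.1, 0.5, 0.9, 1.5, 3.0]), limit=1.0))
```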

But… most of you are using sRGB-able screens, and few have Adobe RGB-able ones. In any case, both are much smaller than the camera space. Which means whatever you see on your screen is already the product of some gamut shrinking. If that looks good, there is no reason to get alarmed about gamut at all, since everything seems to hold up nicely. So the gamut-clipping anxiety is mostly a geeky concern from people who understand some of the problem but don't really know about the solutions already built in to solve it.

2 kinds of color spaces

We have the color spaces tied to a medium, which represent the range of colors physically rendered by that medium. Those spaces are necessarily bounded: the white luminance is defined by the medium itself (backlighting power for a screen, whiteness and gloss for paper), so no RGB value can exceed the display peak luminance.

But the black luminance is also defined by the medium (residual brightness of the LED panel, or ink density for paper), and that luminance is never zero. So RGB values should not go below that black threshold either. This is handled by black point compensation in some properly managed apps, but it is less well known and inconsistently implemented.

Any RGB value below the medium black luminance, without black point compensation, will end up part of a solid black blob on the print. Notice that black point compensation raises the whole luminance range uniformly, so people who complain about darker-than-screen prints should blame the lack of black point compensation in their printer driver first.
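In essence (a sketch of the principle with made-up luminances, not any particular CMS implementation), black point compensation linearly remaps the luminance range onto what the medium can actually render:

```python
import numpy as np

def black_point_compensation(Y, medium_black, medium_white):
    """Rescale relative luminance Y in [0, 1] so that 0 lands on the medium's
    darkest reproducible luminance instead of an unprintable 0."""
    return medium_black + Y * (medium_white - medium_black)

# made-up print medium: 0.5 cd/m² black, 100 cd/m² white under viewing light
Y = np.linspace(0.0, 1.0, 5)
print(black_point_compensation(Y, medium_black=0.5, medium_white=100.0))
```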

Then we have the reference color spaces: Rec 2020, Adobe RGB, ProPhoto RGB. These are not tied to a particular medium and can be used unbounded. They are only data encodings. Black will usually be encoded at 0, but be careful: that holds no information on the original luminance of "what we call black" in the original scene. It is to be taken as "the darkest value recorded", to which the retoucher will arbitrarily assign a luminance (as in filmic's "black relative exposure") until it looks good®.

Your usual raw processing pipeline goes from camera space (medium-tied, super large) to working space (reference, large but smaller than the camera's) to output/export space (reference or medium; as large as needed for reference, but usually super small for media). Every conversion needs to gamut-map properly, but the assumptions and methods differ a bit depending on whether we convert a final product to a medium space or an intermediate working material to a reference space.

Bottom line: gamut clipping exists as a thing, but it will be avoided 90% of the time if you are using a serious application, through gamut mapping. Which means, as a user, you don't need to have nightmares about it. The only problem that may arise is that colors far out of gamut will be handled in sub-optimal ways by the gamut mapping, because they push it too far. But such cases should pop up in your face at editing time.

What does "out of gamut" mean anyway?

From a medium space perspective, where white and black are bounded, out of gamut can mean:

  • brighter than the medium peak white (display peak emission)
  • darker than the medium black (maximum ink density, residual panel glow)
  • too high a chroma at the current luminance (the luminance is valid, but the color is too far from achromatic).

The problem is, most gamut alerts don't distinguish between the 3, so you don't know whether you need to fix the exposure, the black level, the chroma/saturation, or any combination of the 3.

From a reference space perspective, out of gamut only means too high chroma, since we don’t have bounds.
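To see why that matters in practice, here is a toy classifier (hypothetical names, and a crude double-cone gamut model standing in for a real medium gamut) that tells the 3 medium-space failure modes apart:

```python
def diagnose_out_of_gamut(luminance, chroma, max_chroma_at, peak_white, deepest_black):
    """Classify a pixel against a bounded medium gamut.
    `max_chroma_at` is a hypothetical callable giving the gamut's chroma
    limit at a given luminance (hue ignored here for brevity)."""
    if luminance > peak_white:
        return "too bright: fix exposure"
    if luminance < deepest_black:
        return "too dark: fix the black level"
    if chroma > max_chroma_at(luminance):
        return "too saturated: fix chroma/saturation"
    return "in gamut"

# crude double cone: chroma limit peaks at mid luminance, zero at black/white
limit = lambda Y: 1.0 - abs(2.0 * Y - 1.0)

print(diagnose_out_of_gamut(0.5, 1.2, limit, peak_white=1.0, deepest_black=0.02))
# -> "too saturated: fix chroma/saturation"
```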

Why gamut-map early?

This only cements what we just laid out: we want to ensure our working color space contains only visible colors. No UV, no IR, no imaginary colors. Because if you manually push saturation/chroma on imaginary colors, you will create problems for later, at export time. Also, we don't know what saturated UV light looks like in real life compared to desaturated UV, but we know what it looks like in pictures: flat blue blobs at display peak blue emission. Or worse: blue gradients that drift toward cyan through indigo.

But… we don't need to gamut-map at the beginning of the pipe using the export color space as our gamut volume reference. That would be way too conservative; we would have to sacrifice too many valid colors to salvage 100% of the image in sRGB. Which brings us to the next point…

Don't turn into a gamut-alert freak

Having 2% of your picture out of gamut (whatever gamut you compute against) is no concern at all. Neither is having 10% of your picture out of gamut, as long as your sRGB control monitor shows decent gradients. (I'm just throwing made-up percentages here, please don't take them literally.)

Having more than 25% of your picture out of gamut, some colors really far away from achromatic (very high chroma), or color banding/posterization or flat color blobs where there should be gradients: that is a problem.

But the truth is, people have a hard time assessing the qualia of an image and will not spot the visual issues in the frame. So your average social-media sunset degrades from amber-orange to rat-piss yellow and nobody takes offense:

https://www.instagram.com/p/CJWG56VLyBi/ (Just a random example picked from the latest results of a #sunset query on Instagram)

Meanwhile, people crank up the gamut compression like crazy, staring at gamut alerts that don't say whether it's luminance or chroma clipping, to fix issues that don't exist, since the export CMS should nicely take care of 90% of them.

Also, gamut alerts don’t show gamut clipping. They show out-of-gamut pixels, which may or may not get clipped at export, depending on how clever your app is at managing color.

In what space to show histograms?

Any. It doesn't matter. What we look for, when looking at histograms, is the spread. If you really want to see middle grey in the middle of the graph, then choose a space whose OETF is roughly a 1/2.4 power, like sRGB. Although the sRGB OETF has a linear slope in the lowlights and a power above, so it kind of messes up the spread.
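To see where middle grey lands, apply the standard sRGB encoding to 18% scene grey:

```python
def srgb_oetf(x):
    """Standard sRGB encoding: linear slope in the lowlights, ~1/2.4 power above."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

print(srgb_oetf(0.18))  # ≈ 0.46: middle grey sits near the middle of the histogram
```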

But the truth is, if you wonder in what space to show a histogram, you most likely don't know how to read it, so don't bother at all: hide it and look at the picture instead.

Especially on open-source forums, where the crowd tends to be geeky by nature, people overthink scopes and numbers so much that they end up retouching numbers instead of photos.

Conclusion

Stay analytical, assess what you see and criticize it. But don't overthink gamut issues, or make them up based on theoretical concerns you half-understand.

Ensure your apps are properly color-managed, meaning they do gamut mapping (perceptual or relative colorimetric intent) and black point compensation when converting to other spaces (BTW, darktable doesn't do black point compensation).

Check your apps' assumptions: some drivers have built-in LUTs for the perceptual intent that expect input images saved as sRGB or Adobe RGB, so exporting directly from your raw editor to printer space may result in undefined states (please harass your printer/driver vendor to provide that kind of information, and tell them that black boxes are not cool until they get it).


Thanks for this clear explanation. I’m learning A LOT every time you post something.

So true. We've all been in this position.


First off, thanks for that post! Much needed and very very helpful.

Two immediate things:

  1. Should one explain what an opponent color space/representation is, and how RGB primaries form the triangle in the CIE horseshoe while chroma can technically lie outside of that triangle (whether this makes sense for display purposes or not would be gamut clipping or gamut remapping)? I have the feeling you left that out for clarity.
  2. this

Probably needs more discussion. Maybe not here though?! Because it

a) touches a spectral region which is still iffy for many color models to do certain transforms in. This probably affects gamut compression/remapping 'performance' or 'quality'. And

b) it critically depends on what you call UV (380 nm is well on the CIE horseshoe, looks purple in real life as the horseshoe suggests, and should definitely not be leaking out of a blue LED. Blue LEDs at the moment have at worst a 405 nm center wavelength, not lower; usually they are very, very bright narrowband 450 nm light sources. 380-450 nm is exactly the spectral region which is… uhm… interestingly handled in many color models). UV sources below 380 nm tend to look very faint and pale blue to the eye, almost white, and you should never ever look into one with your naked eye! I would be a bit surprised if 380 nm LEDs were being sold at the moment… but their color should be represented as a purple hue and not nuclear blue.
On CCD or CMOS sensors, if the lens glass is transmissive for such UV wavelengths below 380 nm (not every glass variant and not every anti-reflex coating is), it can produce all sorts of contamination colors, not necessarily just blue. Just like IR contamination will sometimes 'only' produce weird color shifts.

Also, I had to laugh at rat-piss-yellow! :grinning_face_with_smiling_eyes:

Thanks again Aurelien!
Cheers

For all we care, chroma is the distance between a color and the achromatic color ("white") of the same luminance. The max chroma a color space can handle depends on the hue, the luminance, and the color space primaries. Bringing in the opponent representation is a bit off-topic for something I wrote mostly to reassure users rather than as a class on color management for devs.

Don't forget that sensor RGB is decoded to XYZ using an input profile: a 3×3 matrix that is a least-squares fit over low-saturation samples. As you can see on the graph above for the Nikon D810, the fitted coefficients send values into the UV zone, whether the sensor recorded them there or not. Also, sensor metamerism doesn't match human metamerism (the spectrum → tristimulus projection is not a bijection in either case, so 2 spectra can produce the same tristimulus on some sensor and not on others). So, in any case, shit's gonna happen in the blue-indigo region and, given the circumstances, the highest priority is to get smooth gradients that do not cross several hues rather than aiming at accurate color matching.
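For reference, that fit is a plain least-squares problem. A toy numpy sketch with synthetic data (real profiling uses measured chart patches; the mixing matrix here is invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "chart": 24 low-saturation patches with known XYZ (made-up data)
reference_xyz = rng.uniform(0.1, 0.9, size=(24, 3))

# pretend the sensor sees them through some unknown 3x3 mixing + noise
true_mix = np.array([[0.90, 0.20, 0.10],
                     [0.10, 1.10, 0.05],
                     [0.05, 0.10, 1.20]])
camera_rgb = reference_xyz @ np.linalg.inv(true_mix).T + rng.normal(0, 0.01, (24, 3))

# least-squares fit of the 3x3 input matrix: camera_rgb @ fit ≈ reference_xyz
fit, *_ = np.linalg.lstsq(camera_rgb, reference_xyz, rcond=None)
print(fit.T)  # recovers something close to true_mix
```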

Given the first quote, what is the reasoning for darktable not applying black point compensation?

Also, while the gamut compression in Color Calibration is brilliant for smooth gradients in the master/working profile, are there any plans to add gamut compression for the output profile?

Amazing work, once again.

Hello Aurélien,

Thanks for the detailed article. Since I was the guy who triggered it (at least in part, and I feel addressed by some of the remarks), please let me say that I was never a 'gamut-alert freak': I just watched your video on the colour calibration module, where you mentioned it and used it, and I noticed that the settings were not the default ones; that was why I asked.

In what space to show histograms?

Any. It doesn’t matter. What we look for, when looking at histograms, is the spread. […]
But the truth is, if you wonder in what space to show a histogram, you most likely don't know how to read it, so don't bother at all: hide it and look at the picture instead.

I dispute the 'any' argument, since the screenshots I posted using your suggested spaces (PQ Rec2020, HLG Rec2020 and sRGB) are dramatically different.

Or maybe I don’t understand what you mean by ‘spread’.

And a personal note: I admit that you and I are not playing in the same league when it comes to colour science and image processing, but please understand that phrasing things like ‘if you wonder in what space to show an histogram, you most likely don’t know how to read it, so don’t bother at all’ sounds condescending and offensive to me. :frowning:


Hi @anon41087856, thanks for the explanation, I appreciate it!
When I find RGB values below zero, does that mean that the initial conversion from camera space to working space failed?
How would you handle negative RGB values in darktable?

The black point is read from the dark pixels, a special area of the sensor that doesn’t catch light, and extracted by Rawspeed. You will find it in the raw black/white point module. Sometimes, it’s too aggressive.

You can fix that either in the raw black/white point module or in the exposure module, with the black level. Basically, you add some constant until your whole range gets positive.
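In numpy terms, that fix is simply (a sketch):

```python
import numpy as np

rgb = np.array([[-0.02, 0.10, 0.30],
                [ 0.50, 0.70, 0.90]])

# raise the black level by a constant until the whole range is positive
offset = max(0.0, -float(rgb.min()))
print((rgb + offset).min())  # 0.0
```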

However, the medium black threshold I'm talking about is not zero, but much higher. So even with no negative RGB in the pipeline, if you don't have black point compensation on, you will still get black clipping on the print.

No reasoning, it's just not implemented. There is a comment in the code saying it should maybe be exposed in the user interface. Also, it's enabled without telling you if you enable soft-proofing.

There is some gamut "safety jacket" in filmic, but then exports are handled through Little CMS2, which does the perceptual thing to gamut-map. So, no manual compression.

Well, sorry, but at some point you need to start at the beginning… What is a histogram for? What are you trying to see in there? Once these questions are answered, the choice of space is pretty much settled. The point is not to be condescending, but to remind users to disregard scopes they don't understand, because they don't need them. Retouch the picture; then, if you see something wrong in the picture, scopes can help you troubleshoot the issue and spot the origin of the misbehaviour. But as long as the picture looks good, don't make up problems.

The only use for a histogram is to see how the image fits in the available range, between medium black and medium white. So you only care about how much space you get on the left and on the right, if there is any space at all, or about a peak at min/max intensities that would suggest clipping.

Any space will do that. Now, sRGB, because of the gamma/power 2.4, will put middle grey near the middle of the histogram, if that's what you want, but it will not have a uniform spread because it actually has 2 gammas (one for midtones + highlights, one for dark tones). The PQ curve from Rec 2020 PQ dilates shadows a lot and compresses highlights, so it gives you better legibility near black and worse near white. But it's still the same span, only zoomed differently. The HLG curve is more balanced.

Whatever space you choose doesn't change the meaning of the histogram span: you have some range between white and black, and you want to see how the picture fits in that range, period. The shape of the histogram will change with the OETF of the space you choose, but the shape holds no meaning at all, so it does not matter.


It’s a long one, sorry.

It wasn't what you said, it was how you said it. But I understand that you and the other knowledgeable people on the forum get lots of questions; some repeated, and surely most way below the level that's interesting to you. Also, I sometimes fail to remain patient when people ask me stupid questions or make stupid mistakes, so I can understand your side of the situation, too.

Anyway, I am grateful for your work: both the development and the education (I have also taught software engineers, and know how much time and energy go into even the simplest demos, let alone video tutorials).
I think I do understand what a histogram is for; but, like some others, for a long time I simply edited JPGs in the Gimp and similar tools, where I never had to think about colour spaces. Even in RawTherapee (which was my first raw developer, back in the days of Gábor Horváth, who started the project), you could not (at that time; not sure about now) switch the colour space for histograms. In fact, it is a relatively new feature in darktable, too.

Exactly because normally I'm not a 'gamut freak' (and I only use sRGB and the display for output, never print), I never really checked the options for soft proofing and gamut check (my general attitude being 'use the defaults set up by the experts unless you know what they do and know why and how to change them'). Then came your video about the new colour calibration module ('shiny new stuff' - and yes, I did use it on some Christmas lights, so I appreciate it a lot!). When you fought the red carpet (https://youtu.be/U4CEN0JPcoM?t=2081), I did not see artefacts in the carpet's surface (maybe it's me, maybe YouTube, maybe my monitor), but you did turn on the gamut check and switched the soft-proof profile from the default sRGB (you explained that from https://youtu.be/U4CEN0JPcoM?t=2135 up to about 36:50). I have now re-watched that and re-read Color calibration test and some thoughts - #14 by aurelienpierre ('color profiles and chromatic adaptation may push colors out of visible spectrum') and Color calibration test and some thoughts - #19 by aurelienpierre ('we handle gamut mapping in color calibration only to cleanup after the chromatic adaptation because we know it will push colors out of the visible gamut'), where you answer the question perfectly. So that part is now clear.

I'm still puzzled about the validity (or maybe applicability/relevance to my use case) of the PQ Rec2020 histogram, even with the comment 'The PQ curve from Rec 2020 PQ dilates shadows a lot and compresses highlights, so it gives you better legibility near black', as it seems to indicate the photo does not make use of the darkest tones, while the HLG histogram actually indicates clipping in the darks (there's a minor spike at the left end). Do the two spaces have different black points, or do I misunderstand something? (I'll just continue to use the default sRGB, I think.)

Thanks,
Kofa


Don't use the PQ histogram nor the HLG one, as they appear to be broken.
Look at the PQ histogram: it shows that your image has a lot of data at 10000 nits, and I don't think that is the case.

Same issue for HLG. However, HLG has a pure gamma (2.2 or 2.4) from 0 to middle grey for legacy displays, and a logarithmic shape in the upper part, so I don't really see why it should be better than sRGB.

I checked the dictionaries but keep stumbling over the meaning of "primaries"…

…and then I'm all lost… Maybe I shouldn't read during a sleepless night?
Thanks for any help

Here you can find a very good description:
https://ninedegreesbelow.com/photography/all-the-colors.html

See Chapter B2 “Chromaticity coordinates”

Thanks @wiegemalt, like I said, I shouldn’t do “insomnia-reading” on the mobile :slight_smile:

It's also that I've been pinged all over the place since early December. I try to answer everyone quickly, but that often involves reading diagonally between cooking supper, coding math and spending a bit of time with the wife, so I don't overthink stuff before writing it. Plus, like many here, English is my third language, so when I'm tired it becomes French translated 1:1, which sounds a lot harsher than what English natives are used to, since we don't do euphemisms.

Bottom line, I apologize, I wasn't targeting you personally. My point was that the histogram, like any other scope, is meant to make your life easier, so if it doesn't, just disregard it. As I said somewhere, scopes are useful to diagnose where issues come from; as long as the picture looks good, don't bother.

The "darkest tones" are contextual. The PQ curve really dilates the lowlights a lot. First, you don't need true blacks (aka RGB = 0); second, it's all about axis scaling.

sRGB is a safe choice indeed. Black points don’t exist in reference spaces, since they are not tied to a medium, so it’s only a matter of encoding range. And, given that many printing drivers don’t do black point compensation, not having RGB = 0 can be a safety jacket.

And you know that because…?

HDR formats usually encode RGB over 10 bits, that is, 1024 code values. So 1023 encodes 1000 nits. But, for obvious compatibility reasons, all HDR formats encode 100 nits (aka standard SDR white) at 255, which is the max in 8 bits. darktable's histogram shows the 0-255 range in 8 bits, no matter the color space. So the 100% mark of the histogram is still SDR white, thank you very much.


Thank you @anon41087856, this is an excellent post.

I just wish that this (and other educational) post(s) could be preserved as blog-posts — maybe on darktable.org, here on pixls.us or maybe your personal blog.

Over the past year or so I’ve read several of your educational posts and found them very insightful but I’m wondering how many I have missed and how I will find them again in the future.

Maintaining a pinned post “Aurelien’s wisdom” with links to posts like this would work as well.

Thank you for your great work!
Oliver


This forum, darktable.org, and the official documentation are all open. You're free to curate these links, write a blog post on dt.org, or provide content to dtdocs.


With both PQ and HLG, 100 cd/m² (nits) is encoded at code value 512, not 255. This is very unintuitive, but even the SDR range of whichever HDR10 implementation gets more bit depth… I suspect the much larger BT.2020 gamut 'requires' this for there to be no visible banding. Even sRGB primaries can show banding in 8 bits sometimes… it makes sense, but it is very counter-intuitive.
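For the PQ half of that claim, a quick check against the published SMPTE ST 2084 constants (a self-contained sketch; full-range 10-bit encoding assumed) puts 100 cd/m² just above mid-range:

```python
# SMPTE ST 2084 (PQ) inverse EOTF; input in cd/m² (nits), peak at 10000
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

print(pq_encode(100.0))                # ≈ 0.508 of the signal range
print(round(pq_encode(100.0) * 1023))  # ≈ 520 in full-range 10 bits
```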

I know you’re active on many fronts – researching, coding and educating. I sometimes wonder if you sleep at all. Kudos, Aurélien, and thanks for all that you do for the community!


The easy fix for the PQ histogram (and color space too) is to divide by 100 in linear RGB before applying the PQ equation.

As a simple hack, setting the exposure tool to -6.643 EV makes the histogram correct.
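That exposure value is just -log2(100), so dividing by 100 and applying -6.643 EV are the same operation:

```python
import math

print(2.0 ** -6.643)    # ≈ 0.01, i.e. divide by 100
print(-math.log2(100))  # ≈ -6.644 EV
```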