More on LCH and JCH

It’s a little-known fact that one of my primary reasons for setting up pixls was also for me to learn from people far, far smarter than I am about exactly these kinds of topics. :wink:

I’m not even remotely qualified to speak with any authority on the subject, other than to say that I am ridiculously thankful for those who can grok these topics and make them available to us laypersons! Honestly, for the most part I’m probably nowhere close to pushing the boundaries of what sRGB provides (artistically, for results that make me happy). But I’m always lurking to learn!


What @Carmelo_DrRaw says is of course correct, except for the word “saturation”. “Saturation” in image editing has far too many definitions:

  • We all use the word “saturation”, loosely speaking, to refer to “more or less colorful colors”.

  • We also use it technically to refer to “Saturation” calculated using HSL, HSI, etc. Let’s put HSL/HSI/etc. to one side as not relevant to a discussion of LCH - as @Carmelo_DrRaw notes, these HSL/HSI/etc. saturation values change every time you change the RGB working space, and this includes changes to the color space TRC as well as changes to the RGB primaries.

  • The word “saturation” has a precise definition in color appearance models such as CIECAM02, and that definition is different from the definition of “chroma”. This page by Mark Fairchild presents some nice definitions of colorfulness, saturation, and chroma as used in color appearance models:

http://www.rit-mcsl.org/fairchild/WhyIsColor/Questions/4-8.html

Here’s the TOC for the whole series of “WhyIsColor”, which strives to hit the “explain it to me as if I were five” sweet spot, but I still find the explanations of chroma, colorfulness, and saturation difficult to visualize:

http://www.rit-mcsl.org/fairchild/WhyIsColor/map.html

LAB/LCH is not a color appearance model (it’s a “color difference” model), so “colorfulness” technically doesn’t apply to LCH. That doesn’t keep people from coming up with equations for colorfulness in LAB/LCH. This Wikipedia page gives a nice equation for calculating “colorfulness” - or maybe it’s really “saturation” - in the LAB/LCH color space. The page also has some very confusing (to me) sections on other topics.
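For the curious, the gist of that equation is that “saturation” in LAB/LCH relates chroma to lightness. Here’s a minimal Python sketch, assuming the expression given on the Wikipedia Colorfulness page (S_ab = C* / sqrt(C*² + L*²)) - an illustration only, not anyone’s reference implementation:

    import math

    def lab_saturation(L_star, C_star):
        # "Saturation" as chroma relative to lightness:
        #   S_ab = C* / sqrt(C*^2 + L*^2)
        # (the expression on the Wikipedia Colorfulness page).
        denom = math.hypot(C_star, L_star)
        return C_star / denom if denom else 0.0

    # Two colors with the same chroma but different lightness:
    print(lab_saturation(30.0, 60.0))  # darker color  -> ~0.89
    print(lab_saturation(80.0, 60.0))  # lighter color -> 0.60

Note the asymmetry: the chroma is identical for both colors, but the “saturation” number changes as soon as the lightness changes.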

In the LCH color space, LCH blend modes and color pickers allow you to keep Chroma constant while changing Lightness or Hue. But keeping the appearance of colorfulness/saturation constant is a different matter, requiring some workarounds. In GIMP, if you want to change the tonality of an image and also keep colorfulness constant (or is it saturation? the definitions are not easy for me to visualize!), use a Luminance blend to blend the new tonality with the original image colors.
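As a rough illustration of what an LCH Lightness blend does (a sketch of the general idea only, not GIMP’s actual code): the result takes its lightness from one layer and keeps chroma and hue from the other, so C stays fixed by construction, even though saturation/colorfulness can still shift because they also depend on lightness:

    def lch_lightness_blend(base_lch, top_lch):
        # Take L from the top layer, keep C and h from the base layer.
        L_base, C_base, h_base = base_lch
        L_top, _, _ = top_lch
        return (L_top, C_base, h_base)

    # Brighten a pixel's tonality while leaving its chroma and hue alone:
    print(lch_lightness_blend((40.0, 55.0, 290.0), (70.0, 10.0, 120.0)))
    # -> (70.0, 55.0, 290.0)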

OK, putting all this technical stuff to one side, what does any of this mean for actual image editing? Here’s an example using GIMP-2.9’s Lightness and Luminance blends to show the difference between LCH Chroma and colorfulness/saturation (apparently you have to click on the image to see all of it - the “4. Luminance blend” version is at the bottom):

What you can’t tell from the above image is that some of the colors in the umbrellas in images #1 and #4 (but not in #3) are out of gamut with respect to the sRGB color space. One could use a mask to blend in some of the result of the Lightness blend, to bring the Luminance blend colors all back into gamut. The good news is that GIMP 2.9 (default, not yet my patched version) now has a really nice clip warning for floating point images.
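The clip warning itself is conceptually simple for floating-point images; here’s a minimal numpy sketch of the idea (not GIMP’s implementation), flagging pixels with any channel outside [0, 1]:

    import numpy as np

    def clip_warning_mask(rgb):
        # rgb: floating-point image of shape (H, W, 3).
        # True where a pixel has at least one channel outside [0, 1],
        # i.e. a color that would clip in the target (here sRGB) gamut.
        return np.any((rgb < 0.0) | (rgb > 1.0), axis=-1)

    # One in-gamut pixel, one that picked up a negative channel:
    img = np.array([[[0.2, 0.4, 0.9], [1.3, -0.05, 0.5]]])
    print(clip_warning_mask(img))  # [[False  True]]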

If anyone wants to experiment with color appearance model terminology by changing the various parameters, the RawTherapee CIECAM02 module allows you to do just that, which might help quite a lot in acquiring a practical understanding of the difference between saturation, chroma, and colorfulness, and various other color appearance model terms. Plus the sliders and curves in the RT CIECAM02 module are just plain fun to experiment with.


Bookmarked :slight_smile:.

Yes, the wiki entries on color can be confusing due to wording, typos, and factual inaccuracies.

I think that answers my question. I do have both GIMPs installed. It is just that it is rather taxing on my system (and my stamina) to switch between applications repeatedly.

Argh! Sorry, I messed up! I forgot that

gmic h rgb2lab

    rgb2lab (+):
                        illuminant={ 0=D50 | 1=D65 } |
                        (no arg)

gmic 50,50,1,3 fill_color 0,0,255 rgb2lab 0 lab2lch s c k.. p

whereas

rgb2lch --> rgb2lab 1 lab2lch

∴ 0,0,255 → 133.80 vs 131.208, or, if you keep more figures, 131.20828247070312.
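For anyone following along without gmic on hand: the lab2lch step itself is just a rectangular-to-polar conversion of a* and b*; the illuminant choice happens earlier, in the RGB→Lab step, which is where the two chroma values diverge. A minimal Python equivalent (not G’MIC’s code):

    import math

    def lab_to_lch(L, a, b):
        # Chroma is the distance from the neutral axis,
        # hue is the angle in the a*-b* plane.
        C = math.hypot(a, b)
        h = math.degrees(math.atan2(b, a)) % 360.0
        return L, C, h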


Cool - I’m really impressed by people who can use gmic from the command line! And now that you’ve redone the calculations, the result matches GIMP/xicclu - yes?

Several years ago when I spent some quality time exploring the GIMP gmic filter options, the filter that I used to make a more or less finished image (with some subsequent tonal adjustments in GIMP) was an anisotropic noise reduction filter. For some reason I find anisotropic noise reduction very pleasing in the way it “brushes away” details. FWIW, here’s a copy of the image - the tonality looks darker than I intended when viewed against the white pixls.us background, but here it is anyway:

Essentially the same, except for the sixth decimal figure, which we don’t need to worry about.

The problem with CLI is that it is susceptible to typos and human oversight. In this case, I neglected to consider the fact that rgb2lch uses the D65 illuminant.

The advantage is that I have more control over what G’MIC does. I.e., in GIMP, many G’MIC filters do indeed assume sRGB (and some assume 256 levels), which may be the primary reason for your misgivings re G’MIC.

(However, it goes without saying that it is difficult for anyone to make something that anticipates all possible conditions, like GIMP evolving to support higher bit depths. Commercial software like Matlab does a better job because its users pay good money to depend on it.)

That is not to say that G’MIC does not have its quirks in the CLI as well, but this is true for all apps. No two apps output exactly the same image.

I can’t say I like the result there, but then I don’t have the “before” image to compare it with. I have seen good uses of anisotropic filtering; it is just that they often come with too many parameters for me to adjust. I tend to use the guided filter for its low parameter count, modernity, and cool concept. I still haven’t used it to its full potential. E.g., I still haven’t figured out how it can mask fine details like hair.

Well, to my mind how good or bad an image looked “before” is irrelevant when assessing the image. FWIW, I agree with your assessment - there are aspects of the image that I like, but not enough that I’d post it in one of my website galleries.

I’ve tried using anisotropic filters in a couple of other images, but never managed to make a finished image that I liked. I blame this on my skill level, not on the algorithm.

Subject #2 (S2)

Now that I am satisfied with S1, on to my next question. I have been wondering about CIECAM02. It appears to be much more complex than CIELAB and has many more parameters. I will start with a general question:

Do you apply CIECAM02 and derivatives like JCH in your workflow? If so, how do you make sense of all of the complexities that go with it? I have only briefly read about it but would like to hear the thoughts of more experienced devs and adventurers :slight_smile:.
