Mathematically right values in decomposing to LAB?

[quote=“Elle, post:35, topic:9281”]
@afre raised an interesting question regarding the color of oranges vs LAB and vs taking photographs with a camera. I did some rudimentary checking using a photograph of an orange that I made earlier this year . . . and also checked the orange photograph’s LAB values against a photograph of an IT8 target chart - some cameras have trouble with high chroma yellow and orange has a lot of yellow in it, but I think actual oranges have a low enough chroma to not wander into the problem areas for camera matrix input profiles. [/quote]

OK, here’s the orange. The odd white pipe is just some PVC pipe that I put up there to get a quick white balance. The sample points show LCh instead of LAB. LCh is just a simple polar transform of LAB, so if you already have LAB it’s easy to calculate LCh, and vice versa. The reason I show the LCh values is that they make it easier to visualize what’s going on: “h” (hue) is the angle measured counterclockwise from the positive a axis of the CIELAB color wheel, and “C” (chroma) is the distance from the point where the a and b axes intersect, that is, where a=b=0. These values are perceptually uniform (to the degree that LAB itself is perceptually uniform).
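In case it helps anyone who wants to do that polar transform by hand, here’s a minimal Python sketch (the function names are just mine; the math is the standard LAB/LCh relationship):

```python
# Minimal LAB <-> LCh polar transform, plain Python.
import math

def lab_to_lch(L, a, b):
    """C is the distance from a=b=0, h the counterclockwise angle from the +a axis."""
    C = math.hypot(a, b)                       # chroma
    h = math.degrees(math.atan2(b, a)) % 360   # hue angle in degrees, 0-360
    return L, C, h

def lch_to_lab(L, C, h):
    """The inverse transform, back to rectangular a/b."""
    a = C * math.cos(math.radians(h))
    b = C * math.sin(math.radians(h))
    return L, a, b

print(lab_to_lch(60.0, 40.0, 50.0))   # an orange-ish LAB value -> L, C, h
```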

OK, what makes LCh easier to visualize for the current purpose is that simple linear gamma matrix camera input profiles have trouble with bright, highly saturated yellows, where saturation is the ratio of chroma to lightness. Notice the little magenta triangles in the LCh values for sample points 4 and 5 in column 15 and sample point 6 in column 22. Those little triangles mean the color of these patches is “out of gamut” with respect to the sRGB ICC profile color space: one of the channels - the blue channel, for these bright saturated yellow colors - has a value that’s less than 0, a “negative” channel value. The measured LAB/LCh values are still accurate, but such channel values are not good for general editing.
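If anyone wants to reproduce that “negative blue channel” check outside an editor, here’s a rough Python sketch. To keep it short it treats the Lab numbers as D65-relative (GIMP’s readouts are D50-relative, so strictly a chromatic adaptation step belongs in the middle - that step is sketched further down), and the Lab value is a made-up bright saturated yellow, not one of my actual patch readings:

```python
# Lab -> XYZ -> linear sRGB, then look for negative channel values.
import numpy as np

WHITE_D65 = np.array([0.95047, 1.00000, 1.08883])   # D65 reference white (XYZ)
M_XYZ_TO_SRGB = np.array([                           # standard XYZ(D65) -> linear sRGB
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
])

def lab_to_xyz(L, a, b, white=WHITE_D65):
    """Standard CIELAB -> XYZ inversion."""
    eps, kappa = 216 / 24389, 24389 / 27
    fy = (L + 16) / 116
    fx, fz = fy + a / 500, fy - b / 200

    def f_inv(t):
        return t ** 3 if t ** 3 > eps else (116 * t - 16) / kappa

    yr = fy ** 3 if L > kappa * eps else L / kappa
    return np.array([f_inv(fx), yr, f_inv(fz)]) * white

lab = (80.0, 10.0, 95.0)                      # a made-up bright, saturated yellow
rgb_linear = M_XYZ_TO_SRGB @ lab_to_xyz(*lab)
print(rgb_linear)                             # the blue channel comes out negative
print("out of sRGB gamut:", bool(((rgb_linear < 0) | (rgb_linear > 1)).any()))
```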

I’m guessing you won’t be photographing any bright yellow oranges because hopefully there isn’t such a thing. The orange colors seem safely within the color gamut a camera can handle.

Edit: Oh, I forgot - here’s the IT8 reference file so you can match up the (single-point) sample point values with the measured values. The values aren’t super close, partly because the chart was really old when I photographed it, but it gives an idea of how closely the camera profile matches the photographed colors to the original colors on the chart: R080505.it8.zip (10.8 KB)

Edit 2: Lost the forest for the trees, sorry! The whole point of the IT8 chart and zip file, and of the sample points in the column of yellow color patches, is that simple linear gamma matrix camera input profiles tend to get increasingly inaccurate - higher errors, meaning greater differences between the nominal color of the patch and the color the input profile assigns to it in the target chart shot - as you move to the more saturated yellow colors. This is a separate issue from the fact that these color patches are out of gamut with respect to the sRGB color space (some of the other patches are also out of gamut wrt sRGB, but they don’t show high errors when evaluating the matrix input profile).
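To be concrete about what I mean by “errors”: for each patch you compare the nominal Lab from the reference file to the Lab the input profile assigns to that patch in the chart shot, e.g. as a plain CIE76 delta E. Here’s the idea in Python, with made-up numbers (not readings from my chart):

```python
# CIE76 delta E: Euclidean distance between two Lab values.
import numpy as np

def delta_e_76(lab1, lab2):
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

nominal  = (82.0, 4.0, 88.0)    # hypothetical reference value for a saturated yellow patch
assigned = (80.5, 7.5, 95.0)    # hypothetical value the matrix profile assigns in the shot
print(delta_e_76(nominal, assigned))   # ~8 - a big error for a camera input profile
```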

Sometimes people will even edit the relevant text files when making a camera input profile, to remove one or more of the brightest, most saturated yellow patches, which does allow a better match to the remaining colors. I doubt there are colors on the 24-patch Spyder Checkr chart that could be removed - there just aren’t enough patches. But if oranges were bright saturated yellow instead of orange, I suspect using a photograph to measure LAB values would produce very inaccurate results.

But there is a complication. GIMP is an ICC-profile color-managed editing application, which means all the colors are relative to the D50 white point, and I used the white PVC pipe to white balance the image to D50. But when I matched the color seen on the screen visually to the color of the actual orange sitting next to the screen, illuminated by halogen track lighting, getting a visual match did require using colors of orange on the screen that are outside the sRGB color gamut.

I’m guessing that science and technology publications expect LAB/LCh values that are relative to D65. And I’m guessing that if there is a monitor in your workplace, and if it’s calibrated at all, it’s also calibrated to have the D65 white point (and hopefully it’s also profiled so you can actually see accurate colors, but that’s another issue outside the scope of this thread).

Also, some of the software you are using is ICC-profile color managed, and some is not. You can’t measure LAB values relative to D50 (ICC-profile color-managed colors) and expect them to match the same colors’ LAB values measured relative to D65. A spreadsheet can be used to chromatically adapt back and forth, but keeping the white point straight is a very important thing to do.
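Here’s the arithmetic such a spreadsheet would do, sketched in Python - a plain Bradford adaptation of an XYZ value from D50 to D65. (The white point XYZ values are the usual 2-degree-observer numbers; the ICC’s D50 is rounded slightly differently. To move Lab values, convert Lab to XYZ using the source white, adapt, then convert back to Lab using the destination white.)

```python
# Bradford chromatic adaptation, D50 -> D65.
import numpy as np

M_BRADFORD = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])
WHITE_D50 = np.array([0.96422, 1.00000, 0.82521])
WHITE_D65 = np.array([0.95047, 1.00000, 1.08883])

def adapt_bradford(xyz, white_src, white_dst):
    """Scale the Bradford cone responses by the ratio of the two white points."""
    cone_src = M_BRADFORD @ white_src
    cone_dst = M_BRADFORD @ white_dst
    scale = np.diag(cone_dst / cone_src)
    return np.linalg.inv(M_BRADFORD) @ scale @ M_BRADFORD @ xyz

xyz_d50 = np.array([0.40, 0.35, 0.10])   # some D50-relative XYZ value
print(adapt_bradford(xyz_d50, WHITE_D50, WHITE_D65))
```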

I don’t ever deal with software that doesn’t use ICC profile color management. Maybe @gwgill, @anon11264400, @snibgo, or @KelSolaar have advice on this issue.

Speaking of spreadsheets, setting up conversions from RGB to XYZ to LAB (or just from XYZ to LAB) is straightforward. If you use imaging software such as ImageMagick, ImageJ, etc., it’s a good idea to check whether the values it produces match hand-calculated values.
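For example, here’s the hand calculation for encoded sRGB -> linear RGB -> XYZ -> Lab, all relative to D65, in Python. The input is just a made-up orange-ish sRGB value; run the same color through your software and compare its Lab readout (an ICC-managed tool reporting D50-relative Lab will give somewhat different a/b numbers):

```python
# Encoded sRGB -> linear RGB -> XYZ (D65) -> Lab (D65), by hand.
import numpy as np

M_SRGB_TO_XYZ = np.array([               # standard linear sRGB -> XYZ(D65)
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])
WHITE_D65 = np.array([0.95047, 1.00000, 1.08883])

def srgb_decode(c):
    """Undo the sRGB transfer curve ("gamma")."""
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def xyz_to_lab(xyz, white=WHITE_D65):
    """Standard XYZ -> CIELAB."""
    eps, kappa = 216 / 24389, 24389 / 27
    t = xyz / white
    f = np.where(t > eps, np.cbrt(t), (kappa * t + 16) / 116)
    return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

rgb = np.array([230, 140, 40]) / 255.0          # a made-up orange-ish sRGB color
xyz = M_SRGB_TO_XYZ @ srgb_decode(rgb)
print(xyz_to_lab(xyz))                          # compare to your software's Lab readout
```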

ArgyllCMS’s xicclu is 100% reliable for values relative to D50 (ArgyllCMS is an ICC-profile-based application).

@KelSolaar’s “colour” (KelSolaar (Thomas Mansencal) · GitHub) can return values for arbitrary white points, including D65. Example “colour” commands for color space conversions are sprinkled through this long and very interesting thread - well, I learned a lot and thought it was very interesting, and hopefully other people did too :slight_smile:
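Here’s a tiny example of what I mean (a sketch, not taken from that thread; I pass the white points as xy chromaticities because the illuminant dictionary is named colour.ILLUMINANTS in older releases and colour.CCS_ILLUMINANTS in newer ones, and this sidesteps that):

```python
# The same XYZ value expressed as Lab against two different white points,
# using the colour-science package.
import numpy as np
import colour

xyz = np.array([0.42, 0.36, 0.07])        # an orange-ish color, XYZ scaled to [0, 1]

xy_D50 = np.array([0.3457, 0.3585])
xy_D65 = np.array([0.3127, 0.3290])

print(colour.XYZ_to_Lab(xyz, illuminant=xy_D50))   # Lab relative to D50
print(colour.XYZ_to_Lab(xyz, illuminant=xy_D65))   # Lab relative to D65
print(colour.Lab_to_LCHab(colour.XYZ_to_Lab(xyz, illuminant=xy_D65)))  # and the LCh version

# Note: changing the illuminant here only changes the normalization white in the
# Lab formula; chromatically adapting the color itself is the separate Bradford
# step shown earlier in this post.
```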