Mathematically right values in decomposing to LAB?

gimp

(Marion Gaff) #21

The aim of my research is to understand the impact of maturity of orange on the essential oil extracted from orange peels. The sign of maturity we are using is the color of the peel; that’s why I would like to characterize precisely the color of orange peels. That would be the first study to link colorimetry with analysis of the components of orange essential oil.
And I want to characterize the color with Lab* values (and not RGB or another color space) because I would like to be able to compute a colorimetric distance between different colors of orange peels (delta E), which represents something to the human eye.
Plus, the study of the statistical distribution of the different pixels would help me to tell if the human eye is accurate enough to select and categorize the different peels.
So basically, I categorize peels according to the color I see. From one colour group, I take the quantity needed for one essential oil extraction, place the peels in the box, take a photograph with constant camera settings, convert the raw file to a 32-bit floating point TIFF in sRGB, and open it with software able to give me the right Lab* values for all the pixels of the photo (the background of the peels being a solid blue color, I can “remove” it).
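For reference, the simplest form of that colorimetric distance, the CIE76 delta E, is just the Euclidean distance between two Lab* triplets (newer formulas such as CIEDE2000 add weighting terms). A minimal sketch in Python, with made-up peel colours:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two L*a*b* triplets."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Two hypothetical orange-peel colours (L*, a*, b*)
peel_a = (65.0, 30.0, 60.0)
peel_b = (62.0, 35.0, 55.0)
print(delta_e_76(peel_a, peel_b))
```

A rule of thumb often quoted is that a delta E around 1 is near the threshold of what a trained eye can distinguish, though that depends on the region of colour space.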


(Elle Stone) #22

I wrote a set of three articles on making a custom camera input profile. The links are here: https://ninedegreesbelow.com/photography/articles.html#profile-digital-camera

For a general purpose camera input profile which is the most usual type of profile to make, I recommend making a simple linear gamma matrix input profile using ArgyllCMS “-am” parameters. But for your particular use case - constant controlled lighting - I’m guessing a “look up table” (LUT) profile ("-ax" or “-al”) will be a much better choice.

This article (one of the three articles mentioned above) explains the various types of camera input profiles: https://ninedegreesbelow.com/photography/camera-profiles-applied.html - the article also talks about using xicclu to check your profiles, and warns against “over-fitting/curve-fitting” by asking for a very small delta.

The problem with LUT profiles is that they do require a lot of sample points to make a super-accurate profile. But again, in your case I think a LUT will work better. So a target chart with more color patches will be better than a target chart with fewer color patches. This article goes over some of the commercially available charts: https://ninedegreesbelow.com/photography/camera-profile-make-target-shot.html

The last of the three articles talks about how to make the actual camera input profile once you’ve made the target chart shot: https://ninedegreesbelow.com/photography/well-behaved-camera-profile.html

All three articles have lots of links to the pertinent sections of the ArgyllCMS documentation. It’s a lot of information to absorb, but many people on this mailing list have experience making camera input profiles and so can help you out if you get stuck. In addition, the author of ArgyllCMS @gwgill participates in the pixls.us forum, as well as having the ArgyllCMS mailing list and searchable archives which are a treasure trove of information (http://argyllcms.com/mailinglist.html).


(Elle Stone) #23

I’m still a little worried about your lighting setup. It’s probably fine. But if the light is not full spectrum, if it’s “low CRI” lighting such as CRI 85 LED or fluorescent lighting, then metamerism becomes a serious problem.

If you know the light bulb type, you could look it up and see whether it’s good lighting or not - I think there are some people on this forum who have a good handle on the whole CRI issue.

Hopefully the lighting is evenly diffuse, otherwise you’ll have to deal with glare off the target and the oranges.

Also, depending on age/time in use, light bulb performance does change over time. So if you will be taking a series of photographs over longish time frames, you might need to make new profiles periodically.

I have no practical experience with this sort of photography and lighting, so hopefully someone else can chime in with practical pointers. Well, I shouldn’t say “no experience”. I can confirm that taking photographs under a low-CRI LED bulb is not fun. I had one in my little upstairs “studio” and it was awful. Now I’m using plain old ordinary tungsten lighting along with a big window. So lighting in my “studio” isn’t constant (whereas your situation of course requires constant lighting), but it’s full spectrum and that counts for a lot.


#24

1. Colour is an important topic in your field. There should be many papers on this subject, though I don’t know how encumbered they are by access and licensing restrictions.

Since your colour range is limited to the peels and somewhat known:
2. I suggest you use colour reference cards in the scene. Just because we live in the digital age doesn’t mean we can’t use older time-tested tools to help with your profiling and analysis. Also, I wonder whether there are tools in your chem lab that can measure colour, or whether you could order one (depending on your funding and connections).

3. Is L*a*b* the best space for the job? The thing about models is that they have limitations. They are not ideal for all uses. Now, I don’t know whether L*a*b* can allow you to calculate accurate ΔEs for the spectrum where orange peels lie. That is up to you to find out.


(Glenn Butcher) #25

Are you looking for specific colors at an instant, or the change in color over time?


(Marion Gaff) #26

I am not sure what you mean by “colour reference cards”, but I use 2 color patches that are originally intended for calibrating the portable BYK colorimeter that we own. I know the precise L*, a* and b* of those color patches.

The portable BYK colorimeter only measures very small areas. The idea of setting up a colorimeter based on image analysis was to be able to determine the dispersion of the L* a* b* values of the totality of the peels by taking just one photo.

CIE Lab* was created to be representative of human vision. So, the geometric distance between 2 (L*, a*, b*) coordinates (delta E) is supposed to reflect the difference in color that we perceive with our eyes (which is not the case for RGB).


(Marion Gaff) #27

I am looking at specific colors at an instant. We have (at the present time) different states of maturity.
A pool of oranges of a similar color -> taking a sample of this pool -> taking a photo to analyze the color -> extracting the essential oil. Then I can take another sample of the same pool to verify that I obtain the same results (for the color and for the essential oil extracted).
I do not analyze the change of color of oranges on trees. I need to extract the essential oil corresponding to the peel color (= each stage of maturity).


#28

I am thinking of colour references like the Munsell soil colour chart. You could find a series that covers the spectrum of colours that closely match the peels. They should have known values, provided they are relatively new and high quality. When you photograph the peels and go through the processing, they would tell you when shifts in colour occur, so that you know when a correction is necessary.

CIEL*a*b* isn’t perfect, even if you do the colour profiling, transformations, processing, corrections, etc., correctly. That is why RT uses custom LUTs to extend it where it is inaccurate. Depending on your scope, this may or may not be a problem. What I am hinting at here is that there are other colour models out there such as CIECAM02 and derivatives that look into colour in a more comprehensive manner.

Hope my armchair thoughts are helpful. Best of luck!


(Marion Gaff) #29

The professor who was setting up the “colorimeter” bought the Spyder Checkr 24. I don’t think my lab would allow me to buy another target chart with more color patches. Plus, the time that I have to solve everything is only 2 or 3 days! So, I guess I have no choice but to do with the Spyder Checkr 24.

The thing is that the Spyder Checkr 24 is supposed to be used with software such as Lightroom, Photoshop or Phocus :disappointed_relieved:

I found that link which gives explanations on how to use Spyder Checkr 24 with Raw Therapee.

Do you think I could manage to do all that in 2 or 3 days?

I do not know if it is relevant, or if it might help, but I already know the approximate (L*,a*,b*) domains in which the peels will be, and these domains are small. I mean: if the purple patch is not perfectly calibrated, I think the impact on my L*, a*, b* values will not be very strong.

Thanks a lot


(Graeme W. Gill) #30

So you’ll have to stick to a matrix profile.

Beware that normal cameras usually make poor colorimeters, because their spectral sensitivities aren’t much like those of the human eye.

You will get the best accuracy if your test chart is spectrally similar to the real world objects that you wish to photograph. Paint chips may or may not have a similar spectral characteristic to orange skin - you would need a spectrometer to figure that out.

In industrial and scientific situations, the correct path would be to use cameras specifically designed to capture colorimetric or spectral information. Such cameras are available, but are likely to be much more expensive than consumer or semi-pro photography gear.


(Alan Gibson) #31

Starting from a camera raw, and wanting a CIELab result, I see no reason to go via sRGB. The approach I would take is:

  1. Use dcraw to de-Bayer, and get an image encoded as XYZ:
%DCRAW% -v -6 -T -o 5 -W -O orange_xyz.tiff from_camera.CR2

I’ve not used the camera white balance (-w option) because, depending on the photography, this might be wrong and/or vary between photos.

I’ve included -W to not automatically brighten the image.

You probably also want “-o file” or “-p file” for the camera profile. I’m not sure which.

  2. With ImageMagick and @Elle’s ICC profiles (see https://ninedegreesbelow.com/photography/lcms-make-icc-profiles.html), convert to Lab, and from there to sRGB. The sRGB result is just so you can look at and admire your photos.
magick orange_xyz.tiff -profile Lab-D50-Identity-elle-V4.icc +write orange_lab.tiff -profile sRGB-elle-V4-srgbtrc.icc orange_srgb.tiff

[EDIT: I’d forgotten that dcraw assigns an XYZ profile, so there should be no need to assign her XYZ profile with an initial “-profile XYZ-D50-Identity-elle-V4.icc”.]

You can then use ImageMagick on the Lab image to do simple statistics (mean, standard deviation, etc) on all or part of the image, or spit out the pixel values in CSV or other format for external software to analyse.

Elle knows far more about getting accurate colours than I do.


(Alan Gibson) #32

As a side-note, I’ll suggest that a better approach might be to skip the de-Bayering.

Huh? Well, we don’t care about detail within the image. We care only about some kind of average colour. Assuming the camera is RGGB Bayer, we could make a TIFF with no de-Bayering (dcraw options “-o 0 -6 -r 1 1 1 1 -g 1 0 -D”), then make one output pixel from every four inputs, taking red, blue and average green values from the appropriate corners of the input square.

I’ve never tried this, and don’t know if this would give more or less accurate results.
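For what it’s worth, here is a sketch of that 2x2 binning step in Python, assuming the RGGB mosaic has already been loaded as a plain 2-D list of raw values (the file loading itself is left out):

```python
def bin_rggb(mosaic):
    """Collapse an RGGB Bayer mosaic into one (R, G, B) tuple per 2x2 block,
    averaging the two green sites. No demosaicing/interpolation is done."""
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for y in range(0, h - 1, 2):
        row = []
        for x in range(0, w - 1, 2):
            r = mosaic[y][x]                                  # top-left: red
            g = (mosaic[y][x + 1] + mosaic[y + 1][x]) / 2.0   # two greens
            b = mosaic[y + 1][x + 1]                          # bottom-right: blue
            row.append((r, g, b))
        out.append(row)
    return out
```

Note the values are still camera-native RGB at this point; they would still need the white balance and camera-to-XYZ matrix applied before any Lab conversion.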


(Tobias) #33

I would really not use a camera for colour measurement in scientific research. There are so many places where you can make mistakes (lighting, creating the ICC profile, problems with the software).
There are special devices that do nothing more than colour measurement, like this one (first DuckDuckGo hit):
https://www.hunterlab.de/produkte/stationaere-spektralphotometer/colorflex-ez.html (German)
https://www.hunterlab.com/colorflex-ez-spectrophotometer.html (English)
https://www.hunterlab.com/solid-fruit-color-measurement.html (English)

You can even rent such devices. But I would expect such devices in a university lab.


#34

My lighting system is composed of 2 Quicktronic Osram Dulux units (just like the one in the image) with 4 Osram Dulux L 24W / 12 tubes arranged in a square. In the square formed by the lights there is a 5 cm thick square of glass (maybe frosted glass) with a round hole in the center. The camera lens is placed on top of the glass, through the hole. (Not very easy to explain, but I hope you understand.)

From what I could find on the internet, according to Osram the Color Rendering Index (Ra) is above 80 or above 90 (depending on the precise type of the light). Unfortunately I don’t think I have the time to set up another lighting system.
However, I found a scientific paper by León et al. (2006) titled “Color measurement in L*a*b* units from RGB digital images”, in which the illumination was achieved with 4 Philips Natural Daylight 18 W fluorescent lights (60 cm length), with a color temperature of 6500 K and a color rendering index (Ra) close to 95.
Do you consider that a CRI of more than 90 is “low CRI”? Sorry for these “novice questions”.
Apparently, in this paper, they managed to get reasonably small errors compared to a commercial colorimeter (HunterLab).

Yes, we own a portable BYK colorimeter. As I said, this device can only give Lab* measurements for really small areas (not representative of the color of the whole bunch of orange peels, unless I do 100 measurements), and I thought it would be more interesting to get the Lab* values for all the pixels of the peels. But yes, setting up a colorimeter based on analysis of photos is much more complicated than I thought it would be!

The even diffusion of the light is the purpose of the 5cm thick glass I guess.

I will be taking photos (if hopefully I manage to solve everything out) during only one week.

I have no clue about Bayer and de-Bayer. But my camera is a Canon DS126621 with a Lens SIGMA 17-70mm F2.8-4 DC MACRO (Filter size: 72 mm)


(Elle Stone) #35

My knowledge of CRI is just based on reading stuff readily available on the internet, so that doesn’t make me an expert, just makes me a novice who did a lot of reading maybe ten years ago before purchasing two CRI 92 D65 fluorescents for my tabletop photo studio (since then I replaced those fixtures by more flexible light boxes and spot lights, but that’s irrelevant to the current discussion).

My “takeaway” point from all the reading was that CRI 92 was pretty much a minimum “good CRI” for photography, 95 is really good, and that anything under 90 was not good - the “goodness” scale - new technical term I just made up :slight_smile: - isn’t linear, declines rapidly as the CRI numbers go down. Perhaps @chris can comment on the question of CRI numbers (see Using Hald CLUT to modify CRI of light source?). Point of suspicion regarding these numbers: I find it somewhat suspiciously convenient that the highest CRI fluorescent then available locally (in a specialty lighting store, not in regular stores) had the magical 92 CRI. Maybe published “good enough” figures go up as the commercially available maximum CRI goes up. But from experience I’ll confirm that CRI 85 LED bulbs make awful lighting for taking photographs.

I’m guessing that research is like photography in the sense that if we all sat around waiting until we had the best possible equipment none of us would get anything done. I’m hoping that a good thing that results from this long thread is that you’ll have a working understanding of limitations on accuracy of the data you collect, which of course all data has limitations. The important thing - as I’m guessing you are already very well aware - is to try to figure out what those limitations on the data actually are, and try to provide checks and estimates of error where possible. You have access to a portable colorimeter, so that’s one available check.

Another check is to make a target shot of your Spyder Checkr 24 and go ahead and make a matrix profile. As @gwgill said, there just aren’t enough color patches on this target chart to make a LUT profile, and probably not even a shaper matrix profile, so just make an “-am” matrix profile. It would be interesting to compare the deltas from the specified LAB values in the SpyderCheckr reference file, for your custom camera input profile vs the standard default matrix supplied by dcraw, which is also available through the various free/libre raw processors.

Another check is to put a solid uniform gray piece of paper or better yet maybe a sheet of white or gray PVC plastic in the space where the product will be, and photograph this blank surface to get an idea of how uniform the lighting actually is and whether there are any hot spots from glare.

@afre raised an interesting question regarding the color of oranges vs LAB and vs taking photographs with a camera. I did some rudimentary checking using a photograph of an orange that I made earlier this year - oddly enough my goal was to get an idea of the LAB values of an orange - I was painting a picture of an orange and wanted an idea of “how orange is an orange”. I also checked the orange photograph’s LAB values against a photograph of an IT8 target chart - some cameras have trouble with high chroma yellow and orange has a lot of yellow in it, but I think actual oranges have a low enough chroma to not wander into the problem areas for camera matrix input profiles. I’ll post images and results in a bit.


(Elle Stone) #36

[quote=“Elle, post:35, topic:9281”]
@afre raised an interesting question regarding the color of oranges vs LAB and vs taking photographs with a camera. I did some rudimentary checking using a photograph of an orange that I made earlier this year . . . and also checked the orange photograph’s LAB values against a photograph of an IT8 target chart - some cameras have trouble with high chroma yellow and orange has a lot of yellow in it, but I think actual oranges have a low enough chroma to not wander into the problem areas for camera matrix input profiles. [/quote]

OK, here’s the orange. The odd white pipe is just some PVC pipe that I put up there to get a quick white balance. The sample points show LCh instead of LAB. LCh is just a simple polar transform of LAB, so if you already have LAB, it’s easy to calculate LCh and vice versa. The reason I show the LCh values is it’s just easier to visualize what’s going on. “h” (hue) is the angle measured counterclockwise from the positive a axis on the CIELAB color wheel. And “C” (chroma) is just the distance from the intersection of the a and b axes, where a=b=0. These values are perceptually uniform (to the degree that LAB itself is perceptually uniform).
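Since LCh is just that polar transform of LAB, the conversion is easy to sketch in Python (this is the standard formula, nothing specific to any particular software):

```python
import math

def lab_to_lch(L, a, b):
    """L*a*b* -> L*C*h: C* (chroma) is the distance from the neutral axis,
    h (hue) is the angle in degrees counterclockwise from the +a* axis."""
    C = math.hypot(a, b)
    h = math.degrees(math.atan2(b, a)) % 360.0
    return L, C, h

def lch_to_lab(L, C, h):
    """Inverse polar transform: L*C*h back to L*a*b*."""
    rad = math.radians(h)
    return L, C * math.cos(rad), C * math.sin(rad)
```

Because it is only a change of coordinates, no information is gained or lost going back and forth between the two.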

OK, what makes LCh easier to visualize for the current purpose is that simple linear gamma matrix camera input profiles have trouble with bright high saturation yellows, where saturation is defined as the ratio of Chroma to Lightness. Notice the little magenta triangles in the LCh values for sample points 4 and 5 in column 15 and sample point 6 in column 22. Those little triangles mean the color of these patches is “out of gamut” with respect to the sRGB ICC profile color space. This means one of the channels - the blue channel for these bright saturated yellow colors - has a channel value that’s less than 0, a “negative” channel value. The measured LAB/LCh values are still accurate. But such channel values are not good for general editing.

I’m guessing you won’t be photographing any bright yellow oranges because hopefully there isn’t such a thing. The orange colors seem safely within the color gamut a camera can handle.

Edit: Oh, I forgot, here’s the IT8 reference file so you can match up the (single-point) sample point values with the measured values. The values aren’t super close, partly because the chart was really old when I photographed it, but it helps give an idea of how closely the camera profile matches the photographed colors to the original colors on the chart: R080505.it8.zip (10.8 KB)

Edit 2: Lost the forest for the trees, sorry! The whole point of the IT8 chart and zip file, and the sample points in the column of yellow color patches, is that simple linear gamma matrix camera input profiles tend to get increasingly inaccurate - higher errors, greater differences between the nominal color of the patch and the color assigned by the input profile to the target chart shot - as you look at the higher saturation yellow colors. This is a separate issue from the fact that these color patches are out of gamut with respect to the sRGB color space (some of the other patches are also out of gamut wrt sRGB, but they don’t have high errors when evaluating the matrix input profile).

Sometimes people will even just edit the relevant text files when making a camera input profile, to just remove one or more of the brightest most saturated yellow patches, which does allow a better match to the remaining colors. I doubt there are colors on the 24-patch Spyder Checkr chart that can be removed - there just aren’t enough color patches. But if oranges were bright saturated yellow instead of orange, I suspect using a photograph to measure LAB values would produce very inaccurate results.

But there is a complication going on. GIMP is an ICC-profile color-managed editing application, which means all the colors are relative to the D50 white point. And I used the white PVC pipe to white balance the image to D50. But when I was matching the actual color seen on the screen visually to the color of the actual orange sitting next to the screen, illuminated by halogen track lighting, getting a visual match did require using colors of orange on the screen that are outside the sRGB color gamut.

I’m guessing that science and technology publications expect LAB/LCh values that are relative to D65. And I’m guessing that if there is a monitor in your workplace, and if it’s calibrated at all, it’s also calibrated to have the D65 white point (and hopefully it’s also profiled so you can actually see accurate colors, but that’s another issue outside the scope of this thread).

Also some of the software you are using is ICC profile color managed, and some is not. You can’t just measure LAB values relative to D50 (ICC profile color-managed colors) and expect they will match the same LAB values measured relative to D65. A spreadsheet can be used to chromatically adapt back and forth. But keeping the white point straight is a very important thing to do.
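For anyone setting up that spreadsheet, here is a minimal Bradford chromatic adaptation sketch in Python. The matrix and white point values are the commonly published ones (Bradford cone matrix, D50/D65 whites), so treat them as an assumption to double-check against your own references:

```python
def matvec(M, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Bradford cone-response matrix and its inverse (widely published values)
BRADFORD = [[ 0.8951,  0.2664, -0.1614],
            [-0.7502,  1.7135,  0.0367],
            [ 0.0389, -0.0685,  1.0296]]
BRADFORD_INV = [[ 0.9869929, -0.1470543, 0.1599627],
                [ 0.4323053,  0.5183603, 0.0492912],
                [-0.0085287,  0.0400428, 0.9684867]]

D50 = (0.96422, 1.00000, 0.82521)  # XYZ of D50 white, Y normalized to 1
D65 = (0.95047, 1.00000, 1.08883)  # XYZ of D65 white

def adapt(xyz, src_white=D50, dst_white=D65):
    """Bradford chromatic adaptation of an XYZ colour between white points:
    transform to cone space, rescale by the white-point cone ratios, transform back."""
    cone = matvec(BRADFORD, xyz)
    cone_s = matvec(BRADFORD, src_white)
    cone_d = matvec(BRADFORD, dst_white)
    scaled = [c * d / s for c, d, s in zip(cone, cone_d, cone_s)]
    return matvec(BRADFORD_INV, scaled)
```

Adapt the XYZ values first, then convert to LAB against the destination white; comparing D50-relative LAB directly against D65-relative LAB skips exactly this step and gives misleading deltas.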

I don’t ever deal with software that doesn’t use ICC profile color management. Maybe @gwgill , @troy_s , @snibgo , @KelSolaar might have advice on this issue.

Speaking of spreadsheets, setting up conversions from RGB to XYZ to LAB or from XYZ to LAB is straightforward. If you use imaging software such as imagemagick, imagej, etc it’s a good idea to check to see if the values that are produced match hand-calculated values.
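As an example of such a hand calculation, here is the standard CIE XYZ to L*a*b* formula sketched in Python (the D50 reference white here is an assumption; substitute whichever white point your pipeline actually uses):

```python
def xyz_to_lab(X, Y, Z, white=(0.96422, 1.00000, 0.82521)):
    """CIE XYZ -> L*a*b*, relative to the given reference white (default D50)."""
    def f(t):
        # Cube root above the CIE threshold, linear segment below it
        eps = (6 / 29) ** 3
        return t ** (1 / 3) if t > eps else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(c / w) for c, w in zip((X, Y, Z), white))
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b = 200 * (fy - fz)
    return L, a, b
```

Feeding the reference white itself through the formula should return L* = 100 with a* = b* = 0, which is a quick sanity check for any software you are evaluating.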

ArgyllCMS xicclu is 100% reliable for values relative to D50 (ArgyllCMS is an ICC profile application).

@KelSolaar 's “colour” https://github.com/KelSolaar can return values for arbitrary white points including D65. Example “colour” commands for color space conversions are sprinkled through this long and very interesting thread - well, I learned a lot and thought it was very interesting, hopefully other people did too :slight_smile: :


#37

Disclaimer: I didn’t follow the whole thread.

Hm, I lack much practical experience regarding photography with LED/fluorescent lights here, but I perfectly agree with what you said. Furthermore, from artificial light source to printed image there are lots of colour issues to be considered. Most cameras are made to give pleasing results and not colorimetrically correct numbers, and the colour response (transfer functions) of the Bayer pattern and the spectral response of the light source may or may not fit well together. Decomposition of a light spectrum into 3 components based on what we think about how the human eye works will always lead to an incomplete representation. IMHO, the science that goes into that part (Bayer filter transfer function and what it means for the system) leads to the differences in colour rendering of the different camera brands many people swear by.

For me, this means that in practice, for photography, the proof is in the pudding and extensive testing is the route to go. For colour measurements, a camera may be the wrong tool. And information that is lost at the illumination or at the Bayer pattern cannot be reconstructed; it can only be estimated based on assumptions about human vision, but the application is then limited to human vision as well. Or painted in based on assumptions about the scene.


(Elle Stone) #38

That’s a really nice link - thanks! And thank you @snibgo for the examples on this page: http://im.snibgo.com/ - that “Linear camera raw” link seems very interesting. If one’s camera deviates too far from linear response, making and using a simple linear gamma matrix input camera profile isn’t such a good idea. I confirmed the linearity of my old Canon 400D a long time ago, but never did bother to check my Sony A7.


(Elle Stone) #39

The Spyder Checkr 24 is listed as a supported target chart for use with ArgyllCMS: http://argyllcms.com/doc/Scenarios.html#PS2

The software mentioned in the “SpyderCheckr possible ?” thread uses ArgyllCMS “under the hood” so to speak. So whatever the manufacturer of the target chart intended, it can be used without having access to Lightroom/PhotoShop/Phocus.

As long as you photograph everything you need - the oranges, the target chart, hopefully also a uniform white or gray surface for checking uniformity of lighting (anything else belong here?) - then the actual making of the camera input profile can be deferred; it isn’t necessary until you actually start processing the raw files.

The biggest thing that will make the target chart shot not as good as it otherwise would be is glare on the chart, which hopefully there won’t be. You can tilt the chart if need be - ArgyllCMS can compensate for a tilted chart. The next biggest would be light fall-off towards the edges of the frame - keep the chart and hopefully the product centered in the middle of the frame and only fill about half the frame to minimize light fall-off, and include a shot of a plain surface to get an idea of how even the overall lighting is.

Also people tend to underexpose their target chart shot. To avoid this “underexposed target shot” issue, a lot of people will set up the camera and target chart, and then shoot a series of exposures, each time raising the exposure until they are 100% sure that portions of the target are blown, and then make a few more even longer exposures just to be sure - the little image shown on the LCD is almost never a good guide to optimal exposure.


#40

I vaguely recall Lab being Illuminant C, but don’t quote me.

The discussion of capturing the scene colours is sort of a misnomer. For a single source, fully diffuse subject, the albedo peak reflectance is captured at 1.0. This of course rarely if ever happens in the capture of a typical photograph.

Given that a scene colour is defined by the ratios of light, it is fundamentally impossible to display them on a screen given the above facet as we can’t replicate the ratios. It would be of course possible to take the scene referred ratios to a JzAzBz encoding model and attempt to compress the dynamic range and then communicate the colour back into the ratios, but that too likely ends up aesthetically “odd” relative to historical precedent.

TL;DR: It’s impossible to replicate the scene’s colorimetry in reduced dynamic range scenarios without trade offs.