What does linear RGB mean?

Spectral profiling is definitely not something to be left to users. At the very least, you need either a radiometer or a standard D-illuminant bulb to serve as a reference against which you compute the transmittance of the CFA. That’s a job for a proper lab with proper equipment and a proper team.

1 Like

Definitely. But it’s a problem of a similar order to that of producing target shots for ICC/DCP profiles; hard to do, but done once.

By the way, I checked a bit more about dcamprof, and the spectral calibration seems completely bogus. They use absolute spectral reflective references (http://www.babelcolor.com/index_htm_files/CC_Avg30_spectrum_CGATS.txt) for the color checker which don’t mention the illuminant. Reflective references are not absolute: they only reflect a part of an emitted light spectrum, so they fully depend on what light emission they get in the first place (reflection = emission - absorption). Basically, not accounting for the illuminant means the reference is garbage. Yet another glorious piece of open source.

The actual (not sarcastic) glory here is that you can actually read it in the code and know what is happening in the software… Whether the software is doing the correct thing or not is another issue completely and has nothing to do with the software being open or not. :wink:

8 Likes

I actually didn’t even open the code to find that. And I’m certainly not willing to. Also, having access to the source doesn’t automatically imply it’s readable.

I spent the evening using dcamprof to make an SSF camera profile for my Nikon D7000. JSON… what fun. Anyway, here’s an image with extreme blues; the only color transform is SSF-camera → sRGB:

Same image, same processing; the only difference is that the camera profile used the D65 camconst.json primaries from RawTherapee:

Edit: The SSF profile has, instead of the 3x3 matrix, an AtoB0 LUT…

1 Like

Reflectance is a fraction (not a difference): the radiant flux reflected divided by the radiant flux received by that surface. It is in this case a spectrally resolved dimensionless number. I don’t see a problem here (unless “reflective reference” means something other than reflectance). You want that because it IS independent of the illuminating light source. (If there is no light emission in a certain spectral range, that range is undefined, but assuming some emission at every wavelength, you can calculate that fraction.)

EDIT:
BabelColor
Under their patch tool description they specifically say ‘Supported device-independent data types: Reflectance Spectrum,…’
I think those guys know what they are doing!

3 Likes

Normal practice when dealing with reflectance data is to multiply the reflectance values by the illuminant to recover the reflected spectrum (reflectance = reflected light / incident light). It’s not garbage, it’s standard CIE practice. This of course assumes that reflections are Lambertian and there is no fluorescence.
If you had read the code, you would see that this is what dcamprof does with the target data.
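
The principle, as a minimal numpy sketch (mine, not dcamprof’s actual code, with placeholder illuminant and camera curves):

```python
import numpy as np

# Wavelength grid shared by all spectral data, e.g. 380..730 nm in 10 nm steps.
wavelengths = np.arange(380, 740, 10)

# reflectance: dimensionless 0..1 values per wavelength, e.g. from the BabelColor table.
# illuminant:  spectral power distribution of whatever light falls on the target.
# ssf:         camera spectral sensitivity functions, one row per R/G/B channel.
def simulate_patch(reflectance, illuminant, ssf):
    # Standard practice: reflected spectrum = illuminant * reflectance ...
    reflected = illuminant * reflectance
    # ... then integrate against each channel's sensitivity to get raw RGB.
    return ssf @ reflected  # shape (3,)

# Toy numbers: equal-energy illuminant, flat 18 % grey patch, flat sensitivities.
illuminant = np.ones(wavelengths.size)
grey_patch = np.full(wavelengths.size, 0.18)
fake_ssf = np.ones((3, wavelengths.size))
print(simulate_patch(grey_patch, illuminant, fake_ssf))
```

The reflectance values in the table never change; only the illuminant and the camera curves vary per setup.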

2 Likes

Compute your reflective metric the way you want (difference of energy or dimensionless ratio); it’s still only a metric that represents the same thing: we only reflect a part of what was emitted (conservation of energy).

As a matter of fact, the reflectance ratio depends on the incident light wavelength (like refraction, diffraction, and pretty much everything wave-related), so discarding the illuminant is yet another case of “fair enough”, provided your illuminant is not too far from D50 or D65.

Standard CIE practice or not, it doesn’t solve our problem: it’s still a gross approximation under some hypotheses, which your software cannot verify. We are back to step 1: ask the user to check for themselves whether the picture was shot in standard lighting conditions and whether the colour pipeline hypotheses are respected, but we are not allowed to do that by the Holy Church of the Intuitive UX.

No, discarding the illuminant is not “fair enough”. The illuminant is ignored, and this is deliberate because it is not relevant. For example, sample “1” reflects 0.05475, about one-twentieth, of any light at 380 nm it receives. If the illuminant contains no light at that wavelength, then none will be reflected.

EDIT to explicitly deny “provided your illuminant is not too far from D50 or D65.” The reflectance ratio of a surface is a constant, irrespective of illuminant.
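
To put made-up numbers on it: if the light delivers 100 units at 380 nm, sample “1” reflects 5.475 of them; if it delivers 10 units, it reflects 0.5475. The ratio is the same either way:

$$
\frac{0.05475 \times 100}{100} = \frac{0.05475 \times 10}{10} = 0.05475
$$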

3 Likes

Energy is conserved! The sum of energy transmitted, absorbed and reflected is constant, for every wavelength by the way. Nobody except you claimed otherwise.

How is knowing the whole reflectance spectrum(!) an approximation? You can deduce from it, for every light source, not just standard illuminants, how much of that light is reaching your CFA, again with full spectral resolution! What more do you want to know from a color patch?

No. Even NON-standard illumination can be dealt with, because you can calculate it for every wavelength.

What are CIE standard practices for, then?

Except your chart is lit by a spectrum, not by a laser, and what the camera records is an absolute quantity, not a ratio. So when you profile the CFA, the measurement you get is, say, 0.85, and knowing it’s 0.05475 times some emission times some sensor sensitivity leaves 2 unknowns in your equation.
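
Written out (my notation, nothing from dcamprof), the raw value one channel records for one patch is roughly:

$$
m_c = \sum_{\lambda} E(\lambda)\, R(\lambda)\, S_c(\lambda)
$$

The table gives you R(λ); E(λ) and S_c(λ) are both still unknown.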

I didn’t claim otherwise. I said reflection = emission - absorption, to which you answered that reflection is reflectance, hence a ratio, to which I answered: make it a difference or a ratio, it’s still only a metric massaged in a way that suits your use.

Because when you shoot your color target with your camera, you don’t know which spectrum is lighting it, unless you have a D65 bulb on your shelf. If you don’t know what emission you have, then you don’t know what reflection you have, and any attempt to measure the spectral transmittance of your CFA is doomed.

Indeed. It’s not my problem. When you calibrate something, you need to know against which standard. Again, if that software is shipped with a D65 bulb, I have no issue. As long as that’s not the case, you calibrate your camera against a surprise standard. Again, your sensor records a quantity, not a ratio, so you have to know by what emission your target reflectance ratios are multiplied to produce that quantity.

For people working under ISO 12647 or ISO 3664 conditions from one end of the graphic pipeline to the other. Which is not many people, probably just the guys digitizing paintings for top museums.

It could be lit by a laser or LED or moonlight. The illumination makes no difference to the spectral reflectance, which is what the table at http://www.babelcolor.com/index_htm_files/CC_Avg30_spectrum_CGATS.txt shows.

Yes, what the camera records is a function of the illumination spectrum, surface reflectance spectrum and CFA transmittance spectrum. That table addresses only one of those factors.

3 Likes

I am sorry to point this out, but your claim was that BabelColor does something wrong. They don’t. You have been supplied with arguments for why they use a reflectance spectrum, which is an absolute reference, even though you claimed otherwise. And yes! You can know everything about the reflection and still not know anything about the light source. That is why it is a good idea to have reflectance as values between zero and one: you can shoot anything at it and you know how the reflectance will transform it. It behaves like a filter (I’d even say it is one). Of course you want to know what the filter does to which wavelength. This is the most versatile thing you can have for a set of color patches, in that dimensionless form. And yes, of course you want the same information about the CFA in the camera.

This!

That’s nice, but it is about spectral references for test targets. I am, at this point in the discussion, more interested in the camera SSF workflow. Your thoughts on how dcamprof handles that would be interesting to me.

  • The code is undocumented,
  • there is no paper or reference on the manual page,
  • dcamprof allows embedding a look curve (which really has no place in an input medium profile),
  • it only produces an RGB → XYZ LUT (or two if using dual illuminants),

so the whole spectral part looks anecdotal to me: it appears to be only a storage format for IT8 chart reflectances, later converted to XYZ once you input some illuminant (hoping that your light bulb is close to some reference illuminant). The only place where camera spectral sensitivities are mentioned says that you can use them to generate profiles, but you have to get them from somewhere else, and it will convert them into RGB → XYZ LUTs.
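
For reference, that “converted to XYZ once you input some illuminant” step is just the usual CIE integration. A rough sketch of the idea (my code, not dcamprof’s), assuming the colour-matching functions and the chosen illuminant are sampled on the same wavelength grid as the chart reflectances:

```python
import numpy as np

def reflectance_to_xyz(reflectance, illuminant, cmf):
    """Patch reflectance -> CIE XYZ under a chosen illuminant.

    reflectance: (N,) dimensionless 0..1 values on a wavelength grid
    illuminant:  (N,) spectral power distribution on the same grid
    cmf:         (3, N) CIE 1931 x-bar, y-bar, z-bar colour-matching functions
    """
    reflected = illuminant * reflectance   # what actually leaves the patch
    xyz = cmf @ reflected                  # integrate against the standard observer
    k = 1.0 / (cmf[1] @ illuminant)        # normalise so a perfect reflector has Y = 1
    return k * xyz
```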

The whole software is just a more complicated Argyll; there is no real spectral profiling happening in there.

If I had to code it myself, I would probably use this method: https://www.osapublishing.org/oe/abstract.cfm?uri=oe-27-14-19075

3 Likes

:thinking: PDF not loading for me. Will try again later…

1 Like

Loaded fine here

Must be my add-ons and / or internet connection.

1 Like

Doesn’t download for me either, but I’m on the crap computer in the basement. Will try on the better machine later…