What does linear RGB mean?

Possibly. Small N, e.g. 2 or 3, would improve the situation. For accuracy we would need larger N, but high-order polynomials tend to be badly behaved.

I think the obvious method is a 3D CLUT, what ImageMagick calls a hald-clut. Probably 32x32x32 would be large enough. And perhaps they could be compressed with the G’MIC method.
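For illustration, here is a minimal sketch (mine, not code from ImageMagick or G’MIC) of how such a 3D CLUT could be applied with trilinear interpolation; the identity CLUT at the end is just a placeholder standing in for a real camera transform:

```python
import numpy as np

def apply_clut(rgb, clut):
    """Look up an RGB triple (values in [0, 1]) in a 3D CLUT of shape
    (N, N, N, 3) using trilinear interpolation."""
    n = clut.shape[0]
    # Continuous index into the lattice, clamped to the valid range.
    idx = np.clip(np.asarray(rgb, dtype=float) * (n - 1), 0, n - 1)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = idx - lo  # fractional position inside the cell

    out = np.zeros(3)
    # Accumulate the 8 corners of the surrounding cell, weighted.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                corner = clut[lo[0] + dr * (hi[0] - lo[0]),
                              lo[1] + dg * (hi[1] - lo[1]),
                              lo[2] + db * (hi[2] - lo[2])]
                out += w * corner
    return out

# Identity CLUT for illustration; a real one would encode the camera transform.
grid = np.linspace(0.0, 1.0, 32)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_clut = np.stack([r, g, b], axis=-1)
print(apply_clut([0.2, 0.5, 0.8], identity_clut))  # ≈ [0.2, 0.5, 0.8]
```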

As far as I know, sensor response to light intensity is pretty linear (so long as you avoid saturation/clipping).
I think what @anon41087856 is referring to here is the fact that the RGB filters used in most cameras do not conform to the Luther-Ives (or Maxwell-Ives, if you prefer) criterion: the RGB filters are not a linear combination of the CIE colour matching functions, so you cannot use a 3×3 matrix to convert exactly from sensor RGB to XYZ.
It is possible to convert to XYZ more accurately if you know the spectral response of the RGB filters and the spectra of the scene. The problem, of course, is that the scene spectra are usually unknown, so I’m not sure how useful a spectral solution would be in practice.
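To make the “cannot use a 3×3 matrix” point concrete, here is a hedged sketch of what profiling typically does instead: fit the best 3×3 matrix by least squares over a set of measured patches and accept the residual error. The arrays below are placeholders, not real measurements:

```python
import numpy as np

# Hypothetical training data: each row is one measured patch.
# camera_rgb: raw camera responses, xyz_ref: reference XYZ for the same patches.
camera_rgb = np.random.rand(24, 3)   # stand-in for real measurements
xyz_ref = np.random.rand(24, 3)      # stand-in for reference values

# Least-squares fit of a 3x3 matrix M such that camera_rgb @ M ≈ xyz_ref.
# If the CFA satisfied the Luther-Ives condition this would be exact;
# in practice the residual is what the matrix profile cannot correct.
M, residuals, rank, _ = np.linalg.lstsq(camera_rgb, xyz_ref, rcond=None)
print("fitted matrix:\n", M)
print("residual error:", residuals)
```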

1 Like

Full answer here:

To sum up: individually, each sensor R/G/B channel is affine (so, roughly linear) in the amount of photons captured in some part of the light spectrum. But, overall, the spectrum splitting done by the colour filter array’s R/G/B is not uniform, so the spectral sensitivities overlap a lot more than in human vision (for example, the “green” channel is sensitive to almost the whole 400-800 nm range).

The proper way to profile sensors would be to map each sensor RGB vector to a light spectrum (using LUTs or digital reconstruction), then simulate a digital retina to map that spectrum to XYZ space, from which all standard RGB spaces derive. But for this, we need databases of the spectral sensitivity of each CFA/camera.
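Roughly, the “digital retina” step could look like the sketch below (my illustration, not any existing tool); the colour-matching-function values are placeholders standing in for the published CIE 1931 tables:

```python
import numpy as np

# Hypothetical example: a spectrum sampled every 10 nm from 400 to 700 nm.
wavelengths = np.arange(400, 701, 10)
dlam = 10.0
spectrum = np.ones(len(wavelengths))   # stand-in for a reconstructed scene spectrum
# cmf would hold the CIE 1931 x̄, ȳ, z̄ colour-matching functions sampled at the
# same wavelengths; random values here, the real tables are published by the CIE.
cmf = np.random.rand(len(wavelengths), 3)

# "Digital retina": integrate the spectrum against the colour-matching functions.
XYZ = (spectrum[:, None] * cmf).sum(axis=0) * dlam
print(XYZ)
```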

This also makes white balance adaptation super easy and way more accurate.

Exactly.

1 Like

Thanks Aurélien, that is an interesting article. It answered my question about shutter speed, but for the rest it seems maybe I was asking the wrong question :slight_smile: I’ll take some time to digest and read up some more.

What I “cannot”, or maybe more appropriately “should not”, do isn’t helpful; I’m more interested in “how bad is it?” and “what would one do alternatively?”, questions whose answers I can consider more constructively…

On the other side of the color conversion pipe are the output profiles, associated with particular devices. How bad is that correlation? We need to remember it’s about characterizing cameras in ways that provide reasonable transforms to output spaces.

@paulmiller, I’m not chewing on you; you just had a quotable quote… :grin:

For RGB spaces that are defined from XYZ primaries (hence device-independent), everything is fine by design. That is sRGB, Adobe RGB, ProPhoto RGB, etc. The issue arises when you want to map a device-dependent RGB to XYZ, and gets worse when gamut shrinking occurs.

Going to and from device-dependent spaces and device-independent spaces (mostly XYZ) using 3×3 matrices is sort of OK as long as your pictures are lit with a D50 or D65 illuminant (aka “white” is white) and the colours are only mildly saturated.

Things go bad with blue LED lighting (aka “white” is blue), red sunsets (aka “white” is red), etc., because you violate the premises of the whole ICC pipeline (“white” is white). Fixing these usually takes a large amount of fiddling, and the only clean way to do it would be to use a spectral connection space, with the medium’s spectral sensitivity as the conversion.
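For reference, the matrix-only step that rests on this assumption looks like the sketch below; the XYZ→sRGB matrix is the standard IEC 61966-2-1 one, and the D65-adapted input is exactly the “white is white” assumption being discussed:

```python
import numpy as np

# Standard XYZ (D65) -> linear sRGB matrix (IEC 61966-2-1).
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def xyz_to_linear_srgb(xyz):
    """Matrix-only step of the pipeline. Valid only if the image has already
    been adapted so that its white point is D65 ("white" is white); under a
    strongly coloured illuminant this assumption is violated."""
    return XYZ_TO_SRGB @ np.asarray(xyz, dtype=float)

print(xyz_to_linear_srgb([0.9505, 1.0000, 1.0890]))  # D65 white -> ~[1, 1, 1]
```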

1 Like

For my D7000, I found a spectral dataset someone presumably collected from measurements with the appropriate test fixture. dcamprof has a workflow to use these to make a camera profile, but that will still use an XYZ PCS, no? Alternatively, what would be the math?

http://rawtherapee.com/mirror/dcamprof/dcamprof.html#workflow_ssf

Edit: Spectral source:

They have data for 11 cameras…

Spectral profiling is definitely not something to be left to users. At the very least, you need either a radiometer or a standard D-illuminant bulb to serve as a standard against which you compute the transmittance of the CFA. That’s a job for a proper lab with proper equipment and a proper team.

1 Like

Definitely. But it’s a problem of a similar order to that of producing target shots for ICC/DCP profiles; hard to do, but done once.

By the way, I checked a bit more about dcamprof and the spectral calibration seems completely bogus. They use absolute spectral reflective references (http://www.babelcolor.com/index_htm_files/CC_Avg30_spectrum_CGATS.txt) for the color checker which don’t mention the illuminant. Reflective references are not absolute, they only reflect a part of an emitted light spectrum, so they fully depend on what light emission they get in the first place (reflection = emission − absorption). Basically, not accounting for the illuminant means the reference is garbage. Yet another glorious piece of open source.

The actual (not sarcastic) glory here is that you can actually read it in the code and know what is happening in the software… Whether the software is doing the correct thing or not is another issue completely and has nothing to do with the software being open or not. :wink:

8 Likes

I actually didn’t even open the code to find that. And I’m certainly not willing to. Also, having access to the source doesn’t automatically imply it’s readable.

I spent the evening using dcamprof to make an SSF camera profile for my Nikon D7000. JSON… what fun. Anyway, here’s an image with extreme blues, where the only color transform is SSF-camera → sRGB:

Same image, same processing; the only difference is that the camera profile used the D65 camconst.json primaries from RawTherapee:

Edit: The SSF profile has, instead of the 3x3 matrix, an AtoB0 LUT…

1 Like

Reflectance is a fraction (not a difference): the radiant flux reflected divided by the radiant flux received by that surface. It is, in this case, a spectrally resolved, dimensionless number. I don’t see a problem here (unless “reflective reference” means something other than reflectance). You want that because it IS independent of the illuminating light source. (If there is no light emission in a certain spectral range, that range is undefined, but assuming some emission at every wavelength, you can calculate that fraction.)

EDIT:
BabelColor: under their patch-tool description they specifically say “Supported device-independent data types: Reflectance Spectrum, …”
I think those guys know what they are doing!

3 Likes

Normal practice when dealing with reflectance data is to multiply the reflectance values by the illuminant to recover the reflected spectrum (reflectance = reflected light / incident light). It’s not garbage, it’s standard CIE practice. This of course assumes that the reflections are Lambertian and there is no fluorescence.
If you had read the code, you would see that this is what dcamprof does with the target data.
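A rough sketch of that standard practice (my own illustration, not dcamprof code), with placeholder spectra standing in for the real BabelColor reflectances, illuminant SPD and CIE colour-matching functions:

```python
import numpy as np

# Placeholder spectral data sampled every 10 nm (real data: BabelColor table,
# CIE D50/D65 SPD tables, CIE 1931 colour-matching functions).
wavelengths = np.arange(380, 731, 10)
reflectance = np.full(len(wavelengths), 0.5)   # dimensionless, per wavelength
illuminant = np.ones(len(wavelengths))         # relative SPD of the chosen illuminant
cmf = np.random.rand(len(wavelengths), 3)      # x̄, ȳ, z̄ placeholders

# Standard practice: reflected spectrum = reflectance * illuminant SPD.
reflected = reflectance * illuminant

# Then integrate against the CMFs, normalised so a perfect reflector has Y = 100.
k = 100.0 / (illuminant @ cmf[:, 1])
XYZ = k * (reflected @ cmf)
print(XYZ)
```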

2 Likes

Compute your reflective metric the way you want (a difference of energy or a dimensionless ratio); it’s still only a metric that represents the same thing: we only reflect a part of what was emitted (conservation of energy).

As a matter of fact, the reflectance ratio depends on the wavelength of the incident light (like refraction, diffraction, and pretty much everything wave-related), so discarding the illuminant is yet another case of “fair enough”, provided your illuminant is not too far from D50 or D65.

Standard CIE practice or not, it doesn’t solve our problem: it’s still a gross approximation under some hypotheses, which your software cannot verify. We are back to step 1: ask the user to check for himself whether his picture was shot in standard lighting conditions and whether the colour pipeline hypotheses are respected, but the Holy Church of the Intuitive UX doesn’t allow us to do that.

No, discarding the illuminant is not “fair enough”. The illuminant is ignored, and this is deliberate, because it is not relevant. For example, sample “1” reflects 0.05475, about one-twentieth, of any light at 380 nm it receives. If the illuminant contains no light at that wavelength, then none will be reflected.

EDIT to explicitly deny “provided your illuminant is not too far from D50 or D65”: the reflectance ratio of a surface is a constant, irrespective of the illuminant.

3 Likes

Energy is conserved! The sum of the energy transmitted, absorbed and reflected is constant, at every wavelength by the way. Nobody except you claimed otherwise.

How is knowing the whole reflectance spectrum(!) an approximation? From it you can deduce, for every light source, not just standard illuminants, how much of that light reaches your CFA, again with full spectral resolution! What more do you want to know from a colour patch?

No. Even NON-standard illumination can be dealt with, because you can calculate it for every wavelength.

What are CIE standard practices for, then?

Except your chart is lit by a spectrum, not by a laser, and what the camera records is an absolute quantity, not a ratio. So when you profile the CFA, the measurement you get is, say, 0.85, and knowing it’s 0.05475 times some emission times the sensor sensitivity leaves two unknowns in your equation.
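In symbols (my notation, not anything from the thread), the point being made is roughly that, for each channel $c$, the camera records

$$v_c = \int E(\lambda)\, R(\lambda)\, S_c(\lambda)\, \mathrm{d}\lambda,$$

where $R(\lambda)$ is the tabulated reflectance, while both $E(\lambda)$ (the illuminant) and $S_c(\lambda)$ (the CFA sensitivity being profiled) are unknown.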

I didn’t claim otherwise. I said reflection = emission − absorption, to which you answered that reflection is reflectance, hence a ratio, to which I answered: make it a difference or a ratio, it’s still only a metric massaged in a way that suits your use.

Because when you shoot your color target with your camera, you don’t know which spectrum is lighting it, unless you have a D65 bulb on your shelf. If you don’t know what emission you have, then you don’t know what reflection you have, and any attempt to measure the spectral transmittance of your CFA is doomed.

Indeed. It’s not my problem. When you calibrate something, you need to know against which standard. Again, if that software were shipped with a D65 bulb, I would have no issue. As long as that’s not the case, you calibrate your camera against a surprise standard. Again, your sensor records a quantity, not a ratio, so you have to know what emission your target’s reflectance ratios are multiplied by to produce that quantity.

For people working under ISO 12647 or ISO 3664 conditions from one end of the graphic pipeline to the other. Which is not many people, probably just the guys digitizing paintings for top museums.

It could be lit by a laser, an LED, or moonlight. The illumination makes no difference to the spectral reflectance, which is what the table at http://www.babelcolor.com/index_htm_files/CC_Avg30_spectrum_CGATS.txt shows.

Yes, what the camera records is a function of the illumination spectrum, the surface reflectance spectrum and the CFA transmittance spectrum. That table addresses only one of those factors.
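As a rough illustration (placeholder spectra, not real data), what the sensor integrates for one patch is the product of all three factors, of which the target table supplies only the reflectance:

```python
import numpy as np

# Placeholder spectra sampled every 10 nm; only the reflectance column of a
# target table is known, the other two factors are not in that table.
wavelengths = np.arange(380, 731, 10)
illuminant = np.ones(len(wavelengths))        # E(λ): illumination SPD (unknown in the field)
reflectance = np.full(len(wavelengths), 0.5)  # R(λ): patch reflectance (the tabulated data)
cfa = np.random.rand(len(wavelengths), 3)     # S(λ): CFA transmittance per channel (what profiling seeks)

# What the sensor records for this patch, one value per raw channel.
raw_rgb = (illuminant * reflectance) @ cfa
print(raw_rgb)
```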

3 Likes