The Quest for Good Color - 1. Spectral Sensitivity Functions (SSFs) and Camera Profiles

Glenn, I stand corrected! Matrix profiles can cause clipping (negative values), as per the quote from Anders Torger that I copied above. Silly me :slight_smile: I do apologise for spreading false information.

Cameras don’t do gamut compression at all, since they even record “non-colors” from outside the visible spectrum. Yet these non-colors are encoded with valid RGB values (in [0; 1]). Then the white balance distorts the whole camera RGB range in a way that can produce even more non-colors.

The thing about gamut mapping at the camera-profile stage is that you need to map things at the beginning of the pipeline too, in order to be sure that any RGB value you get before pushing pixels lies in the visible spectrum. Otherwise, good luck with non-white illuminants, blue LEDs and such.
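For illustration, here is a minimal numpy sketch of that failure mode. The camera matrix and white-balance multipliers below are invented, not from any real camera; the point is only that a perfectly valid in-range camera RGB triplet, once white-balanced and pushed through an unconstrained 3×3 matrix profile, can land outside the destination gamut:

```python
import numpy as np

# Hypothetical camera RGB -> XYZ matrix, as a profiling fit might produce
# (values invented for illustration).
cam_to_xyz = np.array([
    [0.70, 0.20, 0.10],
    [0.25, 0.85, 0.05],
    [0.00, 0.05, 1.20],
])

# Standard XYZ (D65) -> linear sRGB matrix.
xyz_to_srgb = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

wb = np.array([2.0, 1.0, 1.6])          # plausible-looking WB multipliers (invented)
cam_rgb = np.array([0.02, 0.01, 0.85])  # deep blue pixel, perfectly valid in [0, 1]

srgb = xyz_to_srgb @ (cam_to_xyz @ (wb * cam_rgb))
print(srgb)  # ~[-0.41, 0.07, 1.72]: red clips negative, blue overshoots 1
```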

This:

in combination with this:

should give a good idea why SSFs can be quite useful at times, I guess. Mitigating IR contamination, for example, should probably be done at the camera-space to connection-space (mostly XYZ) stage. And I would be very surprised if a CC24 or IT8 target lets you derive near-IR responses for that. I think the Torger links provide some reflectance spectra of those targets…

edit: yes, it is in the section ‘choosing test targets’. All but two patches have a flat broadband response in the near-IR part of what is shown, which would make the IR-rejection calculation prone to large errors, I think.
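To make that concrete, here is a toy numpy sketch of the response integral, response ≈ Σ_λ illuminant(λ) · reflectance(λ) · SSF(λ), with entirely invented spectra. Two red-channel SSFs that agree in the visible but leak different amounts of near IR produce response differences that are (nearly) identical on every patch whose NIR reflectance is the same flat value, so such a target cannot tell them apart:

```python
import numpy as np

wl = np.arange(380.0, 1001.0, 10.0)  # 380..1000 nm in 10 nm steps
illum = np.ones_like(wl)             # flat toy illuminant

# Two hypothetical red-channel SSFs: same visible lobe, different NIR leak.
ssf_a = np.exp(-0.5 * ((wl - 600) / 40.0) ** 2)
ssf_b = ssf_a + 0.3 * np.exp(-0.5 * ((wl - 880) / 40.0) ** 2)

# Two CC24-like patches: different in the visible, identical flat 0.5 in the NIR.
patch1 = np.where(wl < 700, 0.2 + 0.6 * np.exp(-0.5 * ((wl - 550) / 50.0) ** 2), 0.5)
patch2 = np.where(wl < 700, 0.1 + 0.8 * np.exp(-0.5 * ((wl - 450) / 50.0) ** 2), 0.5)

for patch in (patch1, patch2):
    r_a = np.sum(illum * patch * ssf_a) * 10.0  # crude integration, 10 nm step
    r_b = np.sum(illum * patch * ssf_b) * 10.0
    print(r_b - r_a)  # (nearly) the same offset for both patches:
                      # the NIR leak is invisible to a fit on these patches
```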

The biggest issue is with UV and the whole blue-to-purple region; IR is less of an issue. See here, for my camera (“adapted” is after white balance; the dots are the respective white points):
[image from Troy Sobotka]

…what does it look like if you don’t do the “clever” white balancing and just use XYZ multipliers? Shouldn’t that scale the colours away from zero and leave the triangle of primaries as it is?

This is the stupid white balancing which divides xyY data by the illuminant xy.

EDIT: this was the CIECAM02 adaptation. Stupid XYZ scaling gives this:

[image]

See here the full comparison:

[image]
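For reference, the “stupid” XYZ scaling shown above amounts to something like this minimal sketch (standard D50/D65 white points; the test pixel is arbitrary):

```python
import numpy as np

xyz_src = np.array([0.96422, 1.00000, 0.82521])  # D50 white point
xyz_dst = np.array([0.95047, 1.00000, 1.08883])  # D65 white point

scale = xyz_dst / xyz_src  # per-channel gains: a diagonal matrix in XYZ

xyz_pixel = np.array([0.15, 0.08, 0.60])  # arbitrary bluish pixel
print(scale * xyz_pixel)   # every color is dragged along the XYZ axes
print(scale * xyz_src)     # only the white lands exactly on D65
```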

Wow. Okay, I’ll have to think a bit about why the adaptations push the non-real primaries further away from the locus (and from the camera primaries). I would naively have expected the opposite.

Why is Troy Sobotka not on this forum again?

Super interesting. The Blackmagic Pocket had a sensor without an IR filter stack, which caused quite a few problems (reddish blacks, color shifts). 390 nm and below is a soft cutoff for many glass types, so sensors typically don’t need additional UV filters. Still very interesting. Thanks.

The geometric interpretation of a white balance is a 3D shear mapping.

Long story.
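A quick way to see the claim: a white balance is diagonal (a pure axis-aligned scaling) only in camera RGB; expressed in XYZ it picks up off-diagonal terms, i.e., it skews the gamut. A sketch with an invented camera matrix:

```python
import numpy as np

# Invented camera RGB -> XYZ matrix, for illustration only.
cam_to_xyz = np.array([
    [0.70, 0.20, 0.10],
    [0.25, 0.85, 0.05],
    [0.00, 0.05, 1.20],
])

wb = np.diag([2.0, 1.0, 1.6])  # diagonal (axis-aligned) in camera RGB

# The same white balance expressed in XYZ coordinates:
wb_in_xyz = cam_to_xyz @ wb @ np.linalg.inv(cam_to_xyz)
print(np.round(wb_in_xyz, 3))  # off-diagonal terms appear: the mapping
                               # shears the volume instead of just scaling it
```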

Yup, got it! :+1:

edit: a whole slew of white-balance questions ensued in my head. But those are for other posts. :sweat_smile:

I demonstrated that to myself a while ago, and posted results here:

Based on a workflow described here:

Troy, for all his excellent knowledge, could not conduct himself like an adult. He found it to be too difficult to communicate in a respectful manner.

So not really a long story at all. He had many, many chances.

Civility is a requisite here.

He was nice one-on-one (when I PMed him); still helping the community in his own way. Anyway, I like this topic. :slight_smile:

Hey @anon41087856, I’m not quite following you on this issue. I confess I don’t understand the examples well; I would appreciate it if you could simplify them for me. I am under the impression that normal camera sensors have filter stacks which filter out almost all, if not all, UV and near-IR.

What is happening that is causing the massive shift in the primaries in your example?
Does RawTherapee do this kind of “stupid white balancing”?
What will these colours look like after being clipped to the working space? Will they be clipped to pure saturation/black/white depending on what they were?

How does this affect normal photography? Is this related to the issue of the extreme difficulty of photographing certain super saturated, dark blue or violet flowers and having the photo’s colours match what we see?

Beware of terminology. In normal usage, “white balance” changes pixels without changing the primaries (it is a simple multiplication of RGB channels), but “chromatic adaptation” does change the primaries. In this thread, when @anon41087856 writes “stupid white balancing” I think he means “chromatic adaptation by XYZ scaling”. But I could be wrong, of course.

See Chromatic adaptation - Wikipedia and Welcome to Bruce Lindbloom's Web Site
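Following Lindbloom’s write-up, the distinction looks like this in code: (a) is the simple per-channel multiplication, (b) is a Bradford chromatic adaptation, a full 3×3 matrix in XYZ that therefore moves the primaries (D50 → D65 used as an example):

```python
import numpy as np

# Bradford cone-response matrix (from Lindbloom's site).
M = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

xyz_src = np.array([0.96422, 1.0, 0.82521])  # D50 white
xyz_dst = np.array([0.95047, 1.0, 1.08883])  # D65 white

# (b) Bradford CAT: scale in cone space, then return to XYZ.
gain = (M @ xyz_dst) / (M @ xyz_src)
cat = np.linalg.inv(M) @ np.diag(gain) @ M
print(np.round(cat, 4))  # a non-diagonal matrix: the primaries move

# (a) By contrast, camera white balance is just np.diag([r_mul, g_mul, b_mul])
# applied in camera RGB, which leaves the camera primaries untouched.
```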

@samuelchia RT’s CIECAM02 module allows you to adapt. It can affect other modules such as Tonemap. There is also whatever @jdc has worked on or is working on, e.g. the newlocallab branch. ART may have this too (ask @agriggio).

Thanks @afre. I almost never use the CIECAM02 module in my editing, and I have never encountered the extreme effects @anon41087856 is showing. Or maybe they have somehow escaped my notice :S.

My understanding from reading Anders Torger’s writings on camera profiling is that adaptation does occur, but again, I’m not sure I have ever encountered such extreme distortion of colours relating to the camera’s sensitivity to UV light. There is something I’m clearly not understanding, which I can hopefully learn about.

@snibgo Yeah, the terminology is confusing for me. I see that @anon41087856 has a history of being pedantic about terminology (I don’t mean this in a negative way at all); I just wanted to understand what he is saying better. I looked at some of the past discourse and found his writing difficult to understand, as I imagine it might be for those who are missing one or more pieces of key knowledge. Obviously he is extremely knowledgeable and I would be happy to learn.

In digital, white balance is the dumbed-down GUI name for what chromatic adaptation does. Changing the pixels or changing the primaries is the same at the end of the day, since it’s only vector algebra. White balance is a CAT. In film, white balance was done by directly tuning the RGB filters, aka the primaries. Stupid white balancing is indeed XYZ scaling, aka the simplest yet fairly inaccurate way.

The massive shift happening is simply the consequence of a white balance/CAT. It depends, of course, on what the original illuminant of the scene is, but the same kind of distortion happens every time. If you have pixels in the deep blue-purple-UV region, they will look like fully saturated color blobs with no details or gradients. If you don’t have pixels in that region, you might not see it, but it still happens.

UV and IR filters don’t entirely remove them. But the main problem is the 3 primaries, which are the solution of a linear mapping problem (color checker RGB → XYZ) that doesn’t care about bounds. The camera primaries, stored in the input matrix, only care about moving camera RGB accurately within the central part of the visible locus, not about staying inside that locus, which would require the spectral goodness @ggbutcher is trying to unleash here.
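For the curious, that “linear mapping problem which doesn’t care about bounds” is just ordinary least squares. A toy numpy sketch (the patch values are synthetic; a real fit uses measured CC24/IT8 data):

```python
import numpy as np

rng = np.random.default_rng(0)
xyz_patches = rng.uniform(0.05, 0.9, size=(24, 3))  # "measured" patch XYZ (toy)

# Fake camera responses from an invented mixing matrix, standing in for
# what the sensor would record for those patches.
true_mix = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.8, 0.0],
    [0.0, 0.1, 1.1],
])
cam_patches = xyz_patches @ true_mix.T

# Ordinary least squares: minimize error on the patches, with no constraint
# keeping out-of-training colors (deep blue, near UV...) inside the locus.
A, *_ = np.linalg.lstsq(cam_patches, xyz_patches, rcond=None)
print(np.round(A.T, 3))  # recovered camera RGB -> XYZ matrix
```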

… and I turn to the right to regard the Rube Goldberg contraption perched on the corner of my desk and say, “you’d better clean up your act, buddy…” :smiley:

Actually, I have some data, so installment 2 is plausible within a week or so…