The Quest for Good Color - 1. Spectral Sensitivity Functions (SSFs) and Camera Profiles

This is the "stupid" white balancing, which divides the xyY data by the illuminant xy.

EDIT: this was the CIECAM02 adaptation. Stupid XYZ scaling gives this:


See here the full comparison:



Wow. Okay, I’ll have to think a bit about why the adaptations push the non-real primaries further away from the locus (and from the camera-primaries). I would have somehow naively expected the other way round.

Why is Troy Sobotka not on this forum again?

Super interesting. The Blackmagic Pocket had a sensor without an IR filter stack, which caused quite a few problems (reddish blacks, color shifts). 390 nm and below is a soft cutoff for many glass types, so sensors typically don’t need additional UV filters. Still very interesting. Thanks.

The geometric interpretation of a white balance is a 3D shear mapping.

Long story.
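One way to see the shear-like character mentioned above: a white balance that is diagonal in a cone-like space becomes a non-diagonal matrix when expressed back in XYZ, and its off-diagonal terms are what tilt the mapping. A minimal sketch (the Bradford matrix values are standard; the per-channel gains are made up for illustration):

```python
import numpy as np

# Bradford cone-response matrix (standard values, used here purely
# as an example of a sharpened cone-like space).
M_bradford = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

# An illustrative white balance: diagonal gains in the cone-like space.
D = np.diag([1.05, 1.00, 0.76])

# The same operation expressed in XYZ is no longer diagonal; the
# off-diagonal terms give the mapping its shear-like character.
M_adapt = np.linalg.inv(M_bradford) @ D @ M_bradford

off_diagonal = M_adapt - np.diag(np.diag(M_adapt))
```

With unit gains `M_adapt` collapses back to the identity, so everything non-diagonal comes from the white balance itself.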


Yup, got it! :+1:

edit: a whole slew of whitebalance questions ensued in my head. But those are for other posts. :sweat_smile:

I demonstrated that to myself a while ago, and posted results here:

Based on a workflow described here:

Troy, for all his excellent knowledge, could not conduct himself like an adult. He found it to be too difficult to communicate in a respectful manner.

So not really a long story at all. He had many, many chances.

Civility is a requisite here.


He was nice one-on-one (when I PMed him); still helping the community in his own way. Anyway, I like this topic. :slight_smile:


Hey @aurelienpierre, I’m not quite following you on this issue. I confess I do not understand the examples well; it would be appreciated if you could simplify them for me. I am under the impression that normal camera sensors have filter stacks which filter out almost all, if not all, UV and near-IR.

What is happening that is causing the massive shift in the primaries in your example?
Does RawTherapee do this kind of “stupid white balancing”?
What will these colours look like then after needing to be clipped to the working space? Will they be clipped to pure saturation/black/white depending on what they were?

How does this affect normal photography? Is this related to the issue of the extreme difficulty of photographing certain super saturated, dark blue or violet flowers and having the photo’s colours match what we see?

Beware of terminology. In normal usage, “white balance” changes pixels without changing the primaries (it is a simple multiplication of RGB channels), but “chromatic adaptation” does change the primaries. In this thread, when @aurelienpierre writes “stupid white balancing” I think he means “chromatic adaptation by XYZ scaling”. But I could be wrong, of course.

See Chromatic adaptation - Wikipedia and Welcome to Bruce Lindbloom's Web Site
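The distinction can be sketched in a few lines: plain white balance is just a per-channel multiplication, with the primaries left alone. The gains below are hypothetical, like the kind found in raw-file metadata:

```python
import numpy as np

def white_balance(rgb, gains):
    """Plain white balance: a per-channel multiplication in camera RGB.
    The primaries (the matrix later used to go camera RGB -> XYZ)
    are left untouched."""
    return rgb * gains

# Hypothetical raw multipliers (R, G, B), for illustration only.
gains = np.array([2.1, 1.0, 1.6])

# A neutral patch as the sensor saw it under the shoot illuminant...
neutral_raw = 1.0 / gains
# ...comes out equal R = G = B once the gains are applied.
balanced = white_balance(neutral_raw, gains)
```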


@samuelchia RT’s CIECAM02 module allows you to adapt. It can affect other modules such as Tonemap. See also whatever @jdc has worked on or is working on: e.g., the newlocallab branch. ART may have this too (ask @agriggio).

Thanks @afre. I almost never use the CIECAM02 module in my editing, and I have never encountered the extreme effects which @aurelienpierre is showing. Or maybe it has somehow escaped my notice :S.

My understanding from reading Anders Torger’s writings on camera profiling is that adaptation does occur, but again I’m not sure I have ever encountered such extreme distortion of colours relating to the camera’s sensitivity to UV light. There is something I’m clearly not understanding which I can hopefully learn about.

@snibgo Yeah, the terminology is confusing for me. I see that @aurelienpierre has a history of being pedantic (I don’t mean this in a negative way at all) about terminology; I just wanted to understand what he is saying better. I looked at some of the past discourse and found his writing difficult to understand, and I imagine it might be for others too, especially those missing one or more pieces of key knowledge. Obviously he is extremely knowledgeable and I would be happy to learn.

In digital, white balance is the dumbed-down GUI name for what chromatic adaptation does. Changing the pixels or changing the primaries is the same thing at the end of the day, since it’s only vector algebra. White balance is a CAT. In film, white balance was done by directly tuning the RGB filters, aka the primaries. Stupid white balancing is indeed XYZ scaling, the simplest yet fairly inaccurate way to do it.
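To make the XYZ-scaling variant concrete, here is a minimal sketch. The white points are the standard D65/D50 values; the sample color is made up for illustration:

```python
import numpy as np

# Standard white points in XYZ (Y normalized to 1.0).
D65 = np.array([0.95047, 1.00000, 1.08883])
D50 = np.array([0.96422, 1.00000, 0.82521])

def xyz_scaling(xyz, src_white, dst_white):
    """'Stupid' white balance: von Kries-style scaling applied directly
    in XYZ, i.e. a diagonal matrix of white-point ratios."""
    return xyz * (dst_white / src_white)

# A made-up sample color, for illustration only.
sample = np.array([0.30, 0.40, 0.50])
adapted = xyz_scaling(sample, D65, D50)
```

By construction the source white lands exactly on the destination white; everything else just gets dragged along the same three axis ratios, which is where the inaccuracy comes from.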

The massive shift happening is simply the consequence of a white balance/CAT. It depends, of course, on what the original illuminant of the scene is, but the same kind of distortion happens every time. If you have pixels in the deep blue-purple-UV region, they will look like fully saturated color blobs with no details or gradients. If you don’t have pixels in that region, then you might not see it, but it still happens.

UV and IR filters don’t entirely remove those wavelengths. But the main problem is the 3 primaries, which are the solution of a linear mapping problem (color checker RGB → XYZ space) that doesn’t care about bounds. The camera primaries, stored in the input matrix, only care about moving camera RGB accurately in the central part of the visible locus, not about staying inside that locus, which would require the spectral goodness @ggbutcher is trying to unleash here.
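The unconstrained fit described above can be sketched as a plain least-squares problem. The patch values and the "true" matrix below are synthetic stand-ins, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for measured data: 24 patches of camera RGB
# and a made-up "true" camera -> XYZ matrix, plus measurement noise.
true_matrix = np.array([
    [0.6, 0.2, 0.1],
    [0.3, 0.7, 0.1],
    [0.0, 0.1, 0.9],
])
camera_rgb = rng.uniform(0.05, 0.95, size=(24, 3))
reference_xyz = camera_rgb @ true_matrix + rng.normal(0.0, 0.01, size=(24, 3))

# Unconstrained least-squares fit of the 3x3 input matrix: it minimizes
# error on the patches and knows nothing about the visible locus, so the
# implied primaries are free to land outside it.
M, residuals, rank, _ = np.linalg.lstsq(camera_rgb, reference_xyz, rcond=None)
```

Nothing in `lstsq` penalizes out-of-locus primaries; only the patch errors matter, which is exactly the point being made about real input matrices.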

… and I turn to the right to regard the Rube Goldberg contraption perched on the corner of my desk and say, “you’d better clean up your act, buddy…” :smiley:

Actually, I have some data, so installment 2 is plausible within a week or so…

@ggbutcher , great post, I think I pretty much follow what you said, though not some of the follow-up posts which are def. beyond me!

Perhaps you’ve heard of Bill Claff; if not, he does a lot of measurement, frequently posts in the DPReview forums, and has a site -
I just had a quick skim but no obvious spectral data except for this -
Spectral Response of Bayer and X3

Looking forward to part 2!

How about this? It’s pretty cheap compared to others.


Yes indeed, much better…

I just captured Nikon D5600 spectra from my son’s new camera. I do enjoy the simplicity of two images vs. the dozens required for a monochromator campaign, but the lure of “lab-grade” still haunts… :laughing:


I’d like to make my own setup one day. I already own the spectrometer, but I don’t know if it can be used for anything besides monitor calibration (software-wise).

A spectrometer is probably my next major purchase. Thinking about its uses, in addition to monitor calibration:

  • Re-measure an old ColorChecker to update the calibration data
  • Make one’s own charts for specific colorations
  • Measure light spectra; IMHO more important lately with all the LED mixtures being promoted as “white”…