Glenn, I stand corrected! Matrix profiles can cause clipping (negative values), as in the quote from Anders Torger that I copied above. Silly me; I do apologise for spreading false information.
Cameras don't do gamut compression at all, since they even record "non-colors" outside of the visible spectrum. Yet these non-colors are encoded with valid RGB values (in [0 ; 1]). Then the white balance distorts the whole camera RGB range in a way that can produce even more non-colors.
The point of gamut mapping at the camera-profile stage is that you need to map things at the beginning of the pipeline too, in order to be sure that any RGB value you get before pushing pixels lies in the visible spectrum. Otherwise, good luck with non-white illuminants, blue LEDs and the like.
This:
in combination with this:
Should give a good Idea why SSFs can be quite useful at times I guess. Mitigating IR contamination for example probably should be done at the stage Camera-space to connecting-space(XYZ mostly). And I would be very surprised if a CC24 or IT8 target lets you derive near IR responses for that. I think the torger links provide some reflectance spectra of those targetsâŚ
edit: yes it is in the section âchoosing test targetsâ. All but two patches have a flat broadband response in the near IR part of what is shown, that would make the IR-rejection calculation prone to large errors I think.
The biggest issue is with UV and the whole blue-to-purple region. IR is less of an issue. See here, for my camera ("adapted" is after white balance; the dots are the respective white points):
(image from Troy Sobotka)
…what does it look like if you don't do the "clever" white balancing and just use XYZ multipliers? Shouldn't that scale the colours away from zero and leave the triangle of primaries as it is?
This is the stupid white balancing, which divides xyY data by the illuminant xy.
EDIT: this was the CIECAM02 adaptation. Stupid XYZ scaling gives this:
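For reference, a minimal sketch of what "stupid" XYZ scaling could look like: each channel is divided by the source white and multiplied by the destination white, with no cone-space transform involved. The white-point values are the standard ones for illuminants A and D65; the sample colour and function name are made up for illustration:

```python
import numpy as np

# "Stupid" white balancing as plain XYZ scaling: each XYZ channel is
# divided by the source white point and multiplied by the destination
# white point. No cone-space (LMS) transform is involved.
def xyz_scaling_cat(xyz, src_white, dst_white):
    xyz = np.asarray(xyz, dtype=float)
    return xyz * (np.asarray(dst_white, dtype=float) /
                  np.asarray(src_white, dtype=float))

# XYZ white points from standard tables, Y normalised to 1.
white_A   = np.array([1.09850, 1.00000, 0.35585])   # illuminant A
white_D65 = np.array([0.95047, 1.00000, 1.08883])   # illuminant D65

# Hypothetical sample colour measured under illuminant A, adapted to D65.
sample  = np.array([0.30, 0.25, 0.10])
adapted = xyz_scaling_cat(sample, white_A, white_D65)
```

By construction the source white lands exactly on the destination white, but every other colour is just channel-scaled, which is what makes this method simple and fairly inaccurate.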
See here the full comparison:
Wow. Okay, I'll have to think a bit about why the adaptations push the non-real primaries further away from the locus (and from the camera primaries). I would somehow have naively expected the other way round.
Why is Troy Sobotka not on this forum again?
Super interesting. The Blackmagic Pocket had a sensor without an IR filter stack, which caused quite a few problems (reddish blacks, colour shifts). 390 nm and below is a soft cutoff for many glass types, so sensors typically don't need additional UV filters. Still very interesting. Thanks.
The geometric interpretation of a white balance is a 3D shear mapping.
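One way to see the shear: a per-channel gain that is diagonal in a cone-like RGB basis becomes a matrix with off-diagonal (shear) terms once it is expressed in XYZ. A sketch, using the Bradford matrix purely as an illustrative cone basis and made-up gains:

```python
import numpy as np

# The Bradford XYZ->cone matrix, used here only as an example basis.
M_bradford = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

# Hypothetical per-channel white-balance gains, diagonal in cone space.
gains = np.diag([1.2, 1.0, 0.8])

# The same operation, seen from XYZ: M^-1 @ D @ M is no longer diagonal,
# i.e. it contains shear terms mixing the axes.
M_inv      = np.linalg.inv(M_bradford)
cat_in_xyz = M_inv @ gains @ M_bradford

off_diagonal = cat_in_xyz - np.diag(np.diag(cat_in_xyz))
```

The determinant is preserved by the change of basis (it equals the product of the gains), but the off-diagonal entries are non-zero, which is the shear-like distortion showing up geometrically.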
Long story.
Yup, got it!
edit: a whole slew of white-balance questions arose in my head. But those are for other posts.
I demonstrated that to myself a while ago, and posted results here:
Based on a workflow described here:
Troy, for all his excellent knowledge, could not conduct himself like an adult. He found it to be too difficult to communicate in a respectful manner.
So not really a long story at all. He had many, many chances.
Civility is a requisite here.
He was nice one-on-one (when I PMed him); still helping the community in his own way. Anyway, I like this topic.
Hey @anon41087856, I'm not quite following you on this issue. I confess I don't understand the examples well; it would be appreciated if you could simplify them for me. I am under the impression that normal camera sensors have filter stacks which filter out almost all, if not all, UV and near-IR.
What is happening that is causing the massive shift in the primaries in your example?
Does RawTherapee do this kind of "stupid white balancing"?
What will these colours look like after being clipped to the working space? Will they be clipped to pure saturation/black/white depending on what they were?
How does this affect normal photography? Is this related to the extreme difficulty of photographing certain super-saturated, dark blue or violet flowers and having the photo's colours match what we see?
Beware of terminology. In normal usage, "white balance" changes pixels without changing the primaries (it is a simple multiplication of the RGB channels), but "chromatic adaptation" does change the primaries. In this thread, when @anon41087856 writes "stupid white balancing", I think he means "chromatic adaptation by XYZ scaling". But I could be wrong, of course.
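The "simple multiplication of RGB channels" reading can be sketched like this; the grey-card values and the normalise-to-green convention are assumptions for illustration, not anyone's actual implementation:

```python
import numpy as np

# White balance in the everyday sense: a per-channel multiply of raw RGB,
# with gains chosen so a neutral patch comes out equal in R, G and B.
# The primaries themselves are untouched by this operation.
def apply_wb(rgb, neutral_patch):
    neutral_patch = np.asarray(neutral_patch, dtype=float)
    gains = neutral_patch[1] / neutral_patch   # normalise gains to green
    return np.asarray(rgb, dtype=float) * gains

# Hypothetical raw sensor response to a grey card under some illuminant.
neutral = np.array([0.5, 1.0, 0.7])
balanced_grey = apply_wb(neutral, neutral)
```

After the multiply, the grey card is neutral by construction; everything else is dragged along by the same three gains, which is exactly the diagonal operation the chromatic-adaptation discussion is about.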
See Chromatic adaptation - Wikipedia and Welcome to Bruce Lindbloom's Web Site
@samuelchia RT's CIECAM02 module allows you to adapt. It can affect other modules such as Tonemap. See also whatever @jdc has worked on or is working on, e.g. the newlocallab branch. ART may have this too (ask @agriggio).
Thanks @afre. I almost never use the CIECAM02 module in my editing, and I have never encountered the extreme effects that @anon41087856 is showing. Or maybe it has somehow escaped me without my knowing :S.
My understanding from reading Anders Torger's writings on camera profiling is that adaptation does occur, but again I'm not sure I have ever encountered such extreme distortion of colours relating to the camera's sensitivity to UV light. There is clearly something I'm not understanding, which I can hopefully learn about.
@snibgo Yeah, the terminology is confusing for me. I see that @anon41087856 has a history of being pedantic about terminology (I don't mean this in a negative way at all); I just wanted to understand what he is saying better. I looked at some of the past discourse and found his writing difficult to understand, and I imagine it might be for those who are missing one or more pieces of key knowledge too. Obviously he is extremely knowledgeable, and I would be happy to learn.
In digital, white balance is the dumbed-down GUI name for what chromatic adaptation does. Changing the pixels or changing the primaries is the same at the end of the day, since it's only vector algebra. White balance is a CAT. In film, white balance was done by directly tuning the RGB filters, i.e. the primaries. Stupid white balancing is indeed XYZ scaling, i.e. the simplest yet fairly inaccurate way.
The massive shift happening is simply the consequence of a white balance/CAT. It depends, of course, on the original illuminant of the scene, but the same kind of distortion happens every time. If you have pixels in the deep blue-purple-UV region, they will look like fully saturated color blobs with no details or gradients. If you don't have pixels in that region, then you might not see it, but it still happens.
UV and IR filters don't entirely remove UV and IR. But the main problem is the 3 primaries, which are the solution of a linear mapping problem (color checker RGB → XYZ space) that doesn't care about bounds. The camera primaries, stored in the input matrix, only care about moving camera RGB accurately into the central part of the visible locus, not about staying within that visible locus, which would require the spectral goodness @ggbutcher is trying to unleash here.
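The "doesn't care about bounds" point can be demonstrated with a toy least-squares fit: synthetic patch data stand in for real colour-checker measurements, and the fitted matrix happily sends a perfectly valid in-range camera RGB to negative (non-physical) XYZ. Everything here (the "true" matrix, noise level, patch count) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth camera->XYZ relationship plus measurement
# noise, standing in for real colour-checker patch measurements.
true_M = np.array([
    [ 0.7, 0.2,  0.1],
    [ 0.3, 0.9, -0.2],
    [-0.1, 0.1,  1.0],
])
camera_rgb = rng.uniform(0.05, 0.95, size=(24, 3))            # 24 "patches"
target_xyz = camera_rgb @ true_M.T + rng.normal(0, 0.005, (24, 3))

# Solve for the 3x3 input matrix in the least-squares sense:
#   target_xyz ~= camera_rgb @ M_fit.T
# Nothing in this fit constrains where out-of-training colours land.
M_fit, *_ = np.linalg.lstsq(camera_rgb, target_xyz, rcond=None)
M_fit = M_fit.T

# An in-range camera value (pure "blue" channel response) can still be
# mapped outside XYZ >= 0 by the fitted matrix.
extreme = np.array([0.0, 0.0, 1.0])
mapped  = M_fit @ extreme
```

The fit is accurate on the patches it was trained on, yet the extreme blue input comes out with a negative component: the matrix optimises accuracy in the middle of the locus and says nothing about staying inside it.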
… and I turn to the right to regard the Rube Goldberg contraption perched on the corner of my desk and say, "you'd better clean up your act, buddy…"
Actually, I have some data, so installment 2 is plausible within a week or so…