I suspect that most here are already aware of this, but it’s a good, easy-to-understand explanation.
It’s also a good virtual “bludgeon” if someone starts blathering about color science and camera “looks”.
(Aside: ever wonder why there’s an underscore character at the front of some raw filenames? If it’s there, it indicates that the embedded preview image is in the Adobe RGB (1998) color space; otherwise it’s in sRGB.)
I had no idea… the more you know. Thanks for sharing.
Edit: Those color deficiency stats are crazy. I had no idea 8% of men were color deficient.
Good link and nice to see a Foveon mention!
Here’s an image using raw unconverted data from one of mine:
None of that pixelated Bayer CFA stuff …
I am not sure I would be convinced by this post if I did not know these things already. The article is very thin on actual details, contains no image comparisons, and just glosses over key parts.
Also, I don’t know why he mentions hot mirrors / IR filters at all. AFAIK most of them cut off the spectrum quite sharply around 700 nm. For practical purposes, they are as “perfect” as they need to be.
From the spectral responses I have seen, most camera dyes are pretty decent approximations of the S and M cone responses (“blue”, “green”), but the L response is not matched very well: camera “red” channels cut off abruptly somewhere between 550 nm and 600 nm. But this is not problematic and can be corrected with a linear model in most cases.
In practice almost the same “neutral” image can be recovered from RAW files. And from then on, it is indeed just processing.
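To make the “linear model” point concrete, here’s a minimal sketch of what a raw converter’s color step boils down to: white balance, then a single 3x3 matrix. The matrix values and white-balance gains below are made up purely for illustration (a real converter derives them per camera from spectral or color-chart measurements), and the tone/gamma curve would come after this step.

```python
import numpy as np

# Illustrative 3x3 color correction matrix (camera RGB -> linear sRGB).
# These numbers are invented for the example; real converters ship
# per-camera matrices derived from spectral or color-chart measurements.
CCM = np.array([
    [ 1.80, -0.60, -0.20],
    [-0.25,  1.45, -0.20],
    [ 0.05, -0.55,  1.50],
])  # each row sums to 1.0 so neutral gray stays neutral

def camera_rgb_to_srgb_linear(raw_rgb, wb_gains=(2.0, 1.0, 1.6)):
    """Apply white balance, then the linear color correction matrix.

    raw_rgb:  (..., 3) array of demosaicked, linear camera RGB in [0, 1].
    wb_gains: per-channel multipliers (assumed here; normally read from
              the raw file's metadata).
    """
    balanced = np.asarray(raw_rgb) * np.asarray(wb_gains)
    srgb_linear = balanced @ CCM.T
    return np.clip(srgb_linear, 0.0, 1.0)

# A neutral patch stays neutral after correction:
print(camera_rgb_to_srgb_linear([0.25, 0.5, 0.3125]))  # -> [0.5 0.5 0.5]
```

Because the color part is just this one linear transform, two sensors with quite different dyes can come out of the converter looking nearly identical, which is the point: the “look” lives mostly in the conversion, not in the silicon.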
There’s a saying in academic writing that, with a panel of three (male) reviewers, you’ve got roughly a one-in-four chance that at least one of them has a color vision deficiency (at ~8% prevalence, 1 − 0.92³ ≈ 22%). So you’d better make sure not to graph results in red vs. green, and always differentiate things by more than just hue.
(IIRC I read this first in Kovesi’s beautiful paper Good Colour Maps: How to Design Them)
From 1974 to 1990 I worked for a publisher of pictures, mostly greeting cards. Part of my work (until I sidestepped into self-taught computer systems management) was to order colour separations (for litho printing) and to accept/reject/comment on the proofs.
I think I was once quite good at colour matching and assessment. Maybe I was always wrong about that, or maybe these things just change with time. But I was aware that, over the years, I was finding it harder, and feedback from respected colleagues said I was getting worse rather than better at it. And I hate those graphs with similar-hue lines!
Now I am a keen photographer, working with darktable and editing raws. I try very hard to get people’s skin tones something like right. But I don’t even have “the original” by my side when I’m doing it. I try to eliminate the peculiar purple or orange casts that weird light can give to dark skin [as seen by the camera]. I absolutely know that I do not have the colour memory to get their clothes “right.”
But hey, I am not running a portrait studio; I’m photographing musicians on stage. I aim for “realishness” and a nice picture, not realism. And I don’t get many complaints (and also, hey, the pics are freely given anyway).
Back in my publishing days I learned how many people think they can remember and judge colour accuracy. And no: unless you are looking at the original, under the same light, most of us cannot.
Kasson’s main point is that the raw converter, more than the sensor, is responsible for color accuracy.
There was a similar misconception in the Foveon world when the F20 “Merrill” sensor was introduced. At first, folks marveled at the detail and/or microcontrast, but that slowly morphed into “too sharp”, “halos”, na-ni-na, with the complaints mainly aimed at “the sensor”.
Now it is generally acknowledged that the problem lies with the proprietary raw converter, Sigma Photo Pro v5, which for some reason applied much more sharpening than previous versions. I have tested that sensor’s raw output edge spread response and, like most sensors, it’s slightly soft.
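For anyone curious how “edge spread response” gets quantified, here’s a rough sketch of the kind of measurement I mean (a simplified illustration, not the exact procedure; a proper test would use a slanted-edge target and average many rows): normalize a profile taken across a dark-to-bright edge in the raw data and measure the 10–90% rise distance in pixels.

```python
import numpy as np

def rise_10_90(profile):
    """Estimate the 10-90% rise distance (in pixels) of an edge profile.

    profile: 1-D array of linear intensity values sampled across a
    dark-to-bright edge, assumed monotonic across the transition
    (e.g. one row of a raw crop containing a vertical edge).
    """
    p = np.asarray(profile, dtype=float)
    lo, hi = p[:5].mean(), p[-5:].mean()   # flat regions on either side
    norm = (p - lo) / (hi - lo)            # normalize to 0..1
    x = np.arange(norm.size)
    x10 = np.interp(0.1, norm, x)          # position of the 10% crossing
    x90 = np.interp(0.9, norm, x)          # position of the 90% crossing
    return x90 - x10

# Synthetic example: an edge that ramps up over ~4 pixels ("slightly soft").
edge = np.concatenate([np.zeros(20), np.linspace(0.0, 1.0, 5), np.ones(20)])
print(f"10-90% rise: {rise_10_90(edge):.2f} px")  # ~3.2 px
```

A few pixels of spread in the raw data is exactly the sort of mild softness that a converter’s default sharpening is meant to counteract, and that SPP5 evidently over-corrected.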
As to color, the Foveon needs far more color correction than Bayer CFA sensors (and probably X-Trans), so Kasson’s point is certainly true for Foveon sensors.