Color space of a DSLR

For many displays, we see the color space or gamut plotted on chromaticity charts in comparison with standard color spaces like Rec. 709, sRGB, Adobe RGB, and ProPhoto RGB, which gives an indication of the display device's color reproduction capability. The same is true for many popular printers.
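For anyone who wants to reproduce that kind of chart, here is a minimal Python/matplotlib sketch using the published CIE 1931 xy chromaticities of each space's primaries. The space names and primary values are standard; everything else is just illustrative plotting, and the horseshoe outline itself is left out for brevity:

```python
# Minimal sketch: plot the gamut triangles of standard RGB color spaces
# on the CIE 1931 xy plane, using their published primary chromaticities.
import matplotlib.pyplot as plt

spaces = {
    "sRGB / Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "Adobe RGB":      [(0.640, 0.330), (0.210, 0.710), (0.150, 0.060)],
    "Rec.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
    "ProPhoto RGB":   [(0.7347, 0.2653), (0.1596, 0.8404), (0.0366, 0.0001)],
}

for name, prims in spaces.items():
    xs, ys = zip(*(prims + [prims[0]]))   # repeat first point to close the triangle
    plt.plot(xs, ys, label=name)

plt.xlabel("CIE x")
plt.ylabel("CIE y")
plt.legend()
plt.title("Gamut triangles of standard RGB color spaces")
plt.show()
```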

But not so for a DSLR. Why, say, is the color gamut of a Nikon D300s never plotted on a chromaticity chart in comparison with standard color spaces like the above?

I think that is because the spaces you mention (Rec. 709, Rec. 2020, sRGB, Adobe RGB, ProPhoto RGB) are all standard color spaces, and most (inexpensive, ordinary) monitors don't actually cover the full space (especially Rec. 2020 and ProPhoto). I have also personally never seen actual printer color spaces (there are a few CMYK standards floating around, and I did see plots of those).

So just as you are unlikely to see color space comparisons of actual monitors (the Samsung XXX vs the Dell YYY) or printer models (HP ZZZ vs Epson XYZ), you are unlikely to see those of actual cameras (or scanners, for that matter). I'm not saying there aren't any, just that they will be hard to find.

(This is of course all IME and IMHO)

A quick search found an old article, "Understanding ProPhoto RGB", on Luminous Landscape that showed a plot of the Canon EOS 20D (released 2004) vs ProPhoto and others. It shows that camera's colour space to be a close match to ProPhoto. I would imagine that all modern sensors match or exceed this, and so it becomes pointless to compare them.

In ProPhoto RGB, two of the three primaries (I think green and blue) are imaginary. That means we cannot realize the ProPhoto space in reality using real primaries, even laser light sources. So can we think of the primaries of a DSLR as also being imaginary? I am not convinced.
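For what it's worth, the green primary at least can be checked numerically from two published facts: ProPhoto's green primary sits at xy = (0.1596, 0.8404), and the CIE 1931 spectral locus never rises above y ≈ 0.834 (near 520 nm). Any real light, laser included, is a convex mixture of locus points and so stays inside the horseshoe; nothing real can reach that primary. (The blue primary at (0.0366, 0.0001) similarly falls outside, below the line of purples.) A trivial sketch of the check:

```python
# Quick check that ProPhoto's green primary cannot be a real light.
# Facts used: the primary is at xy = (0.1596, 0.8404); the CIE 1931
# spectral locus peaks at roughly y = 0.834 near 520 nm, and real lights
# are convex mixtures of locus points, so they can never exceed that y.
prophoto_green_y = 0.8404
locus_peak_y = 0.834   # approximate maximum y of the 1931 spectral locus

print(prophoto_green_y > locus_peak_y)  # True -> the primary is imaginary
```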

To clarify, when I said "match or exceed this" I meant match or exceed the 20D.

Green and blue in ProPhoto are imaginary compared to human vision, but not to the real world. I don't think laser light has anything to do with it. An electronic device is certainly capable of detecting light outside of human vision; consider infrared photography, or why we use UV filters on 'normal' cameras.

You cannot define the gamut of a camera with a perfect triangle between three points like you can with displays; it'll end up being something horseshoe-shaped, like actual human vision (which has overlapping color channels, remember!).
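To illustrate why the overlap bends the shape, here is a hedged sketch: the Gaussian sensitivities below are hypothetical stand-ins for a real sensor's measured spectral response curves, not data from any actual camera. Sweeping monochromatic lights through them and plotting the responses in the camera's own raw chromaticity plane gives a curved locus that never reaches the pure-channel corners, not a triangle.

```python
# Hypothetical Gaussian channel sensitivities (NOT real sensor data),
# used only to show the effect of overlapping channels on gamut shape.
import numpy as np
import matplotlib.pyplot as plt

wl = np.linspace(400, 700, 301)                     # wavelengths in nm

def channel(center, width=45.0):
    """Hypothetical Gaussian spectral sensitivity for one channel."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

r, g, b = channel(600), channel(530), channel(460)  # overlapping curves

# Project each monochromatic response into a chromaticity-like plane.
total = r + g + b
plt.plot(r / total, g / total)
plt.xlabel("r / (r+g+b)")
plt.ylabel("g / (r+g+b)")
plt.title("Monochromatic locus of a hypothetical camera: curved, not triangular")
plt.show()
```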

Additionally, I’m not actually sure that modern cameras have better gamuts; the emphasis in recent years is on improving noise performance in low light.

What do you mean by "the real world"?

Colors are defined solely by how humans perceive them, so if there’s no way to actually stimulate the eye to see that color, it’s not real in any sense.

I am not defining the colour space of cameras as a perfect triangle. I am not defining them as anything. I am reporting that the article I quoted compared an actual plot of a 20D's sensor against ProPhoto, and they were a close match. Not an exact match.

I agree the term "colour" is entirely about human perception. By "real world", I mean electromagnetic waves, which are not restricted by human vision. The green and blue primaries in ProPhoto are considered imaginary because they lie outside the horseshoe of the CIE chromaticity diagram (the spectral locus), which is an approximation of human vision; they are not imaginary in the real world, because they can be measured by an electronic device, and probably by the eyes of a mantis shrimp.

I agree the emphasis of modern sensors is not on colour space. However, it is very unlikely that the colour space would be smaller in a modern sensor, and it was already a close match to ProPhoto 12 years ago. Which comes back to the question in the OP about why we don't see plots of camera sensors: because they are so large, it becomes pointless to compare them.

See Elle Stone's summary at the end of this article: http://ninedegreesbelow.com/photography/srgb-versus-photographic-colors.html
"A considerable portion of the camera input profile color gamut contains entirely imaginary colors"

As a side note, these imaginary colours can be pulled back into non-imaginary colours in post-production and can be used to bring out detail that we might not have seen with our naked eyes.
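One simple way that pullback can work, sketched below under loud assumptions: the 3x3 camera-to-sRGB matrix is made up (a real one comes from a camera profile), and real raw converters use more sophisticated gamut mapping than this. After the matrix, any pixel with a negative channel is outside the destination gamut; the sketch desaturates such a pixel toward its own luminance just enough to make all channels non-negative.

```python
# Minimal sketch of pulling an out-of-gamut color back into range.
# CAM_TO_SRGB is a hypothetical placeholder matrix, not a real profile.
import numpy as np

CAM_TO_SRGB = np.array([[ 1.7, -0.5, -0.2],
                        [-0.3,  1.6, -0.3],
                        [-0.1, -0.6,  1.7]])

def to_srgb_with_pullback(cam_rgb):
    rgb = CAM_TO_SRGB @ cam_rgb
    luma = np.array([0.2126, 0.7152, 0.0722]) @ rgb   # Rec.709 luminance
    lo = rgb.min()
    if lo < 0:                       # out of gamut: blend toward neutral
        t = lo / (lo - luma)         # smallest blend making channels >= 0
        rgb = (1 - t) * rgb + t * luma
    return rgb

print(to_srgb_with_pullback(np.array([0.1, 0.9, 0.2])))
```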

Even a pure wavelength will never stimulate only one color channel on a normal camera sensor. A camera's gamut therefore includes colors that can never be achieved, so it's also imaginary in the camera's own perception and not real in any sense. That's what is meant by the Nine Degrees Below quote you gave.
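A worked example, reusing the hypothetical Gaussian channels from the sketch further up (again, stand-ins, not real sensor curves): even a laser sitting exactly on the green channel's peak still registers on the other two channels, so the camera can never record a pure (0, 1, 0) triplet; the corners of its nominal gamut are unreachable from real light.

```python
# Monochromatic 530 nm light (the green channel's exact peak in this
# hypothetical model) still leaks into the red and blue channels.
import numpy as np

def channel(wl, center, width=45.0):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

r, g, b = (channel(530.0, c) for c in (600.0, 530.0, 460.0))
print(r, g, b)   # roughly 0.30, 1.0, 0.30 -- never (0, 1, 0)
```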

And usually, you’re not trying to pull unseen colors into the visible range unless you’re doing infrared or ultraviolet…

A student of @hanatos once measured the gamut of some cameras. I am not sure if I am allowed to share the bachelor's thesis, but the gamuts were all over the place, from "nice triangle" to "something blew up and created modern art".

Agreed.

I agree that a camera can capture imaginary colours, but not with the reasoning. The fact that one wavelength triggers multiple channels just means that the captured colour may be inaccurate; it doesn't imply it's imaginary.

Agreed, you’re usually not trying to pull in imaginary colours.