Evaluate hue shifts the scientific way

Most real-world photos have been converted (by the camera and its input processing) into colorimetric images, corresponding to how humans perceive them.
(This is implied by the specification or assumption of the colorspace they are encoded in.)

Incorrect. Most 3D renderers use simplified “RGB” color calculations that don’t correspond at all to the interactions of light in the real world. This is done for speed. There are exceptions in which true spectral rendering is used, and there are also spectral cameras and colorimetric cameras that can more accurately capture real-world color.
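To make that concrete, here’s a small numpy sketch (toy Gaussian sensitivities, not real CIE data — everything in it is a made-up illustration): two reflectances that are metameric under flat light produce identical triplets under a naive per-channel RGB multiply, but integrate to different triplets once the lighting is actually computed spectrally.

```python
import numpy as np

# Toy demonstration (synthetic sensitivities, NOT real CIE data) of why
# per-channel "RGB" light transport differs from spectral light transport.
wl = np.linspace(400, 700, 31)                 # wavelength bands, nm

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Hypothetical camera sensitivities (3 x 31 matrix).
cmf = np.stack([gauss(600, 40), gauss(550, 40), gauss(450, 40)])

def to_rgb(spectrum):
    """Integrate a spectrum against the toy sensitivities."""
    return cmf @ spectrum

# Build two metameric reflectances: identical toy-RGB under flat light.
# r2 = r1 + v, with v chosen orthogonal to every sensitivity curve.
r1 = 0.5 + 0.3 * gauss(500, 60)
spike = gauss(480, 10)
q, _ = np.linalg.qr(cmf.T)                     # orthonormal basis of row space
v = spike - q @ (q.T @ spike)                  # project spike onto null space
r2 = r1 + 0.5 * v

illum = gauss(480, 50)                         # a non-flat illuminant

rgb_flat_1, rgb_flat_2 = to_rgb(r1), to_rgb(r2)          # identical
spec_1, spec_2 = to_rgb(r1 * illum), to_rgb(r2 * illum)  # these differ
naive_1 = to_rgb(r1) * to_rgb(illum)           # per-channel multiply
naive_2 = to_rgb(r2) * to_rgb(illum)           # cannot tell them apart
```

The naive multiply only ever sees the (identical) integrated triplets, so it can never reproduce the spectral result.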

This is a poor way to do it, since Yxy is a very perceptually non-uniform space.
At least use L*a*b* or L*C*h*.
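For instance, a hue shift measured in L*C*h* is just the difference of hue angles after the standard CIE 1976 XYZ → L*a*b* conversion. A minimal sketch, assuming a D65 white point (the `hue_shift_deg` helper name is mine, purely illustrative):

```python
import math

# Standard CIE 1976 L*a*b* formulas; D65 white point assumed.
D65 = (95.047, 100.0, 108.883)

def xyz_to_lch(X, Y, Z, white=D65):
    def f(t):
        d = 6 / 29
        return t ** (1 / 3) if t > d ** 3 else t / (3 * d * d) + 4 / 29
    fx, fy, fz = (f(v / n) for v, n in zip((X, Y, Z), white))
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b = 200 * (fy - fz)
    C = math.hypot(a, b)
    h = math.degrees(math.atan2(b, a)) % 360   # hue angle, degrees
    return L, C, h

def hue_shift_deg(xyz1, xyz2):
    """Smallest angular difference between the two LCh hue angles."""
    h1 = xyz_to_lch(*xyz1)[2]
    h2 = xyz_to_lch(*xyz2)[2]
    d = abs(h1 - h2) % 360
    return min(d, 360 - d)
```

Because L*a*b* is (roughly) perceptually uniform, equal hue-angle differences here correspond far better to equal perceived hue shifts than angles measured in the xy plane do.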

But if you went to the trouble of investigating, you would find there is extensive literature on approaches to quantifying perceptual image differences; such techniques have been widely used in image compression, among other fields.
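The simplest of those measures is the CIE76 color difference ΔE*ab, which is just Euclidean distance in L*a*b* (a ΔE of roughly 2.3 is often quoted as a just-noticeable difference). Averaged per pixel it already gives a crude perceptual image-difference metric — a toy sketch, with images represented as lists of (L, a, b) tuples:

```python
import math

# CIE76 colour difference: Euclidean distance in L*a*b*.
def delta_e76(lab1, lab2):
    return math.dist(lab1, lab2)

def mean_delta_e(img1, img2):
    """img1, img2: equal-length lists of per-pixel (L, a, b) tuples."""
    assert len(img1) == len(img2)
    return sum(map(delta_e76, img1, img2)) / len(img1)
```

Later refinements (CIE94, CIEDE2000) correct for the remaining non-uniformities of L*a*b*, but the idea is the same.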

Answer: when they hit your retina. If what you are concerned with is imagery taken for human purposes, then that’s the only thing that counts in the end.

Your respect for others’ expertise, experience and opinions is simply breathtaking. But is that likely to make others respect you?


Thanks for adding the detail, I was far too brief 🙂

That’s what I was questioning - are we trying to measure something perceptual or not? The aim here appears to try to avoid it.

disclaimer: what I’ve read so far comes from “Measuring Colour” (it’s actually quite a to-the-point book), which I was led to by a PDF here, I think linked by troy s. I know very little of this subject…

The question is why is @anon41087856 trying to avoid using measures of perceived hue shifts?

What makes @anon41087856 think a polar transform of xyY is a good color space for measuring hue shifts? What makes him think it’s a better color space than LCh?

@anon41087856 is getting nice low numbers using his metric. But what do those numbers actually mean?


@anon41087856 seems very confused about what he’s doing.

There is light, quantified by Radiometry and wavelength, and there is perceived brightness and color, quantified by Photometry. Both can be quantified objectively, the former using physics, the latter using Color Science.
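The bridge between the two is well defined: a photometric quantity is the corresponding radiometric quantity weighted by the CIE photopic luminosity function V(λ) and scaled by 683 lm/W. A tiny sketch using a few tabulated V(λ) values:

```python
# Radiometry -> photometry: luminous flux is radiant power weighted by
# the CIE photopic luminosity function V(lambda), scaled by 683 lm/W.
# A few tabulated photopic V values (wavelength in nm):
V = {510: 0.503, 555: 1.000, 610: 0.503}

def luminous_flux(watts_by_wavelength):
    """watts_by_wavelength: {wavelength_nm: radiant power in W} -> lumens."""
    return 683 * sum(V[wl] * w for wl, w in watts_by_wavelength.items())

# 1 W of monochromatic 555 nm light is 683 lumens; 1 W at 510 nm or
# 610 nm looks only about half as bright, despite equal radiant power.
```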

There is no halfway house though. As soon as you are dealing with non-scientific cameras producing RGB images, or calculating XYZ values and anything derived from them, or talking of hue and saturation, you are implicitly in the world of Color Science, and dealing with perceived quantities. So talking about something based on Photometry and Color Science but not perceptual is nonsense. The best you can do is be objective, basing things on well established and quantified understandings of how humans perceive light and color.
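Even “just RGB” makes this explicit: the sRGB specification defines its values by a transfer function plus a fixed matrix to CIE XYZ (D65), so any sRGB triplet is already a colorimetric, i.e. perceptual, quantity. A sketch of that standard decoding:

```python
# The standard sRGB -> XYZ (D65) conversion, per IEC 61966-2-1.
SRGB_TO_XYZ = (
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
)

def srgb_to_xyz(r, g, b):
    """r, g, b: encoded sRGB values in [0, 1] -> CIE XYZ (Y = 1 for white)."""
    def decode(u):  # sRGB transfer function -> linear light
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4
    lin = [decode(c) for c in (r, g, b)]
    return tuple(sum(m * c for m, c in zip(row, lin)) for row in SRGB_TO_XYZ)
```

The matrix rows are themselves derived from the CIE standard observer, so there is no way to use sRGB values without implicitly invoking Color Science.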

Since (as far as I can tell!) everything on pixls.us is concerned with imagery meant for human consumption, non-perceptual measures that would be useful in other fields, such as hyperspectral imaging for instance, aren’t going to be of much interest here.
