I just came across this news from Los Alamos National Laboratory: “Research formalizes definitions essential to understanding color perception.”
In short, a team of researchers may have solved a century-old problem first posed by Erwin Schrödinger. They’ve mathematically defined hue, saturation, and lightness as intrinsic geometric properties of perceptual color space, accounting for effects like the Bezold-Brücke shift (perceived hue changing with intensity).
Most of our current tone mappers, AgX in particular, approximate a natural look with hand-tuned curves that control how colors desaturate and shift hue as they get brighter. If this research provides a mathematically “perfect” neutral axis and a perceptual “shortest path” between colors, could it:
- Automate AgX Primaries? Instead of us tweaking sliders to avoid “the notorious six” color shifts, could the module use this new geometry to calculate the path to white automatically?
- Fix Gamut Mapping? Since the research proves color space is non-Riemannian, does this mean our current methods for handling out-of-gamut colors are fundamentally inefficient?
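To make the first point concrete, here is a toy sketch of the kind of hand-tuned “path to white” that tone mappers rely on today. This is *not* darktable’s actual AgX code; the function name, the luma estimate, and the `strength` parameter are all illustrative assumptions. The point is that the desaturation trajectory is an arbitrary designer choice, which is exactly what a perceptually derived geodesic could replace:

```python
# Toy sketch (NOT the real AgX implementation): a hand-tuned "path to white".
# As scene brightness rises, chroma is blended toward the neutral axis so
# bright colors desaturate smoothly instead of clipping to a display primary.
# The curve shape and the 'strength' exponent are arbitrary tuning choices.

def path_to_white(rgb, white=1.0, strength=2.0):
    """Desaturate a linear RGB triplet toward neutral as it brightens."""
    luma = sum(rgb) / 3.0                     # crude achromatic estimate
    t = min(luma / white, 1.0) ** strength    # 0 near black, -> 1 near white
    return tuple(c + t * (luma - c) for c in rgb)

# A dim saturated red is left almost untouched, while a bright one is
# pulled noticeably toward gray:
print(path_to_white((0.2, 0.02, 0.02)))
print(path_to_white((0.95, 0.1, 0.1)))
```

Every slider in the current AgX module is, in effect, reshaping `t` and the blend target here by hand. If the LANL geometry really does define a unique perceptual shortest path to white, that trajectory could in principle be computed instead of tuned.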
I’m curious to hear from the devs and color science enthusiasts here - could this lead to a more theoretically grounded version of AgX, or even a brand-new scene-referred module?