Wayland color management

GNOME has its own implementation of it, so that’s in no way any better.

I think the point was that precisely because it’s not available everywhere (i.e. not part of a standardized protocol), bad “solutions” like gamma-control.xml pop up.

The ‘vcgt’ contents and the hardware are not necessarily supposed to align: a ‘vcgt’ tag usually contains 3x256 16-bit unsigned integer values (other integer formats and counts are possible, but much less common), which can then (on systems where this is implemented) be loaded via an API that usually doesn’t expose any hardware internals. This may lead to quantization to a different bit depth (likely < 16 bit), which can introduce artifacts like banding on otherwise smooth gradients. The proper way for a hardware solution to handle this is to apply the high-bit-depth LUT to its limited-bit-depth frame buffer using dithering (that’s the way AMD hardware has been doing it for over a decade). This could also be done in a shader if the hardware doesn’t otherwise support applying LUTs with dithering.
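Just to illustrate the idea (this is a minimal Python/NumPy sketch, not any compositor’s or driver’s actual code, and the function name is made up): the 16-bit ‘vcgt’ curves are applied at full precision, and only at the very end is the result quantized to the output bit depth, with sub-LSB noise added so the quantization error doesn’t collect into visible bands.

```python
import numpy as np

def apply_vcgt_with_dither(image_8bit, vcgt_lut, out_bits=8, rng=None):
    """Apply per-channel 3x256 16-bit 'vcgt' curves to an 8-bit RGB image,
    then quantize to `out_bits` with random dithering to avoid banding.

    image_8bit: uint8 array of shape (H, W, 3)
    vcgt_lut:   uint16 array of shape (3, 256), one curve per channel
    """
    rng = rng or np.random.default_rng()
    out_levels = (1 << out_bits) - 1

    # Look up each channel through its curve at full 16-bit precision.
    high_precision = np.empty(image_8bit.shape, dtype=np.float64)
    for c in range(3):
        high_precision[..., c] = vcgt_lut[c][image_8bit[..., c]] / 65535.0

    # Scale to the output bit depth and add sub-LSB noise before rounding,
    # so quantization error is spread out instead of forming smooth-gradient bands.
    scaled = high_precision * out_levels
    dithered = scaled + rng.uniform(-0.5, 0.5, size=scaled.shape)
    return np.clip(np.round(dithered), 0, out_levels).astype(np.uint8)
```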

First one to show me that this is a real problem and not an imaginary one gets a crate of beer (or other beverage of their choice) :slight_smile: This could have been a problem back in the days when analog (e.g. VGA) connections were the norm, because changing the video card (edit: or even just the cable) could affect the signal (after D/A).