Wayland color management

Color management support is slowly moving forward in weston, and once it is somewhat usable it’s up to clients to make use of it and give feedback.
I’ve basically changed my mind about color measurement and think that we need another wayland protocol for it (@dutch_wolf @fhoech we could try again to come up with something).
In regards to HDR there is nothing of substance yet. The only thing I managed to do is convince myself that all proposals out there are not great and that there is lots of information that’s just not publicly available. To move this forward I either need HDR-capable hardware to do some tests myself (which is unlikely to happen with my current financial situation) or the help of people who do have access to more information (I’m planning on hitting up a few people at amd, intel and nvidia).


brightness/contrast adjustments (compensating for Stevens & Bartleson–Breneman effects) need to know the surround / display luminance ratio, and as such need to be fully separated from the white luminance scaling,

Isn’t that also true when you control the “analog” backlight? You just change the luminance until the brightness is in the right ballpark.

artificially limiting the peak luminance through the OETF means you will lose at least half of your encoding bit bandwidth

This would not limit the values you send to the monitor; it would rather change their meaning. In a bright room a value of 1 might be reference white, while in a dark room 0.7 is that same reference white and the values greater than that can be used for highlights. If your source material only encodes up to reference white it would just clip there, essentially limiting the luminance.
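Roughly, as a sketch in C (the 0.7 is just the example number from above, nothing normative):

```c
/* Sketch: map a value (0.0 = black, 1.0 = reference white for SDR sources,
 * possibly > 1.0 for HDR sources) to the signal sent to the display.
 * The compositor picks where reference white sits depending on the room:
 *   reference_white = 1.0 -> bright room, SDR white uses the full range
 *   reference_white = 0.7 -> dark room, 0.7..1.0 is left for highlights */
float map_to_signal(float value, float reference_white)
{
    float signal = value * reference_white;

    /* Material that only encodes up to reference white (value <= 1.0) never
     * reaches the headroom, so its peak luminance is effectively limited;
     * HDR material can go above 1.0 and use the range up to the clamp. */
    if (signal > 1.0f)
        signal = 1.0f;

    return signal;
}
```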

Does that make more sense?

(btw, I enjoyed your article about the filmic hdr mapper)

I think you’re being a bit too harsh here. They’re using wayland to support android apps on chromium os and they’re basically matching the android API which simply assumes the display is in one of the specified color spaces. It’s not that bad of an assumption when you only deal with applications to consume content.

Seeing how hard it’s going to be to properly support this I have sympathy.

I don’t think I am overly harsh, if it was indeed what you say it is I would be happy to propose renaming it to “color management light” or some such but it is very, very flawed. To begin with it should be impossible to set primaries (+ white point) separately from the matrix those are depended variables secondly the only feedback (especially considering the above point) is via wl_display::error which is fatal so it is impossible for an application to figure out which color spaces are actually supported without dying (note that there is no event telling the application what the compositor supports or what the advisable color space is).
The above means that this protocol is fundamentally broken and I would say unless the compositor implements all combination (yes even the nonsensical ones like rec.2020 primaries with sRGB conversion matrix) the protocol can’t be fully implemented.


Not disagreeing that the protocol is fundamentally broken as a general-purpose cm protocol, but they’re limiting the scope to their own system with the intention to eventually use whatever gets standardized. Really nothing to worry about.

I would say it is even broken as an application-specific protocol: separating out matrix vs. primaries still doesn’t make sense, and the final result is an overly fragile protocol[1]. Also, I didn’t read the original mail as “this is application specific” but as “here is something quick we have for implementing color management”, and if more people read it the second way there is a risk that they implement this protocol instead of an actual good one, making the eventual transition to a good one harder.


[1] A more useful protocol would have a single enum for the color spaces, an event that tells the client which color spaces are valid, and a request to set a color space for the surface. This is a bit more maintainable, since you only have to worry about keeping a single enum in line instead of 3 (2 of which convey the same information, with the third being redundant as well if you stick to standard color spaces).
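For illustration only, a client-side C sketch of roughly that shape (hypothetical names, not an existing protocol):

```c
/* One enum for the color space, instead of separate primaries, white point
 * and matrix that all have to be kept consistent with each other. */
enum cm_color_space {
    CM_COLOR_SPACE_SRGB,
    CM_COLOR_SPACE_DISPLAY_P3,
    CM_COLOR_SPACE_BT2020_PQ,
};

struct cm_surface;

/* Event: the compositor advertises the color spaces it actually supports
 * (and could also suggest a preferred one), so the client can pick a valid
 * value instead of guessing and dying on a fatal wl_display::error. */
struct cm_surface_listener {
    void (*supported_color_space)(void *data, struct cm_surface *surface,
                                  enum cm_color_space color_space);
};

/* Request: set one of the advertised color spaces on the surface. */
void cm_surface_set_color_space(struct cm_surface *surface,
                                enum cm_color_space color_space);
```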


Don’t worry. There is a standardization process in place for wayland-protocols now and as long as the protocol is not proposed to get in there I don’t see the point in spending any more time on this.

What is the status here?

This would work for OLEDs I guess, but not for regular panels with a backlight.


lots of work to do. upstreaming the color management branch in weston, figuring out how to port toolkits (gtk, qt), figuring out the protocols for hdr and color measurement, …

if anyone wants to help I can give some pointers but I’m doing this in my spare time when I want to

It would work for all panels but it would indeed work better on self-emissive panels.


But given you still use 8-bit (or 10-bit if lucky) integers, it means that you use 178 code values (0.7 × 255 ≈ 178) to encode an image from black to white, instead of 255 (which is not much already). I don’t think it’s a good idea, for the sake of smooth and evenly-spaced gradients.

You are mixing different concepts here. Luminance is a physical absolute measure in cd/m² (or a relative measure expressed in percent of the medium white luminance, but that’s just another way to write down the same concept).

Brightness is a psychophysical product of the contrast and mid-tones in an image viewed in a certain surround.

The hardware backlight controls the luminance and scales the luminance of the image uniformly (a simple multiplication by a constant). To compensate for the Stevens and Bartleson–Breneman effects and such, you need to apply a “gamma-like” (exponent) transfer function that raises the mid-tones more or less, i.e. rescales the luminance non-uniformly.

So, that brightness transform, which keeps black and white roughly where they are but raises mid-tones more or less depending on surround, belongs to the CMS and software. But the luminance adjustment, which scales the whole range by moving the physical amount of light up or down, belongs to the hardware, and I don’t think it’s a good idea to correct it in a discretized integer color space, because of quantization issues.
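As a rough sketch of the distinction, for linear light values in 0..1 (the exponent is purely illustrative):

```c
#include <math.h>

/* Luminance adjustment: uniform scaling of the whole range, which is what a
 * hardware backlight does (black and white move by the same factor). */
float adjust_luminance(float linear, float factor)
{
    return linear * factor;
}

/* Brightness / surround compensation: a gamma-like transfer function that
 * keeps black (0) and white (1) in place but raises or lowers the mid-tones
 * (Stevens / Bartleson-Breneman); exponent < 1 raises them, > 1 lowers them. */
float adjust_brightness(float linear, float exponent)
{
    return powf(linear, exponent);
}
```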

As far as I can tell, an adjustment such as @swick is proposing here is mostly useful for mixing SDR and HDR content, where the HDR content still uses the full 10 bits (or more; AFAIK there are no HDR screens with less than 10 bits) but a limit is placed on the maximum luminance of the SDR content that is mixed in. From the documentation Apple provides, macOS will be doing something quite similar (see here under “Understanding EDR pixel value”): SDR content uses values in the range [0.0, 1.0] while HDR content uses the range [0.0, MAX_EDR].

Note that in the above such a limiter is only for SDR content while the monitor is in an HDR mode and thus preferably also displaying HDR data at the same time.
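A hedged sketch of how the per-pixel mixing could look under that model (the names and the exact mapping are my own assumptions, not Apple’s code):

```c
/* SDR pixels live in 0.0..1.0, HDR pixels in 0.0..max_edr, where max_edr is
 * the headroom above SDR reference white that the display can show right now.
 * Dividing by max_edr maps everything into the 0..1 signal encoded for the
 * display, so SDR content tops out below the peak while HDR content can use
 * the full range. */
float compose_pixel(float value, int is_hdr, float max_edr)
{
    if (!is_hdr && value > 1.0f)
        value = 1.0f;   /* SDR content is limited to reference white */

    float signal = value / max_edr;
    if (signal > 1.0f)
        signal = 1.0f;  /* clamp HDR values beyond the display's current peak */

    return signal;
}
```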

ok, in that case, it makes sense.

At this point I’m very tempted to just use Apple’s EDR concept and see how far it gets us. For non-HDR displays we would just have EDR=1.0 and let the “brightness” setting adjust the backlight while for HDR displays the brightness setting just shifts EDR up or down. Maybe it turns out we can use the EDR shifting for brightness control on non-HDR displays when we add dithering… might be worth testing.

Wouldn’t that destroy contrast ratio at non-maximum brightness? E.g. if a monitor has a decent contrast of 1500:1 at its maximum brightness of 300 cd/m², the contrast would become 400:1 at the 80 cd/m² of sRGB. Not to mention the needlessly high power consumption.


The whole point here is that you could use the headroom for HDR highlights.

I’m not sure if I understand. When you double the luminance of a display via the backlight control does it really result in both min and max luminance to double? Is the same true for self-emissive displays? I would assume that it’s not the case for self-emissive (just like limiting the values instead of scaling with a backlight). If it’s not uniformly scaling there would indeed be a change in the contrast but I wonder if it’s noticeable and how/if self-emissive displays and the EDR system from apple handle that.

That’s in my opinion the most convincing argument against Apple’s approach, unless the monitor has really a good contrast. Banding is IMHO not an issue, because the display-encoded values get scaled less than the output luminance values (due to the power-like display transfer function).

However, I have also been thinking about the above idea for what I would call “poor-man’s approach for pseudo-HDR displays”.

First of all, by “pseudo-HDR display” I mean a display that has the typical sRGB-like response curve and gamut, but has a maximum brightness that significantly exceeds the 100 nits of a “typical” SDR display. For example, my laptop has a display that more or less covers the sRGB gamut and has the sRGB transfer function, but has a maximum luminance of almost 400 nits.

In such a case, my “poor-man’s” approach would be to start from a rendering of the final image in linear RGB at 32-bit floating-point precision, scale the mid-gray value to 18 nits / MAX nits (18/400 in the case of my laptop), then apply the display’s ICC profile, and finally convert to the display’s bit depth. “Diffuse white”, defined as ~5x mid-gray, would then always end up at roughly 100 nits.
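Something like this, as a sketch of just the scaling and quantization steps (the ICC part is a placeholder, and the 18% mid-gray / ~5x diffuse white figures come from the description above):

```c
#include <math.h>
#include <stdint.h>

/* Scene-referred linear value with mid-gray at 0.18, display peak max_nits
 * (e.g. 400). Scaling by 100/max_nits puts mid-gray at 18/max_nits, so
 * diffuse white (~5x mid-gray) ends up around 100 nits on the actual panel. */
uint8_t encode_for_display(float linear, float max_nits)
{
    float scaled = linear * (100.0f / max_nits);

    /* ...here the display's ICC profile would be applied, still in 32-bit
     * float... The lines below are just a stand-in sRGB-like gamma before
     * quantizing to the display's bit depth (8 bits here). */
    float clamped = fminf(fmaxf(scaled, 0.0f), 1.0f);
    float encoded = powf(clamped, 1.0f / 2.2f);
    return (uint8_t)lroundf(encoded * 255.0f);
}
```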

Does it make sense?


For backlit (non-self-emissive) displays, definitely. You can see it in tftcentral’s “contrast stability” tests for example: Dell S2716DG Review - TFTCentral

If you kept that monitor at maximum brightness, its black point would be at 0.36 cd/m², giving you a very mediocre contrast ratio of 220:1 if you set its white luminance to 80 cd/m² by software limiting. Whereas if you did that by controlling the physical backlight, you would bring the black point down as well, and you would maintain the ~900:1 contrast ratio that the monitor appears to have throughout its brightness range.
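(With the numbers above: 80 / 0.36 ≈ 220:1 when only the software limits white, versus roughly 80 / 0.09 ≈ 900:1 once the backlight itself is turned down.)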

In addition, it would consume 56 W, twice what it consumes when its backlight is set to 120 cd/m². Speaking of the backlight, it would most probably wear out faster, too.
