Wayland color management

I think the best bet to figure this all out would be to have an HDR screen on hand (preferably a FreeSync 2 one) to see what we can do with it. Although since it will be tech-dependent, we probably need more than one HDR screen to try out all the possibilities.

All in all I think it would indeed be best to focus for now on proper full compositor color management, since about 80–90% of the infrastructure needed for HDR support will come with it.


Hi,

so, can someone explain in one or two sentences why Wayland does not support color management? Or why it is difficult to implement color management in Wayland?

Thanks in advance

b

My understanding: because they don’t want to provide an API for the profiling tool to reliably show a colored rectangle in the middle of the screen, an API to apply calibration curves globally, or an API to know on what monitor your application window is being displayed.


The difficulty mostly lies in the fact that the original core design doesn’t take any color management into account, so it has to be an add-on protocol. For the core application protocol we have something that most people agree on, but that is only half of what is needed for proper color management, since a profiling/calibration protocol is also required. The problem here is that in the Wayland world (at least in the core parts; compositors are free to do their own thing to a certain extent) there is a big no on giving applications any kind of direct HW access, and that includes the calibration curves. There is currently a big disagreement on how to handle this and, to be honest, I have been a bit burned out by the discussions (since those don’t seem to go anywhere).


OK, thanks.
So, to sum up: at least one application protocol is missing (whatever that is).

The Linux Kernel 5.3 will have HDR support for some Intel chips:
https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/commit/?id=417f2544f48c19f5958790658c4aa30b0986647f

ToDo:

  • Wayland support for HDR
  • Toolkit support for HDR
  • Application support for HDR

Is there any documentation on how the DRM HDR bits work? (If I ever get myself an HDR screen, it might be interesting to play with the HDR modes from the console directly to try stuff out regarding measuring/profiling.)

Anyway, this reminds me that I found some documentation a while back on how Apple is planning to do HDR; it can be found here: https://developer.apple.com/documentation/metal/presentation_objects/displaying_hdr_content_in_a_metal_layer

Some observations:

  • There are three options:
    1. Mastered content (already in rec2020-{pq,hlg})
    2. Non-mastered content that wants to use the system’s tone mapping
    3. Non-mastered content that wants to do its own tone mapping (e.g. when speed is important or when using a reference display)
  • It’s heavily tied to the built-in color manager of macOS
  • Options 2 and 3 are only available when using Metal; option 1 can also be done with AVFoundation (I think)
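For reference, option 1 assumes content already encoded with the PQ transfer function (SMPTE ST 2084). A minimal Python sketch of the PQ encode/decode pair, using the constants from the published ST 2084 formula (this is just the math, not tied to any Wayland or Metal API):

```python
# SMPTE ST 2084 (PQ) constants, as published in the spec.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(luminance: float) -> float:
    """Map absolute luminance (cd/m^2, up to the 10,000 cd/m^2 PQ peak)
    to a PQ signal value in [0, 1]."""
    y = max(luminance, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1 + C3 * ym)) ** M2

def pq_decode(signal: float) -> float:
    """Inverse: map a PQ signal value in [0, 1] back to absolute luminance."""
    em = signal ** (1 / M2)
    y = max(em - C1, 0.0) / (C2 - C3 * em)
    return 10000.0 * y ** (1 / M1)
```

A quick sanity check: `pq_encode(10000.0)` comes out to exactly 1.0, and encode followed by decode round-trips within floating-point noise.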

I think that a Wayland protocol should at least provide options 1 and 3, with the caveat that option 3 should only be available when the HW is capable of it (e.g. when it provides the FreeSync 2 HDR display modes). Option 2 would be nice to have, but it can also be provided by toolkits and/or a Vulkan extension, in which case it can be mapped to either option 1 or option 3 depending on HW capabilities (and maybe user choice). Above all else, it seems clear it needs to be tied into any color management protocol (either as a version 2 or as a protocol that builds on top of the core color management protocol).
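The fallback idea above (a toolkit mapping option 2 onto option 1 or 3 depending on hardware capability) could be sketched roughly like this; the capability flags and policy here are invented for illustration, not part of any real protocol:

```python
from enum import Enum, auto

class HdrPath(Enum):
    MASTERED = auto()     # option 1: submit pre-mastered rec2020-pq/hlg content
    OWN_TONEMAP = auto()  # option 3: client does its own tone mapping

def resolve_system_tonemap(hw_has_native_hdr_mode: bool,
                           prefer_low_latency: bool) -> HdrPath:
    """Hypothetical toolkit policy for implementing option 2 (system tone
    mapping) when only options 1 and 3 exist at the protocol level:
    if the hardware exposes a direct HDR display mode (something
    FreeSync-2-like) and latency matters, tone-map in the toolkit and
    submit via option 3; otherwise master to rec2020-PQ (option 1)."""
    if hw_has_native_hdr_mode and prefer_low_latency:
        return HdrPath.OWN_TONEMAP
    return HdrPath.MASTERED
```

The point is only that option 2 need not exist in the protocol itself; the decision can live a layer above.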

I think you need to contact Uma Shankar <uma.shankar-AT-intel.com> from Intel to get more information.

Glancing over the above commit, it’s clear that what is meant by “HDR support”, somewhat vaguely, is the ability to pass HDR10 metadata (infoframe) to a connected HDR10-capable display, so that it may switch to its HDR10 mode.
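Concretely, that HDR10 static metadata is a small set of mastering-display fields in fixed integer units (CTA-861 style: chromaticity in units of 0.00002, minimum luminance in units of 0.0001 cd/m²). A rough sketch of the scaling, with the field names loosely following the kernel's infoframe struct (check `drm_mode.h` for the real layout; this is an assumption, not a verbatim copy):

```python
def hdr10_infoframe_fields(primaries, white, max_dml, min_dml,
                           max_cll, max_fall):
    """Scale floating-point mastering metadata into the integer units
    used by the HDR10 static metadata infoframe (CTA-861 conventions):
    chromaticities in 0.00002 units, min mastering luminance in
    0.0001 cd/m^2 units, the rest in whole cd/m^2."""
    to_chroma = lambda v: round(v / 0.00002)
    return {
        "eotf": 2,  # SMPTE ST 2084 (PQ)
        "primaries": [(to_chroma(x), to_chroma(y)) for x, y in primaries],
        "white_point": (to_chroma(white[0]), to_chroma(white[1])),
        "max_display_mastering_luminance": round(max_dml),
        "min_display_mastering_luminance": round(min_dml / 0.0001),
        "max_cll": round(max_cll),
        "max_fall": round(max_fall),
    }
```

So for a typical "1000-nit Rec.2020 mastering display" the Rec.2020 red primary (0.708, 0.292) becomes (35400, 14600), and a 0.005 cd/m² black floor becomes 50.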

So pretty basic then. Well, it is a start; hopefully they implemented it in a future-proof way, since although currently most screens that support HDR are consumer focused, I do expect the productivity-focused screens to begin dropping in price as well.[1] And although those screens are not totally useless with only HDR10 support, it would be much better if they could be driven directly.

Now of course this is with my rather limited understanding of HDR10 infoframes; maybe those are already flexible enough to do the above?


[1] For example, Apple’s new productivity-focused HDR screen seems expensive at 5k, but as a screen that can be used as a reference display it is competing with screens that go for 10k to 20k. Non-reference displays for productivity work should become cheaper as well.

Note that I was only commenting on the kernel commit, the Apple implementation looks to go beyond that (mixing of “SDR” and HDR on supported displays), it will be interesting to see how sophisticated their approach really is.

If I am right, the mixing happens on the macOS side of things (inside the built-in color manager) and the display will “always” be in HDR mode. I suspect this because of how the EDR value behaves, as described here for normal tone mapping and here for some notes on reference displays, and because it is required to set (some of) the color management bits. Now of course this is only the software side of things (and even then only the parts visible to application developers); how exactly it is implemented is currently unknown.

Yep, that much seems to be clear. It will be interesting to see how they handle an HDR10 display’s own (potentially, and likely undefeatable) tone mapping in HDR10 mode. Maybe they’ll rely on the display being intelligent enough not to do any of its own tone mapping if the HDR metadata doesn’t exceed the display’s peak luminance capabilities, but who knows.

Probably has to, although they may (have to?) limit it to displays that can toggle HDR (or rather, high luminance) on/off on a per-pixel basis?

My suspicion is a special HDR mode that isn’t HDR10, similar to what AMD seems to be doing with FreeSync 2 HDR; it might in fact be the same technology! Apple is known to only support AMD officially, so this wouldn’t be surprising, especially since the documentation mentions that one reason for doing your own tone mapping is latency, which is the same reason AMD gives for introducing FreeSync 2 HDR. Of course your option is also valid, although it would probably only be implemented for screens that don’t support the special display modes (since those screens still have the latency issue).

Quite likely; that is why I put ‘always’ in quotes, since there probably is some smart thing going on in the background.


Just FYI, I have a working prototype of the color management in weston, and the plan is to evaluate DRM leasing for calibration/profiling. But that also requires implementing a WIP wayland leasing protocol and some more protocol design around how to handle input for leased-out desktop outputs.


This one is interesting. I thought about making the user’s brightness setting just change the tone curve instead of dimming the backlight. It has the drawback that the power draw will stay the same. Their solution seems to be a hybrid where, under a certain threshold, the backlight is dimmed.
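A toy sketch of one possible hybrid scheme like that (the threshold and the exact split between tone curve and backlight are invented here; the real behavior isn’t documented):

```python
def apply_brightness(user_level: float, threshold: float = 0.25):
    """Hypothetical hybrid brightness: above `threshold` keep the
    backlight at full power and scale the tone curve; below it, pin
    the tone-curve scale at the threshold and dim the backlight for
    the remaining reduction. Returns (backlight, tone_scale) in [0, 1]."""
    user_level = min(max(user_level, 0.0), 1.0)
    if user_level >= threshold:
        return 1.0, user_level
    return user_level / threshold, threshold
```

So at 50% brightness only the tone curve moves, while at 12.5% the tone curve is pinned and the backlight drops to half, which is where the power saving comes back.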

That is good news. I do have some questions regarding using DRM leasing for calibration/profiling:

  1. We need to be able to lease the primary screen even on a single-screen system, and on a multi-screen system it would be preferable not to treat it as disconnecting the monitor (as current experiments in compositors do), at least in this case; in other cases it might be preferable. I think this is possible, but I don’t think the current lease protocol allows for this possibility.
  2. Are all the DRM/DRI/KMS interfaces the same with regard to setting the calibration/gamma curve (we don’t want to have to implement a different way of doing things for every graphics card), and can we take over the screen without needing to redo modesetting?
  3. This would eliminate the compositor from the drawing path completely, and although I don’t think it would be a huge problem, I do worry that if a compositor decides not to use the gamma/calibration curves but, for example, a shader, this might change the output, which would invalidate the profile. I do think this is unlikely, but until further testing it can’t be completely ruled out.
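On point 2, loading a calibration curve through KMS boils down to sampling it into a fixed-size table of 16-bit entries (the shape the DRM gamma/LUT properties commonly take); a minimal sketch, where `curve` stands in for per-channel vcgt-style calibration data (the entry count and bit depth here are illustrative, not guaranteed by any particular driver):

```python
def build_gamma_lut(curve, size=256):
    """Sample a calibration curve (a function [0, 1] -> [0, 1]) into a
    table of 16-bit integer entries, clamping to the representable range."""
    return [min(65535, max(0, round(curve(i / (size - 1)) * 65535)))
            for i in range(size)]
```

For the identity curve this produces entries `i * 257` (0 through 65535), which is a handy check that a driver’s LUT path is wired up correctly.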

Either method would work, I think; to a certain extent this could be implementation specific.

Not sure if I understand you. The video doesn’t show desktop outputs getting leased. Sway doesn’t support that at all right now.

What the compositor does when the DRM resource is leased out is up to the compositor. It could act like the output got disconnected, or pretend that it’s still there but you can’t see it temporarily.

The interfaces are common between different hardware, yes.

I think so, yes. You can specify in the atomic commit if you want to allow modesetting to happen or not (but I also don’t see why this would be a problem anyway).

If it applies the calibration curve incorrectly then it’s broken. If it applies it with too little precision, it’s broken. If it applies it with a higher precision nothing should change.
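Whether a given precision is “enough” can be checked numerically. A rough self-contained sketch (not tied to any particular hardware) measuring the worst-case error from pushing a gamma-like curve through a nearest-entry LUT of a given size:

```python
def lut_max_error(curve, entries, samples=4096):
    """Max absolute error from applying `curve` via a LUT with `entries`
    nearest-entry slots instead of evaluating the curve directly."""
    lut = [curve(i / (entries - 1)) for i in range(entries)]
    worst = 0.0
    for s in range(samples):
        x = s / (samples - 1)
        idx = round(x * (entries - 1))
        worst = max(worst, abs(lut[idx] - curve(x)))
    return worst
```

For a 1/2.2 power curve the error is dominated by the steep region near black, and it shrinks as the LUT grows, which is the intuition behind “higher precision changes nothing, lower precision breaks it”.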

Obviously you’re right, we have to verify that everything works as we expect it to work and that’s why I want to do this.

Second paragraph under the “wlroots & sway implementation” header

So what would normally happen to a compositor if we disconnect the last screen? In the above scenario that could happen on single-screen setups that want to calibrate/profile; currently I suspect that many a compositor would crash or exit. So it might be acceptable for setups with two or more displays (not sure what the best user experience would be: disconnecting will move all applications to the other screen(s), but on the other hand it probably won’t restore everything when the screen comes back), but it probably won’t be acceptable for single-display setups. So in the single-display case the protocol will need to specify what happens (even if you disconnect, you want everything else to keep running, so you need some backbuffer/virtual output in that case anyway).

Does that make more sense?

Okay, that is good! I had heard that not everything is the same with regard to different HW, although I think that mostly has to do with the 3D/compute engines.

It is a bit more of a nice-to-have; that way a calibration/profiling tool can just focus on the calibration/profiling without also needing to modeset.

You are right; we need to keep an eye on this, but if we can make it work it would be acceptable (I think).

Mh, for our use case that’s not the best behavior at all.

The resource is only gone temporarily, you can just stop presenting to it but otherwise pretend it’s still there. It would seem very much like fullscreen.

Maybe. Not sure yet.

Yeah, I got it.

Well, the hardware itself can and will be different; the interface to the hardware will be the same. Hopefully all hardware uses enough precision, but it’s one more reason why we should have access to all the hardware: to verify that the per-plane and the pipe color pipelines don’t screw things up.

Btw, thanks for writing the mail to the wayland-devel list. It reminded me that I have to get back on that.
