Calibration in Windows 11 HDR mode using DisplayCal/Argyll

@gwgill @fhoech I’ve been using DisplayCal to create monitor profiles for Windows 11 in HDR mode on my OLED laptop. The issue I’m having is that the video card LUT that’s created, while targeting sRGB at D6500, comes out at a CCT of 7000+ K when verifying the profile and when reporting on the calibrated display.

I’ve worked around this by reducing the difference between each default LUT value and the calibrated LUT value by a factor of 4. So if the default LUT value for red at index 255 is 65535, and the calibrated value is 65531, the “corrected” value would be 65534; and so on for each LUT table index. I then incorporate this corrected LUT into a new profile.
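
Here’s a rough sketch of that scaling step in Python, assuming each channel is a 256-entry table of 16-bit values as described above (how the table is pulled out of and written back into the profile isn’t shown):

```python
# Sketch of the "reduce the correction by a factor of 4" workaround.
# Each channel is assumed to be a list of 256 16-bit values (0..65535);
# extraction from and reinsertion into the profile is not shown.

def soften_lut(default_lut, calibrated_lut, factor=4):
    """Move each calibrated entry back toward the default by `factor`."""
    corrected = []
    for default_val, cal_val in zip(default_lut, calibrated_lut):
        diff = cal_val - default_val                          # e.g. 65531 - 65535 = -4
        corrected.append(default_val + round(diff / factor))  # 65535 + (-1) = 65534
    return corrected

# Example: a linear default ramp for one channel.
default_red = [round(i * 65535 / 255) for i in range(256)]
# calibrated_red = ...  # taken from the calibration's LUT table
# corrected_red = soften_lut(default_red, calibrated_red)
```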

While this works reasonably well, it is a clunky workaround and results in lower-quality calibrations. I was wondering if you guys have any thoughts on what’s happening? During refinement passes, does Argyll actually measure the results of applying the LUT values, so it knows how much to increase or decrease them by?

I’m not sure if this is a bug in W11 HDR mode, or if it’s an artifact of how HDR works. My theory goes something like, “the LUT table is interpreted in HDR space, not in SDR space”, but I’m not really sure…

Found this but haven’t had time to read it yet: How to Calibrate HDR on windows 10?, PG35VQ | DisplayCAL

So I should clarify that I’m calibrating for SDR output in W11 HDR mode. In W11 HDR mode, any output from non-HDR-aware apps is mapped to SDR. HDR-aware apps like Chrome can display windowed HDR content alongside SDR windows, for instance HDR video in a YouTube window. It’s pretty cool, actually.

I want to leave my laptop in HDR mode, but the white point is too red and calibration is in order. However, I’m running into the issue described above, where the resulting LUT produces an exaggerated correction.

Sounds like calibration and profiling is happening in non-HDR mode, and then the calibration LUTs are being applied in HDR mode. This type of problem is pretty typical when such special modes are introduced, rather than it being able to slot into an existing mechanism. (Apple have apparently gone even further in the way they’ve bodged their HDR mode, effectively disabling all the existing calibration and profiling abilities in HDR mode in the process.)

Best bet is to figure out if this is indeed what’s happening, and then see if there’s any way of forcing HDR mode during calibration.

Argyll’s measurements always occur through the per-channel LUTs - there’s no way of turning that hardware off, but of course it sets the LUTs to unity while determining which sequence of LUT values results in the desired white and brightness. (Actually it’s more subtle than that, since typically the LUTs have a higher-resolution output than input.)
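
To make that concrete, here’s a toy closed loop in Python. It is not Argyll’s actual code (the display model and adjustment rule are invented), but it shows why measuring through the currently loaded LUT folds any downstream transform into the feedback:

```python
# Toy example of a closed calibration loop. The "display" below is a fake
# model in which lowering the red LUT gain raises the measured CCT; the
# point is only that every measurement is taken through the loaded LUT,
# so whatever happens downstream of the LUT is part of the feedback.

def measure_white_cct(red_gain):
    """Fake instrument reading: CCT of white as a function of red LUT gain."""
    return 6000.0 + (1.0 - red_gain) * 20000.0

def calibrate_white(target_cct=6500.0, tolerance=10.0, max_iters=100):
    red_gain = 1.0                             # top-of-ramp red entry, normalised
    for _ in range(max_iters):
        measured = measure_white_cct(red_gain) # measured *through* the current LUT
        error = measured - target_cct
        if abs(error) < tolerance:
            break
        red_gain += error / 200000.0           # too blue -> raise red, too red -> lower it
    return red_gain, measured

print(calibrate_white())   # settles near the gain that yields ~6500 K on this model
```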

[ Without having a Windows 11 machine and HDR display available to me, I can’t do much more than speculate I’m afraid. ]

When Windows 11 HDR mode is enabled, there are three modes an app can run in:

  1. SDR mode. The app is unaware of HDR; its output is “clamped” to sRGB and is displayed as it would be on an sRGB monitor. Most apps (e.g. Firefox) run in this mode, and the Windows desktop and windowing system run in this mode.
  2. HDR mode. The app is aware of HDR and can “see” the wide gamut display, and can display wide gamut and HDR content. If wide gamut content from the app has an ICC source profile, it is mapped (by Windows) to the display using EDID info from the monitor. Without an ICC profile, sRGB is assumed and mapped to the display. Chrome runs in HDR mode.
  3. Legacy display ICC color management mode. The app can “see” the wide gamut display, and can map colors from its content to the display profile using ICC color management. This mode is for apps like Photoshop that want to do their own color management. The monitor profile is auto-created by Windows from monitor EDID info and passed to the app when it asks Windows for the current profile [1]. This mode is set on the properties tab of the executable that wants to run in it.
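
For reference, here’s a rough sketch of one way an app can ask Windows for the current display profile, via the classic GDI GetICMProfileW call. I’m not certain this is the exact path HDR-aware color management uses (newer WCS APIs exist too); it’s just the simplest route, Windows-only:

```python
# Ask Windows which ICC profile is currently associated with the primary
# display. Whether HDR mode hands back the auto-created profile through
# this exact API is an assumption; this is just the classic GDI route.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = wintypes.HDC
gdi32.GetICMProfileW.argtypes = [wintypes.HDC,
                                 ctypes.POINTER(wintypes.DWORD),
                                 wintypes.LPWSTR]

hdc = user32.GetDC(None)                       # device context for the screen
size = wintypes.DWORD(wintypes.MAX_PATH)
path = ctypes.create_unicode_buffer(size.value)
if gdi32.GetICMProfileW(hdc, ctypes.byref(size), path):
    print(path.value)                          # e.g. ...\spool\drivers\color\<profile>.icm
user32.ReleaseDC(None, hdc)
```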

Under Windows 11 HDR mode, I’ve done DisplayCal/Argyll calibration in “SDR mode” (which Argyll runs in by default) and in “Legacy display ICC color management mode” (for this I set the “legacy ICC” toggle on dispread.exe, dispcal.exe, and dispwin.exe). In SDR mode, a profile with sRGB gamut is created. In Legacy ICC mode, a wide gamut profile is created. Just as expected.

In both cases the LUT produced is very similar to what I get when calibrating with Windows in non-HDR mode. However, when applying the LUT with Windows in HDR mode, the CCT is way off D6500. What seems to be happening is that the LUT created by Argyll doesn’t reflect its true effect on the display in Windows HDR mode. This is why I asked if Argyll reads back actual measured values while applying the LUT during refinement passes, or if it is simply assuming that the LUT is functioning as intended and doing calculations. Sorry if that’s a dumb question.

I have a hunch that calibrating while the screen is at max brightness might help, since in Windows HDR mode the max brightness is available for HDR highlights despite the user’s nominal setting of it. I’ll give that a try and report back.

[1] Currently Windows 11 has a bug where the auto-created monitor profile has a bad red Z tristimulus value, causing images to look yellow. This is supposed to be fixed in “late January”. The bad profile can be replaced just by copying a “good” profile over the one Windows created in the color profile directory.
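
If you want to check whether the profile you’re being handed is affected, the red colorant can be read straight from the rXYZ tag of the ICC file (profiles live in C:\Windows\System32\spool\drivers\color). A quick sketch; the file name is a placeholder for whatever Windows auto-created for your display:

```python
# Read the rXYZ (red colorant) tag out of an ICC profile to sanity-check
# the red Z tristimulus value. The profile path below is a placeholder.
import struct

def red_colorant_xyz(profile_path):
    with open(profile_path, "rb") as f:
        data = f.read()
    tag_count = struct.unpack_from(">I", data, 128)[0]   # tag table follows the 128-byte header
    for i in range(tag_count):
        sig, offset, _size = struct.unpack_from(">4sII", data, 132 + 12 * i)
        if sig == b"rXYZ":
            # XYZType tag: 4-byte 'XYZ ' signature, 4 reserved bytes, 3 x s15Fixed16
            x, y, z = struct.unpack_from(">3i", data, offset + 8)
            return x / 65536.0, y / 65536.0, z / 65536.0
    raise ValueError("no rXYZ tag found")

profile = r"C:\Windows\System32\spool\drivers\color\your_auto_created_profile.icm"
print(red_colorant_xyz(profile))   # for comparison, sRGB's red colorant is roughly (0.436, 0.222, 0.014)
```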

Nice topic to read.

I want to calibrate my monitor (not HDR), but I’m on Fedora 35. I’ve installed DisplayCal and hope to have an x-spider calibrator for a few days. I don’t know if this is going to work, but I’ll try. Any advice would be welcome.
Samsung UHD UE590D

My advice is to carefully read the whole page here before starting:

Beware, many HDR displays can’t display full brightness over the full screen, and a much smaller test window against black (or balanced screen total brightness) is required to get sane results.


I think the update from this week fixed this.

Thanks, yes it did 🙂 and it also fixed the bug where the system default profile was being returned to apps instead of the device default.

Well that did not work… even worse results in full brightness mode with black background:

@fhoech In Windows 11 HDR mode, DisplayCal Profile Loader doesn’t load the current HDR profile; it keeps loading the Windows SDR mode profile. In this screenshot, it should load the green-shaded profile, but only loads the yellow-shaded profile:

I don’t understand this. If the LUTs are being applied in an HDR color space, which is possible because it’s happening on the monitor hardware, which is in HDR mode, why doesn’t dispcal see during refinement that the calibration is off, and adjust the LUTs accordingly? It seems that feedback loop should work to adjust the LUTs while taking HDR mode into account, even if resolution would be reduced.

Sorry, pixls isn’t a good forum for me to respond to at the moment - once again they have changed the web interface so that it only works with the “latest and greatest”, unhelpfully suggesting that I should “upgrade my browser” (not possible - there are no newer browsers available for my system).