Ground truth in ICC transfer functions for display profiles (ATTN: Graeme)

I’m looking for implementation details on ICC profiles and how they adapt transfer functions. I suspect Graeme is best placed to answer this conclusively, but anyone else should feel free to chime in.

In particular, the question is how ICC-based transfer function adaptation is implemented. That is, when the source and destination transfer functions differ, how does the ICC specification reconcile them?

  1. Is the essence of the transform within an ICC-based CMS to take the encoded image buffer to display linear via an inversion of the transfer function in the input profile, and then roll the result through the output transfer function? That is, in an ICC CMS, is the ground truth essentially display linear, i.e. the inversion of the transfer function listed in the input profile applied to the encoded buffer?
  2. Given that default installations of Windows 10 and macOS have the canonized sRGB OETF present in the default display profile of their installations, including the DCI-P3 ICM on macOS (as of last inspection), would the only correct way to render the input image bitwise 1:1 at the display (the image, as outlined above, mastered on an idealized 2.2 display outside of an ICC-based CMS) be to tag / embed the image with an sRGB OETF version of the profile?
  3. Given a profiled display on either Windows 10 or macOS, where in 2018 the result on typical hardware will essentially be a pure 2.2 power function profile, would the sole method of receiving a bitwise-precise rendering of the image at the display be to tag the input image with a pure 2.2 power function as its transfer function? The sRGB OETF scenario above would result in the image being display linearized via the inverted OETF, then nonlinearly re-encoded through the 2.2 profile.

Hi,

Assuming the image was mastered on a native sRGB 2.2 transfer function display without ICC-based management.

Then I would assume that the image is tagged with a profile that reflects the creation display’s response.

Q1) Is the essence of the transform within an ICC based CMS to take the encoded image buffer to display linear via an inversion of the transfer function in I.profile, and then roll the results through the D.profile transfer function? I.e., in an ICC CMS, the ground truth is essentially the display linear inversion of the encoded input buffer?

This depends on the intent, but for Relative Colorimetric yes.
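For concreteness, here is a rough per-channel sketch of what that amounts to for simple matrix/TRC display profiles (my own illustration, not ArgyllCMS code; the matrix/PCS and chromatic adaptation steps are omitted since they cancel when source and destination share primaries and white point):

```python
def srgb_inverse_oetf(v):
    # sRGB decode (IEC 61966-2-1): code value -> linear light
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def power_22_encode(l):
    # Pure 2.2 power re-encode: linear light -> destination code value
    return l ** (1 / 2.2)

def relative_colorimetric_channel(v, decode_src, encode_dst):
    """Per-channel core of a matrix/TRC relative colorimetric transform:
    decode the source code value to linear, re-encode with the destination
    profile's curve. PCS/matrix steps omitted (identical primaries assumed)."""
    return encode_dst(decode_src(v))

# Image tagged with sRGB, display profile a pure 2.2 power: not a no-op.
print(relative_colorimetric_channel(0.05, srgb_inverse_oetf, power_22_encode))
```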

Q2) Given that default installations of Windows 10 and macOS have the canonized sRGB OETF present for the default display component of their installations, including the DCI-P3 ICM on macOS (as of last inspection), the only correct way to render a bitwise 1:1 with input image (as outlined above, mastered on an idealized 2.2 display outside of ICC based CMS) and display would be to tag / embed the image with an sRGB OETF version of the profile? I.e., input to display ends up a no-op.

Maybe. Ideally there should be a way for software to set the frame buffer contents without color management getting in the way (for calibration and profiling), and for Linux and MSWindows there is. For later versions of OS X, you have to use the “null profile trick” you outline above to be able to do this on all displays (i.e. primary and secondary displays). But if you are worried about getting native display behavior you also have to take care of the display per channel lookup tables (VideoLUTs) too.

[ And note that OETF and EOTF come loaded with other aspects - encoding and decoding may have different curves, to allow for viewing condition differences. ]

Q3) Given a *profiled* display on either Windows 10 or macOS, where the results in 2018 essentially will deliver a pure power function profile at 2.2 on typical hardware, the sole method to receive …

I’m not sure what you mean by that. On MSWin it’s up to the application to manage the color, the OS just provides the mechanisms. On OS X the OS manages the color, and you just get to set the source profile & intent. There is no pure power function in general as the display profile encodes whatever response the display has got - the most direct means of setting a color managed display color is as a device independent CIE/PCS etc. value, since this is the input to the display profile when driving the display.

… a bitwise precise version at the display would be to tag the input image with a pure 2.2 power function as transfer? I.e., the sRGB OETF scenario above would result in the image being display linearized via the inverted OETF, then nonlinearly adapted to the 2.2 profile.

See above. Either don’t apply color management (Linux, MSWindows), or use the null transform trick (OS X). For the latter the simplest thing is to retrieve the display’s current profile and tag the pixel values you are setting with it. (This is what the current ArgyllCMS code does.) You certainly wouldn’t be making assumptions about the characteristics of the display profile.
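To make the null transform trick concrete, here is a small sketch using Pillow’s ImageCms bindings to littleCMS. The profile path and image filename are hypothetical; on macOS you would query the OS for the currently installed display profile rather than hard-coding a path:

```python
from PIL import Image, ImageCms

# Hypothetical path: in practice, ask the OS which profile is installed
# for the target display.
display_profile = ImageCms.getOpenProfile("/path/to/current_display.icc")

img = Image.open("patches.png").convert("RGB")

# Source and destination are the same profile, so a relative colorimetric
# transform degenerates to a no-op and code values pass through untouched.
# (Any VideoLUT / calibration curves are still applied downstream.)
null_transform = ImageCms.buildTransform(
    display_profile, display_profile, "RGB", "RGB",
    renderingIntent=ImageCms.INTENT_RELATIVE_COLORIMETRIC)
out = ImageCms.applyTransform(img, null_transform)
```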

Q4) Given the above logic is loosely accurate, has the whole purpose of the linear toe to correct low level power function DAC inaccuracies been subverted via incorrect tagging or vendor side implementation?

Sorry, I’m not sure what you are talking about. A display’s response can be encoded in many different ways. Splitting it into per channel transfer curves and some multi-dimensional representation is common, because it’s efficient. For an accurate profile those curves are what they are - they depend on the display’s measured response.

Calibration curves (typically VideoLUT curves) serve a slightly different purpose. In the case of a color managed display they can serve the purpose of setting a white point (which ICC profiles don’t manage well due to the issues with display Absolute Colorimetric that have already been explored, as well as typical workflows being relative colorimetric), setting a brightness, setting a black point and/or improving the behavior of the display so that the profile can be more accurate without taking lots of measurements and needing a large complicated profile.
In the case of non-color managed applications, as well as setting a white point they set neutrality and transfer curves, allowing a measure of control over the color appearance of these non-managed applications.
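To illustrate the calibration-curve idea, here is a rough sketch with a made-up measured response (real calibration works from actual measurements, not an assumed power function):

```python
import numpy as np

def build_calibration_lut(measured_eotf, target_eotf, entries=256):
    """Build a per-channel VideoLUT-style calibration curve that makes a
    display whose measured response is `measured_eotf` behave like
    `target_eotf`. Both map normalized code value [0,1] to normalized
    luminance; returns `entries` corrected code values in [0,1]."""
    code = np.linspace(0.0, 1.0, entries)
    wanted = target_eotf(code)                 # luminance we want out
    dense = np.linspace(0.0, 1.0, 4096)        # dense sampling for inversion
    measured = measured_eotf(dense)
    # For each wanted luminance, find the code value the display needs.
    return np.interp(wanted, measured, dense)

# Hypothetical display measured at ~2.35 power, calibrated to a 2.2 target.
lut = build_calibration_lut(lambda v: v ** 2.35, lambda v: v ** 2.2)
```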

Cheers, Graeme Gill.


The problem is essentially the sRGB OETF and the main vendors choosing to tag their displays with it. That is, the profile describes the display not as the native 2.2 power function hardware, but with the sRGB OETF toe response. This is as sRGB’s OETF was designed, but it causes a nightmare in getting a consistent response at the ICC / ICM level between default and profiled states.

For the sake of clarity, this will ignore minor hardware deviations from the 2.2 power function, and assume the hardware behaves more or less consistently at the DAC level.

There are four design cases:

  1. Image designed on sRGB OETF default installation context. Image tagged with sRGB OETF. Encoded values 1:1.
  2. Image designed on sRGB OETF default installation context. Image tagged with 2.2 power transfer function. Encoded values not 1:1.
  3. Image designed on profiled context, native 2.2 hardware DAC response. Image tagged with 2.2 power transfer function. Encoded values 1:1.
  4. Image designed on profiled context, native 2.2 hardware DAC response. Image tagged with sRGB OETF. Encoded values not 1:1.

Depending on the state of the displaying system, the results vary quite a bit across the eight possible outcomes:

  1. System is default with sRGB OETF set for the display.
    1. Displays native 2.2, WYSIWYG during designing, sRGB’s OETF design negated.
    2. Displays with a compressed toe, not as seen in designing.
    3. Displays with a compressed toe, not as seen in designing.
    4. Displays with the “as designed sRGB OETF” lifted toe, but not as seen in designing.
  2. System is profiled to the native 2.2 power transfer function.
    1. Displays native 2.2, but not as seen in designing.
    2. Displays with a compressed toe, not as seen in designing.
    3. Displays native 2.2, WYSIWYG during designing.
    4. Displays with the “as designed sRGB OETF” lifted toe, but not as seen in designing.

I’m sure I’ve muddled some of the above combinations, which is itself an indication of how labyrinthine the combinatorial outcomes are. A small numeric sketch of the four tagging cases follows below.
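To put numbers on the four tagging cases, here is a rough per-channel sketch, assuming only the transfer functions differ (so the matrix steps cancel) and that the installed display profile is either the sRGB default or a measured pure 2.2 power function:

```python
# Pipeline per the above: decode with the image's tagged curve, re-encode
# with the display profile's curve (relative colorimetric, matrix/TRC).

def srgb_decode(v):
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(l):
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

curves = {
    "sRGB": (srgb_decode, srgb_encode),
    "2.2":  (lambda v: v ** 2.2, lambda l: l ** (1 / 2.2)),
}

v = 0.05  # a dark code value, where the toe difference is largest
for tag, (decode, _) in curves.items():
    for disp, (_, encode) in curves.items():
        out = encode(decode(v))
        note = "  (1:1)" if abs(out - v) < 1e-6 else ""
        print(f"tagged {tag:>4} -> display profile {disp:>4}: "
              f"{v:.3f} becomes {out:.3f}{note}")
```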

It’s not a color managed system if the display profile is set to some arbitrary default that doesn’t reflect the actual measured display response.

All bets are off if it’s not color managed, so I wouldn’t expect any consistency between a default non-color managed state, and a color managed state (i.e. arbitrary display profile installed vs. actual measured display profile installed.)

I’m not sure what you are talking about. The video system DAC is linear - there is no hardware 2.2 power anywhere. A CRT connected to it typically has a roughly 2.2 power response, but like any real world display, its detailed response is not a pure 2.2 power, and depends a lot on how its controls are set.
(More like a 2.3 to 2.4 power in practice anyway - that’s the TV system viewing condition adjustment between a bright studio with 2.2 power encoding, and a dim CRT viewing environment.)

This is the crux of it, with both Windows and macOS rallying around sRGB as a default EOTF.

There is a digital decoding function that takes the code values and converts them to display linear light. Hence the shorthand of DAC.

Reasonably sure that most higher end displays, such as post house Dreamcolors, are indeed quite close to a pure 2.2 power function on their decoder side.

That may be true, but it’s irrelevant in a closed loop workflow such as Dreamworks would use. They create and review in a fixed viewing environment, and the final result is tweaked by eye in a calibrated viewing situation by the Colorist and the Director.

In contrast, the Video broadcast & distribution chain assumes an encoding of 2.2 gamma in a bright environment, and a decoding of 2.4 gamma (BT.1886) in a dim viewing environment.
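As a rough worked example of that mismatch (approximating both ends as pure power functions and ignoring BT.1886’s black-level terms):

```python
# End-to-end ("system") gamma when encoding assumes ~1/2.2 and the display
# decodes at ~2.4: decode(encode(x)) = x ** (2.4 / 2.2) ~= x ** 1.09,
# a deliberate contrast boost for the dim viewing environment.
for x in (0.1, 0.5, 0.9):
    encoded = x ** (1 / 2.2)
    displayed = encoded ** 2.4
    print(f"scene {x:.1f} -> encoded {encoded:.3f} -> displayed {displayed:.3f}")
```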

I was specifically addressing the hardware level DAC that all of the workstations, not the grading suites, would be based on. Last time someone reached out to me regarding their Dreamcolors, they were native 2.2 at the DAC level. I believe the sRGB emulation modes on the PremierColors and others are also pure 2.2 power functions.

The ICC chain is a bit of a mess though, given what you confirmed about how the source and destination transfer functions are applied.

Am I correct in assuming that the intention of the sRGB OETF was to encode with the sRGB OETF and display on a native 2.2 display, to maximize use of the linear toe region? I don’t see any other combination that would seem to provide any benefit.

I was specifically addressing the hardware level DAC that all of the workstations, not the grading suites, would be based on. Last time someone reached out to me regarding their Dreamcolors, they were native 2.2 at the DAC level. I believe the sRGB emulation modes on the PremierColors and others are also pure 2.2 power functions.

Sorry, I don’t know what you are referring to. A DAC or D/A is a Digital to Analog Converter, necessary for a VGA connection and equivalent, but typically not used with modern displays, which use digital connections. With rare exceptions, Digital to Analog Converters strive to be completely linear.

Perhaps you mean the VideoLUT hardware (RAMDAC in old parlance, where such tables would be part of the D/A hardware) ?

I am reasonably sure that the signal code values are digital and that the voltage is analog? It would seem that several patents refer to DACs in the display, but perhaps they too are using incorrect terminology? (DACs implemented as ICs are covered by the general term here.)

Another reference here for LCD and OLED displays also refers to DACs, as does Liquid Crystal Display Drivers: Techniques and Circuits?

Is it that incorrect to use the term DAC, given the body of literature that uses it?

I am reasonably sure that the signal code values are digital and that the voltage is analog? It would seem that several patents refer to DACs in the display, but perhaps they too are using incorrect terminology? (DACs implemented as ICs are covered by the general term here.)

Right, but what is the relevance to color management ?

Modern displays will have all sorts of implementation details - they may use a combination of lookup tables, timing, spatial, voltage and/or current controls to implement the digital input to final color response. (Plasma displays used most of these.) Some modulate the backlight in concert with the LCD pixel values, etc.

But from a computer color management point of view I don’t care, as long as the display response is consistent, and not too weird. Most displays will roughly emulate a CRT/sRGB type response, for compatibility and signal encoding efficiency. So if I’m interested in accurate color, I calibrate & profile the display, and move on.

Sorry, I thought I was quite clear.

Within the DAC region of typical sRGB displays, there is a power function conversion that takes the encoded digital values to the analog voltage output to deliver linear display light. That power function is 2.2.

It is relevant to the question: if the goal of the sRGB OETF’s low linear toe region was to dig the code values out of the low level irregularities of circuits and whatnot, then that would only seem feasible if the image is mastered with the linear toe region sent as-is to the 2.2 display. As well, of course, as delivering a WYSIWYG loop for the pixel pusher.

That is, if the CM software masters under the sRGB OETF and the display is listed with the sRGB EOTF (inverted OETF), then the conversion ends up a no-op, defeating the purpose.

The only real method is, as you suggested, that the display should be tagged with its actual display characteristic, which should indeed be a pure 2.2 power function, not the sRGB OETF as with modern Windows 10 and macOS defaults.
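For reference, putting rough numbers on the toe difference between a pure 2.2 power decode and the sRGB piecewise decode (my own quick calculation):

```python
# Linear light produced by a pure 2.2 power decode vs. the sRGB piecewise
# decode (inverse OETF) for dark code values, where the linear toe dominates.
def srgb_decode(v):
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

for v in (0.01, 0.02, 0.04, 0.08):
    print(f"code {v:.2f}: 2.2 power -> {v ** 2.2:.5f}, "
          f"sRGB piecewise -> {srgb_decode(v):.5f}")
```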

Is this a fair estimation?

Within the DAC region of typical sRGB displays, there is a power function conversion that takes the encoded digital values to the analog voltage output to deliver linear display light. That power function is 2.2.

Why do you think that?

CRT displays had no power function circuitry at all - the power function is simply the natural behavior of the CRT tube. More modern technologies emulate a similar response, but the reality is that each type of display has its own native response. Few will be an exact 2.2 power function. (Yes, some high end displays will have selectable emulations of standard colorspaces, but that’s not their native response.)

Because that is literally every display I have ever encountered, as well as the higher end displays whose responses folks have forwarded to me. The DAC converts on a pure 2.2 curve, and even medium quality consumer displays are remarkably close to that.

Not a single display that I have encountered with an sRGB emulation mode appears to use the inverse of the sRGB OETF, but rather a pure 2.2 power function.

The question remains, emulated or otherwise, whether the default profiles included with Windows and macOS are somewhat self-defeating.