"Aurelien said : basecurve is bad"

Based on the part you quoted, perhaps. But do take into account the phrase just before the part you quoted:

darktable says otherwise (tooltips over the modules)? Iirc, filmic includes a log transform…
And of course, all three of those force their output to the range 0…1

2 Likes

I don’t think that the numbers that are used for display correspond to anything that can be understood as a linear representation.

@Entropy512 is correct here: both base curve and filmic output linear, display-referred tristimulus data.

Both scene-referred and display-referred data can be represented in linear terms with regard to emission strength, i.e. double the light, double the value. What really sets them apart is whether the data is bounded (finite) or unbounded (infinite).

So yes, the tooltip in the base-curve module seems to be incorrect, because its output is linear and bounded.
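To make that concrete, here is a toy sketch using a generic Reinhard-style compression as a stand-in; it is not darktable's actual base curve or filmic, just an illustration of "non-linear operation, linear bounded output":

```python
# Toy sketch, not darktable's actual base curve or filmic.
scene_linear = [0.02, 0.18, 1.0, 4.0, 16.0]   # unbounded, 1.0 = diffuse white

def compress(x):
    """Generic Reinhard-style compression, x / (1 + x).

    The *operation* is non-linear, but its *output* is still a linear
    encoding of display emission: 0.4 asks the display for twice the
    light that 0.2 does. It is also bounded to 0...1.
    """
    return x / (1.0 + x)

display_linear = [round(compress(x), 3) for x in scene_linear]
print(display_linear)   # [0.02, 0.153, 0.5, 0.8, 0.941] -- bounded, still linear light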

1 Like

I don’t get this. The base curve and filmic certainly don’t look like they’d obey the ‘double the light, double the value’ rule.

1 Like

That is correct: the operation is non-linear, but the representation of the output is linear.

I.e. double the value means double the emission from the display.
A counterexample: sRGB is a non-linear representation of the display emission, chosen to better utilize the limited 8-bit quantization. That data can, however, easily be linearized while still being in a display-referred space.
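For concreteness, a minimal sketch of that linearization using the standard sRGB EOTF; the data is display-referred on both sides, only the encoding changes:

```python
def srgb_decode(v):
    """Standard sRGB EOTF: encoded value in 0...1 -> linear display
    emission in 0...1 (including the linear segment near black)."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

# Non-linear 8-bit codes mapped back to linear emission fractions:
for code in (32, 64, 128, 192, 255):
    print(f"code {code:3d} -> {srgb_decode(code / 255):.4f} of max emission")
```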

A fun example of linear 8-bit display-referred data is addressable LED strips like the ws2812. They have horrible “emission resolution” in the shadows, yet you barely see a change in brightness across the top 25% of emission strength!
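A quick back-of-the-envelope sketch of why: when an 8-bit code drives emission linearly, the relative step between adjacent codes shrinks from huge near black to imperceptible near full brightness.

```python
# Relative emission change between adjacent codes on a linear 8-bit drive
# (no gamma table), as on a bare ws2812:
for code in (1, 2, 4, 8, 64, 192, 254):
    print(f"{code:3d} -> {code + 1:3d}: +{100 / code:.1f}% emission")

# 1 -> 2 doubles the light (a very visible jump), while every step in the
# top quarter (code 192 and up) changes emission by roughly 0.5% or less.
```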

1 Like

OK, thanks for the clarification. I thought, for some reason, that by light you meant input (captured by the camera) rather than output (emitted by the display) light.

1 Like

Yup. The history actually goes back further - other than that linear bit at the bottom end, intended to make some mathematical operations play nicer, the gamma was chosen to be a reasonably close match to the fundamentally nonlinear behavior of CRT displays.

Yup. Some WS2812 controllers will do temporal dithering to improve the effective bit depth. Also, linear vs nonlinear dimming is sometimes featured in reviews of dimmable LED bulbs; ones with linear dimming kinda suck.
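A sketch of the idea behind that kind of temporal dithering (first-order error feedback; actual WS2812 controller firmware will differ): the output hops between adjacent 8-bit levels so the time-averaged brightness matches a higher-precision target.

```python
def temporal_dither(target_12bit, frames):
    """First-order error-feedback ("sigma-delta" style) dither: emit one
    8-bit PWM level per frame so that the running average of the levels
    converges on target_12bit / 16."""
    acc = 0
    levels = []
    for _ in range(frames):
        acc += target_12bit        # accumulate in 12-bit units
        level = acc >> 4           # quantize 12-bit -> 8-bit
        acc -= level << 4          # carry the quantization error forward
        levels.append(level)
    return levels

out = temporal_dither(100, 64)     # 12-bit target 100 ~ 6.25 in 8-bit terms
print(sum(out) / len(out))         # 6.25: mostly 6s with an occasional 7
```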

Ages ago I implemented a sigma-delta modulator as a wrapper around Atmel’s software PWM. Their softPWM had massive performance impacts if you went past 8 bits on an AVR (since it was an 8-bit microcontroller), so the time-critical loops ran 8-bit PWM, and a less time-critical sigma-delta was wrapped around that, along with a lookup table so that the I2C traffic could use an 8-bit nonlinear representation: GitHub - Entropy512/I2C_RGB: I2C Controllable RGB LED driver in an Atmel ATTinyX5 (Edit: in fact, I2C_RGB/firmware/gammacurve_8to12.h at master · Entropy512/I2C_RGB · GitHub gives me an example of why sRGB had that linear bit at the bottom, since many input values mapped to 0/4095 or 1/4095.)
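I haven't checked the exact curve in that header, but the effect is easy to reproduce: with a pure power-law 8-bit to 12-bit table (a 2.2 exponent is assumed here, not necessarily what the repo uses), a whole run of low input codes collapses onto 0 or 1, which is exactly what sRGB's linear segment at the bottom avoids.

```python
# Pure power-law 8-bit -> 12-bit gamma table; 2.2 is an assumed exponent,
# not necessarily the one used in gammacurve_8to12.h:
table = [round(4095 * (i / 255) ** 2.2) for i in range(256)]

print([i for i in range(1, 256) if table[i] <= 1])   # non-zero inputs landing on 0 or 1
print(table[:16])                                    # badly posterized near black
```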

I abandoned that project when much more cost-effective off-the-shelf solutions (such as the WS2812) showed up, even though the project did have some technical advantages. Atmel’s USI had some reliability issues - it turns out that their reference I2C implementation violated the I2C standard: it actively drove the bus high, which would lead to a round of Bus Fighter if the master tried to clock stretch. :frowning:

1 Like

Hmm… funny.

From https://pixls.us/articles/darktable-3-rgb-or-lab-which-modules-help/:

In reference to the scene, yes. In reference to the display - no - it’s linear. Nonlinearities in representation/encoding are not performed until the “Output Color Profile” module at the very end.

OK, I get it.
Makes the term “linear” more than a bit of a moving target :confused:

1 Like

A lot of terms can wind up being a moving target due to corner cases.

For example, the HLG standard claims to be “scene-referred” - but is bounded! Due to that bounding, I consider it closer to display-referred, despite the standard’s claims otherwise.

Well, mathematically, it’s definitely related to a linear relationship between the magnitudes and the phenomenon they represent. Regarding light, twice the energy = 2x the number, no? That’s why the preceding adjective, e.g. “scene-linear”, is important.

1 Like

I couldn’t agree more. Unfortunately, there was a silent switch from scene-referred linearity to screen-referred linearity.

Guess I’ll leave this discussion as well, not interested in shifting definitions.

2 Likes

I’m probably guilty of that as well in some posts, assuming “linear” is solely related to the light at the scene. I am now sensitized to that, thanks…

I think that these discussions would benefit from a definition of “linear” — even if we disagree, it would help clarify the differences.

I consider a representation linear if I would find it meaningful to perform linear operations on it (e.g. $a + b$, $\alpha \cdot a$, $A a$ for pixels $a, b$, scalar $\alpha$, matrix $A$). This is not something I would want to do with the output of filmic rgb or similar (artifacts), so I don’t find it helpful to think of those as linear.

My understanding is this: in a linear representation, if a value $x$ represents an amount of light $L$, then $Nx$ represents an amount of light $NL$. Here, both $N$ and $x$ are reals larger than 0.
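As a quick check that this rules out gamma-encoded values (ignoring sRGB's linear segment near black, and using the same $x$ and $L$ as above): with a power-law encoding, doubling the light does not double the number,

$$x = L^{1/2.4} \;\Longrightarrow\; (2L)^{1/2.4} = 2^{1/2.4}\,L^{1/2.4} \approx 1.33\,x \;\neq\; 2x.$$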

1 Like

Yup, and that can be either scene light, OR display light.

Yes, in some cases you have an intermediate representation that is altered further, but here the question, in the case of display-linear, is:
If you took the data as-is and sent it to a display that accepted linear data, or assumed it was linear and applied an inverse EOTF for a particular display that wanted nonlinear data, would it be at least some semblance of correct, or would it be significantly altered because its meaning in terms of display light was misinterpreted? (See, for example, what happens if you misinterpret linear data as sRGB - this leads to the misconception that “linear looks dark”, when it only looks dark if you mistakenly assume it has a gamma curve applied.)
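A small numeric sketch of that misinterpretation, using the idealized sRGB transfer functions (plain Python, nothing darktable-specific):

```python
def srgb_decode(v):
    """sRGB EOTF: encoded value -> linear display emission."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def srgb_encode(x):
    """Inverse sRGB EOTF: linear emission -> encoded value."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

linear_grey = 0.18                              # middle grey as linear emission

# Correct handling: encode before sending to an sRGB display; the display
# decodes it back to 18% emission.
print(srgb_decode(srgb_encode(linear_grey)))    # ~0.18, round-trips fine

# Misinterpretation: send the linear value as if it were already sRGB-encoded;
# the display decodes 0.18 down to ~0.027 of maximum emission. That is the
# "linear looks dark" effect.
print(srgb_decode(linear_grey))                 # ~0.027
```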

My understanding is that a data representation / display mismatch will result in incorrect distribution of light intensities, including clipping. I’m not sure how strong the effect would be. I guess you could use Glenn Butcher’s raw processor to test this, since it allows disabling the display-mapping transform.

1 Like

This all goes to something troy_s pointed out to me back in the “wild days”: general-purpose LCD displays really are linear at the hardware level, and just have what amounts to “CRT emulation” in their circuitry to stay compatible with that legacy. So, with LCD display profiles, we end up playing a game of “tone badminton”, taking scene-linear data back and forth through these so-called “gamma transforms” only to end up in the same linear domain on the hardware.

Given all that hoo-ha, to me the only meaningful use of the term “linear” is with regard to the original scene. Display engineers will probably beg to differ, as they do have to consider the linearity of their energy-producers, but for the imaging pipeline I think we need to focus on the impact all our little tools have on those original measurements in the camera…

1 Like

It most definitely will. If necessary I’ll provide an example tonight, but easy ways to do this:
Export a TIFF with linear data, then use exiftool to remove the ICC profile, or view it with a non-color-managed application such as ImageMagick’s “display” tool.

Alternatively, if you export something with an sRGB transfer function as the encoding, and load it in something that misinterprets it as linear, it will look bright and washed out. This is much rarer, but I have seen some applications assume “floating point TIFF = linear”, ignoring the ICC profile.
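If you want a quick way to generate such test files yourself, here is a rough sketch; it assumes numpy and imageio are installed, and it writes the same gradient once as linear data and once sRGB-encoded, with no embedded ICC profile, so a non-color-managed viewer just shows the raw numbers:

```python
import numpy as np
import imageio.v3 as iio   # assumes imageio with TIFF support is installed

# A horizontal gradient in linear light, 0...1.
img = np.tile(np.linspace(0.0, 1.0, 1024, dtype=np.float32), (128, 1))

def srgb_encode(x):
    return np.where(x <= 0.0031308, 12.92 * x,
                    1.055 * np.power(x, 1 / 2.4) - 0.055)

# 16-bit TIFFs with no ICC profile embedded.
iio.imwrite("ramp_linear.tif", (img * 65535).astype(np.uint16))
iio.imwrite("ramp_srgb.tif", (srgb_encode(img) * 65535).astype(np.uint16))

# A viewer that assumes sRGB encoding shows ramp_linear.tif too dark in the
# midtones; a viewer that assumes linear data shows ramp_srgb.tif bright and
# washed out.
```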