Filmic: Differences in curves (3.2.1 vs 3.3.0/dev) and out-of-the-box curve in 3.3.0/dev.

Switching to “soft” makes the curve worse: the overshoot of the spline gets much stronger.

yes, that is my point. By selecting “hard”, you place a stiffer constraint on the spline, which will make this sort of overshoot much less likely.

Nope, they are at their default setting.

In 3.2.1 I sometimes change one or both to soft to get the result I’m after, but in 3.3.0 using soft isn’t at all workable.

I get the impression that there isn’t anything wrong/different with the underlying filmic engine itself, but that all this is a visualization problem. It is an important difference compared to 3.2.1 though; I, and probably a lot of others, like to have some anchor/reference points I can count on when editing, and having these change between versions is rather annoying.

So, what drives the shape of the spline is the hard/soft constraints at either end, the contrast and the latitude on the look tab, and the “hardness”, aka gamma power.

I’m running 3.3.0+1046, and by default I get a contrast of 1.350, a latitude of 25%, and an auto-calculated hardness, which seems to give good results. If you are using presets from another version, maybe they are pushing these numbers too far?

@Matt_Maguire: Please have a look at the xmps I provided: all the settings for 3.2.1 and 3.3.0 are the same and well within reasonable bounds, nothing “extreme”. The only three things I changed from their default settings are:

  1. white relative exposure (+4.13)
  2. black relative exposure (-9.12)
  3. latitude (33%)

That’s it.

I opened your image with the 3.3.0 xmp on my 3.3.0+1046 version of darktable. I can see you have contrast 1.5, latitude 33%, auto-hardness enabled, hard constraints on spline for both shadows and highlights, and the spline looks fine to me. No sign of any overshoot.
(screenshots: filmic1, filmic2, filmic3)

I’m actually using 3.3.0+1006~ge31529f63. Even just enabling filmic rgb or resetting the module (using the default settings, no presets) shows a curve with a (small) orange region for the blacks. Any tweak of the “relative white exposure” towards lower values makes it worse.
Like @Jade_NL, I’m also asking myself whether the visualization should just be ignored at this point. In other words: is the spline shown in the graph only calculated for visualization, or is it used for the mapping process itself?
What brings me to that question is the observation that switching from “hard” to “soft” changes the curve markedly, yet no modification is visible in the histogram.
Or do we just have to update to 3.3.0+1046?

The spline curve is used to actually map the tones. If you have orange bits, it means you’ll get some negative gradients in luminance in your image – how objectionable this is depends on the image. Histograms are a fairly blunt tool, and locked to 0-100%, so it is not surprising that it would be hard to notice any appreciable difference in the histogram display with these overshoots.
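To make that “negative gradient” idea concrete: the orange regions mark inputs where the mapped output decreases while the input increases, i.e. the tone curve is locally non-monotonic there. Here is a minimal, purely illustrative sketch (a toy curve of my own, not filmic’s actual spline) of how such a region shows up numerically:

```python
import numpy as np

# Purely illustrative: a toy tone curve with a wiggle strong enough to
# create a locally decreasing region -- not filmic's actual spline.
x = np.linspace(0.0, 1.0, 256)            # normalised scene-referred input
y = x + 0.2 * np.sin(2.0 * np.pi * x)     # toy "tone curve" with an overshoot

dy = np.diff(y)                           # discrete gradient of the mapping
bad = dy < 0                              # True where output falls as input rises

print(f"{bad.sum()} of {dy.size} segments have a negative gradient")
if bad.any():
    lo, hi = x[:-1][bad].min(), x[1:][bad].max()
    print(f"non-monotonic roughly between input {lo:.2f} and {hi:.2f}")
```

Pixels whose values fall inside such a region come out with their relative brightness inverted, which is why how objectionable it is depends on how much of the image actually sits there.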

Thank you for the clarification.

… should be avoided in tone mappings, as far as I have learned. But maybe I’ve overestimated the visualization in filmic on this point in the past. It’s a question of finding a balance between using technical tools and trusting personal impressions when looking at the image.

I don’t know, it’s just very strange that with the same xmp file we are seeing different results in the filmic look curve. I didn’t see any relevant commits to filmicrgb between the darktable version you are using and the one I am using, so I’m really not sure why you are seeing these orange clipping indicators at all…

Really strange. I downloaded @Jade_NL’s image and xmp files, and the look curves I get are similar to yours: no orange regions, different from @Jade_NL’s screenshots. Possibly he is using an older version?
But independent of that: just applying filmic with its defaults leads to a (small) “warning” in the look curve, and any decrease of “relative white exposure” enlarges this region.

(screenshot: filmic)

I observe the same. Increasing the black relative exposure a little eliminates the shadows overshoot. I’m guessing that this could be due to the auto-hardness calculation.

Nope. One of the things I do every morning is pulling and rebuilding dt dev. So I’m on the very latest version (this is darktable 3.3.0+1046~g881097ceb)

I don’t find it very reassuring that we are all seeing different results.

Here’s a little video I made.
dt.3.3.0.filmic.mkv (13.4 MB)

This is dt 3.3.0.

The first cat shows the little orange warning. We seem to agree on that being there.

Both the second and third examples (see below) have the same starting point as the first cat: a clean sheet.

The second cat is with the dt 3.2.1 xmp applied.
The third is the cat after manually applying the settings (1.46, 4.13, 1.5 etc) to both exposure and filmic.

The images speak for themselves…

Filmic’s default settings for darktable master have changed a bit in the display tab: the black level used to be 0% and is now set to 0.015%.

The reason behind this is that sRGB output coded on 8-bit integers can only display about 12.7 EV of dynamic range. So, anything below -12.7 EV at the output of filmic will end up rounded to 0 (because of the float → int conversion) at the output of the pipe, which means an infinite range of low EVs could be rounded to 0, resulting in a loss of gradients, about which many users have complained.
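As a rough sanity check of that 12.7 EV figure (my own back-of-the-envelope arithmetic, using the clipping formula given further down): the smallest 8-bit sRGB code value that does not round back to 0 corresponds to a linear luminance of about EOTF\left(\frac{0.5}{255}\right) = \frac{0.5}{255 \times 12.92} \approx 1.5 \times 10^{-4}, and \log_2(1.5 \times 10^{-4}) \approx -12.7. That is roughly 12.7 EV below pure white, and it lines up with the -12.69 EV figure mentioned further down in the thread.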

On the other hand, that higher black level makes it easier for the curve to undershoot. Worst case, just set the display black luminance back to 0%. Otherwise, try to reduce the contrast, change the balance, or adjust the display black level.

Here are other black clipping values:

  • sRGB (sRGB OETF) coded on 8-bit integers: anything below 0.015118% gets rounded to 0,
  • sRGB (sRGB OETF) coded on 16-bit integers: anything below 0.00006% gets rounded to 0,
  • Adobe RGB (power 2.2 OETF) coded on 8-bit integers: anything below 0.000110% gets rounded to 0,
  • Adobe RGB (power 2.2 OETF) coded on 16-bit integers: anything below 0.0000000006% gets rounded to 0,
  • any RGB (no OETF, linearly encoded) coded on 8-bit integers: anything below 0.1961% gets rounded to 0,
  • any RGB (no OETF, linearly encoded) coded on 14-bit integers (RAW photo): anything below 0.003052% gets rounded to 0,
  • any RGB (no OETF, linearly encoded) coded on 16-bit integers: anything below 0.000763% gets rounded to 0.

So much for people who claim 16-bit encoding is overkill because [insert stupid reason here].

The formula to compute the black clipping threshold for any bit depth and OETF is EOTF\left(\frac{0.5}{2^{bit \, depth} - 1}\right), where EOTF(x) = x^{2.2} for Adobe RGB or EOTF(x) = x / 12.92 for sRGB (the linear segment of the sRGB EOTF, which applies at these tiny values), assuming rounding is done to the nearest integer.
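For anyone who wants to check these numbers, here is a minimal sketch that plugs the formula above into code. The function names are mine; Adobe RGB is modelled as a plain 2.2 power law, the sRGB EOTF is reduced to its linear segment x / 12.92 (valid for the tiny values involved), and the divisor is 2^{bit depth} − 1 as in the formula:

```python
# Sketch only: black clipping threshold = EOTF(0.5 / (2**bit_depth - 1)),
# expressed as a percentage of the display white.

def eotf_srgb_linear_segment(x):
    # sRGB EOTF, linear segment (sufficient for the tiny values used here)
    return x / 12.92

def power_eotf(gamma):
    # simple power-law EOTF, e.g. gamma = 2.2 for Adobe RGB
    return lambda x: x ** gamma

def eotf_linear(x):
    # no OETF: the encoding is already linear
    return x

def black_clip_percent(bit_depth, eotf):
    """Smallest relative luminance that does not round to integer 0, in %."""
    half_code = 0.5 / (2 ** bit_depth - 1)
    return 100.0 * eotf(half_code)

cases = [
    ("sRGB, 8-bit",       8,  eotf_srgb_linear_segment),
    ("sRGB, 16-bit",      16, eotf_srgb_linear_segment),
    ("Adobe RGB, 8-bit",  8,  power_eotf(2.2)),
    ("Adobe RGB, 16-bit", 16, power_eotf(2.2)),
    ("linear, 8-bit",     8,  eotf_linear),
    ("linear, 14-bit",    14, eotf_linear),
    ("linear, 16-bit",    16, eotf_linear),
]

for name, bits, eotf in cases:
    print(f"{name:>18}: anything below {black_clip_percent(bits, eotf):.6g}% gets rounded to 0")
```

Any differences in the last digits compared with the list above come down to whether 2^{bit depth} or 2^{bit depth} − 1 is taken as the divisor; the orders of magnitude are the point.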


@anon41087856: Your post explains the (computer) science behind the changes and I’m not fighting that at all, on the contrary: I welcome it and I thank you for that info!

What your post does not explain is the visualization part. Why aren’t the visible curves also adjusted to the new, better situation?

As I mentioned in one of my previous replies: I’m certainly not the only one who uses these curves to obtain a “sane” result (what else are all these numbers for, if not to provide some boundaries and anchors?).

EDIT: Is the -12.7 EV you mention also the reason why the clipping is set to this number (-12.69 to be exact)?

OK, but you understand that the white and black relative exposures need to be properly set for the image. You have the black point set at -9.21 EV, which seems quite low for this image, and you have quite a high contrast in filmic at 1.5. I think this combination of settings is leading to the overshoot on the curve. If you want to increase the contrast on the cat, it may be better not to be so aggressive in filmic, and instead look at the local contrast module, or perhaps the contrast in the color balance module.


Because there is nothing to adjust. The curve is designed to produce a smooth transition between black, grey and white levels, and that’s what it does. But setting black = 0% was nicer for the interpolation, as it produced less “tension” on the curve. Black > 0% is mathematically more challenging, because you put more constraints on the interpolation. It’s kind of like trying to fit a thick plastic sheet into a box: you need to squeeze and force it. Just try black = 0.001% and see how it helps getting rid of the undershoot.

Exactly.


This isn’t about me being able to use filmic, but about the same settings in two dt versions showing very different results. @anon41087856 explained the science part of that, and I do agree that master needs a somewhat different approach and some getting used to.

And while replying to you, Aurelien also answered the visualization issue. I do get the rationale behind that explanation. I’ll go and experiment with some different settings and see if making one of those an auto-applied default makes sense.

You mean loading the same XMP in 2 different versions does not produce the same result? That shouldn’t be the case; only the default params have changed.

The 3.2.1 xmp was loaded via lighttable → history stack → load …; the 3.3.0 version was done manually to make sure I got the exact same numbers (see the video for the results).

BTW:

Do you mean setting the target black luminance to 0.010 (!) in the filmic display tab? Values lower than 0.010 aren’t accepted. This does solve (all but) my “visualization” issue… Thanks.

This fix should help:
