If you hover your mouse over the “Black relative exposure” slider, the info says “increase to get more contrast.”
I just meant that the sliders do have an effect on global contrast.
This has also been described in the main Filmic discussion:
Also, sorry for the misunderstanding.
Concerning the core issue:
I understand your point, not least thanks to the deeper explanation in the post you linked, and I don’t reject artistic decisions.
Still, I feel like something is missing here. In the old days, each film type had a fixed dynamic range. Of course you could dodge and burn etc., but that was built on the “reliable”, known behavior of the film. Now the whole concept of the mapping focuses on “holding on to” and compressing luminance values above the display range, but it has no notion of luminance values that already sit within or below the display range (low contrast) even before Filmic is applied.
This is exactly what I mean: when we are not trying to rein in extreme luminances, the “enough” should have a “normal” reference point.
Something like the camera’s dynamic range or the film’s dynamic range is simply missing from the concept. Neither turning Filmic off nor using its default settings is the right answer: without Filmic there is no minimal compression (and desaturation) when approaching the edges of the range, while the default settings may not be anywhere near the camera’s range (or anything of the sort).
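To make the contrast point from above concrete, here is a minimal sketch of a Filmic-style log mapping. It is purely illustrative and not darktable’s actual code: the function name, the middle-grey value, and the default EV bounds are assumptions loosely modeled on darktable’s sliders. It shows why widening the black/white relative exposures (a larger scene dynamic range) lowers global contrast: the same EV span then occupies a smaller share of the display range.

```python
import math

# Illustrative, simplified Filmic-like log mapping (NOT darktable's
# implementation). Scene-referred values are expressed in EV relative to
# middle grey, then normalised to [0, 1] between the "black relative
# exposure" and "white relative exposure" bounds.

MIDDLE_GREY = 0.1845  # assumed scene-referred middle grey (hypothetical default)

def filmic_log_mapping(x, black_ev=-7.75, white_ev=4.40):
    """Map a linear scene-referred value x to a [0, 1] log-encoded value."""
    ev = math.log2(max(x, 1e-9) / MIDDLE_GREY)   # exposure in EV vs. middle grey
    t = (ev - black_ev) / (white_ev - black_ev)  # normalise to the slider range
    return min(max(t, 0.0), 1.0)                 # clip values outside the range

# A narrow range stretches each EV over more of the display range than a
# wide range does, i.e. raising black_ev toward 0 increases global contrast,
# matching the slider tooltip.
for x in (0.01, MIDDLE_GREY, 1.0):
    narrow = filmic_log_mapping(x, black_ev=-5.0, white_ev=3.0)
    wide = filmic_log_mapping(x, black_ev=-9.0, white_ev=5.0)
    print(f"{x:6.4f}: narrow={narrow:.3f}  wide={wide:.3f}")
```

With the narrow settings, one EV covers 1/8 of the output range; with the wide settings, only 1/14, so the same scene renders flatter.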
I get that. In theory, the old film process would imply that in our digital workflow a first compression (including some sort of fixing of the measured values) comes before the editing steps in the pipeline, followed by a second compression in the sense of paper printing.
(Image source: Death of the Zone System (Part V) — Gordon Arkenberg)
But I guess that is stupid because you wouldn’t want to compress your data unnecessarily.
Don’t worry, that is really not what I have in mind.