The filmic default settings in the display tab have changed a bit in darktable master: the black level used to be 0% and is now set to 0.015%.
The reason is that sRGB output coded on 8-bit integers can only display about 12.7 EV of dynamic range. So any value below -12.7 EV at the output of filmic ends up rounded to 0 (because of the float → int conversion) at the output of the pipe, which means arbitrarily many low-EV values collapse to the same 0, resulting in the loss of gradients that many users have complained about.
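For the curious, here is a quick way to check that 12.7 EV figure (a minimal Python sketch, not darktable code; it assumes round-to-nearest conversion and uses the linear segment of the piecewise sRGB EOTF, which is the part that applies at these small values):

```python
import math

# Smallest encoded value that still rounds to code 1 (round-to-nearest): half a step.
half_step = 0.5 / 255

# sRGB EOTF, linear segment (valid for encoded values <= 0.04045).
linear_black = half_step / 12.92

# Dynamic range in EV between white (1.0) and that black threshold.
dynamic_range_ev = math.log2(1.0 / linear_black)
print(f"{dynamic_range_ev:.2f} EV")  # ~12.69 EV, i.e. about 12.7 EV
```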
On the other hand, that higher black level makes it easier for the curve to undershoot. If that happens, try reducing the contrast, changing the balance, or adjusting the display black level; worst-case scenario, just set the display black luminance back to 0%.
Here are the black clipping values for various encodings:
- sRGB (sRGB OETF) coded on 8-bit integers: anything below 0.015176% gets rounded to 0,
- sRGB (sRGB OETF) coded on 16-bit integers: anything below 0.00006% gets rounded to 0,
- Adobe RGB (power 2.2 OETF) coded on 8-bit integers: anything below 0.000110% gets rounded to 0,
- Adobe RGB (power 2.2 OETF) coded on 16-bit integers: anything below 0.0000000006% gets rounded to 0,
- any RGB (no OETF, linearly encoded) coded on 8-bit integers: anything below 0.1961% gets rounded to 0,
- any RGB (no OETF, linearly encoded) coded on 14-bit integers (RAW photo): anything below 0.003052% gets rounded to 0,
- any RGB (no OETF, linearly encoded) coded on 16-bit integers: anything below 0.000763% gets rounded to 0.
So much for people who claim 16-bit encoding is overkill because [insert stupid reason here].
The formula to compute the black clipping threshold for any bit depth and OETF is EOTF\left(\frac{0.5}{2^{bit \, depth} - 1}\right), where EOTF(x) = x^{2.2} for Adobe RGB and EOTF(x) = x / 12.92 for sRGB (the linear segment of the piecewise sRGB EOTF, which is the part that applies at these small values), assuming rounding is done to the nearest integer: anything below half the first quantization step rounds to 0, hence the 0.5 in the numerator.
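As a sanity check, here is a small Python sketch of that formula which reproduces the values listed above (the helper name and encoding labels are mine, not darktable's):

```python
def black_clipping_threshold(bit_depth: int, eotf) -> float:
    # Anything below half the first quantization step rounds to integer 0.
    half_step = 0.5 / (2 ** bit_depth - 1)
    return eotf(half_step)

# EOTFs (decoding curves) for the encodings discussed above.
srgb = lambda x: x / 12.92       # linear segment of the piecewise sRGB EOTF
adobe_rgb = lambda x: x ** 2.2   # pure power 2.2
linear = lambda x: x             # no OETF applied

for name, depth, eotf in [
    ("sRGB 8-bit", 8, srgb),
    ("sRGB 16-bit", 16, srgb),
    ("Adobe RGB 8-bit", 8, adobe_rgb),
    ("Adobe RGB 16-bit", 16, adobe_rgb),
    ("linear 8-bit", 8, linear),
    ("linear 14-bit (RAW)", 14, linear),
    ("linear 16-bit", 16, linear),
]:
    pct = 100 * black_clipping_threshold(depth, eotf)
    print(f"{name}: anything below {pct:.10f} % gets rounded to 0")
```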