Clarity in darktable

I agree with your observation; I am just trying to understand it based on this statement from one of the developers who wrote the code:

"Because the multiplication is commutative (output * input = input * output), the effect inside the masked region is the same for the multiply and multiply reverse blending modes. The outside of the masked region will change, as it will depend on which image is the base image."

I’m not sure about reverse subtract; maybe try that on a more colorful image. From what it did to the color checker, I think there will be major color artifacts, though maybe not so evident at low opacity. I have been using subtract with local contrast and a linear tone curve for some time, at around 5% opacity; it makes for a nice enhancement and provides a dehazing-like effect without much impact on color, perhaps deepening it slightly. I guess I will need to experiment some more…

EDIT: Maybe it’s like the input/output saturation sliders in color balance: when you drop the opacity and the blend order is reversed, you are multiplying against a reduced-opacity input, so the effect is less overwhelming, darkening everything less and thus looking sharper and bringing out the darkest parts a bit more. So at 100% opacity they would be similar, or very nearly so, but not at lower opacity. Does that make sense?

1 Like

Wow, I had to read the explanation several times. The mask is reversed (black is the mask) and the reverse result has the wrong colour. The language is confusing too, but English may be a second or third language for the author.

The old subtract mode behaves exactly the way you describe here. The new ones are changed. It will be interesting to see what is behind that. :slightly_smiling_face:

I also noted that the effect was not as strong when I did the same thing using the RGB tone curve rather than the old tone curve module.

1 Like

What I’ve read is:
When you use a reverse blend mode, the final image is mixed with the image with the module applied in “normal” mode.
If you use a regular blend mode, for example “multiply”, the final image is blended with the input image of the module.
If the opacity is set to 100%, “multiply” and “multiply reverse” give the same results (without masks).

Take a landscape picture, for example, and in local contrast apply a multiply blend with an opacity of 30%. Then change from multiply to multiply reverse; you will see the difference.
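The claim is easy to check numerically. Here is a minimal sketch (not darktable’s actual code; the function names are mine) of the mixing described above, assuming “multiply” mixes the blend result with the module’s input while “multiply reverse” mixes it with the module’s output:

```c
/* Hedged sketch, not darktable's implementation.
 * in  = module input pixel, out = module output pixel,
 * m   = opacity (or mask value) in [0,1].                        */
float multiply_blend(float in, float out, float m)
{
    /* normal: mix between the INPUT and the multiplied result */
    return in * (1.0f - m) + (in * out) * m;
}

float multiply_reverse_blend(float in, float out, float m)
{
    /* reverse: mix between the module OUTPUT and the multiplied result */
    return out * (1.0f - m) + (in * out) * m;
}
```

At m = 1.0 both functions reduce to in * out, so the modes coincide at 100% opacity; at 30% they diverge, which is exactly what the landscape experiment shows.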

Agreed, I wasn’t taking opacity into account…

I have been wondering if the observed result is produced by this sort of arrangement:

“Maybe it’s like the input/output saturation sliders in color balance, i.e. you can reduce the input a bit before the calculation. So when you drop the opacity and the blend order is reversed, the masking is now on the input image, so you are blending the output in multiply with a reduced-opacity input, and the effect is a less overwhelming darkening of everything: sharper, and bringing out the darkest parts a bit more. So at 100% opacity they would be similar, or very nearly so, but not at lower opacity. Does that make sense?”

Clearly I should have remembered this…

1 Like

We’ll have to tattoo it on our arms! hahaha

Experience is the best reminder. I have quite a few presets with low-opacity multiply, so I just use them; if I had experimented more with the new one I would likely have discovered this nuance. I love your edits, by the way. I think that you and Boris are great ambassadors for the software. There is a lot of math and technical talk around darktable, but you and Boris put some creative and artistic flair into it.

3 Likes

The problem with “reverse” blend modes in scene-referred mode is that they respond in a strange way to masks.
Unless I am missing something, there is no way to restrict the effect to a part of the image with a drawn or parametric mask; some effect is also applied outside the mask.

2 Likes

I think only reverse multiply is doing that; the other two, divide and subtract, don’t seem to, at least in my limited experience.

https://github.com/darktable-org/darktable/issues/7464#issuecomment-766158145

I’m “studying” the code where these blending modes are implemented (“blendif_rgb_jzczhz.c”), but I have some difficulty understanding everything.

  • I see that the “blend fulcrum” slider is related to the “p” value in the functions (but I didn’t find where this “p” is created).
  • In “blend_subtract” and “blend_subtract_reverse” the “b” array goes up to “j+3”, while in the others it goes to “j+DT_BLEND…”

I’m not a software man, as you can see. If I understand everything I will try to find where the problems are, because I’ve detected that the mask is lost when I do some operations (I have to identify exactly when it happens).

To understand why a mask doesn’t seem to be effective, you need to understand the architecture of how blending modes work. Refer to the following diagram in the user manual:
https://www.darktable.org/usermanual/en/darkroom/processing-modules-and-pixelpipe/the-anatomy-of-a-module/

This shows a case where, for example, we have a normal “multiply” operation. See the slider at the top right: it takes a copy of the input image and a copy of the output of the module + blending operator. The position of the slider for each pixel is then controlled by the value of the mask at that pixel position. If the opacity is set to 0, you get 100% of the input image, and if the opacity is 1 you get 100% of the output from the module + blend operation.

In the case of multiply reverse, the top of the slider doesn’t connect to the input image; it connects to the output of the module, just before the blend operation. This means the final output of the slider is a mix between the module output (e.g. the local contrast enhancement, for the local contrast module) and the output of the blend operator (e.g. multiplying the pixel from the input image with the pixel from the output of the contrast enhancement, in the case of a multiply blending operation). In this case, both sides of the slider input come from the output side of the module, which means that the effect of the module (e.g. local contrast) will be present in some form regardless of where the mask sets the slider.
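The two sides of the slider can be sketched as simple per-pixel mixes (a hedged sketch, not darktable’s actual code; `normal_mix` and `reverse_mix` are names I’ve made up), where `blended` is the output of the blend operator and `m` is the mask value at the pixel:

```c
/* Hedged sketch of the slider described above (not darktable's code).
 * m is the mask value at a pixel: 0 = fully outside the mask.       */
float normal_mix(float in, float blended, float m)
{
    /* slider runs between the module INPUT and the blend result */
    return in * (1.0f - m) + blended * m;
}

float reverse_mix(float module_out, float blended, float m)
{
    /* slider runs between the module OUTPUT and the blend result */
    return module_out * (1.0f - m) + blended * m;
}
```

Outside the mask (m = 0), `normal_mix` returns the untouched input, while `reverse_mix` returns the module output, so the module’s effect (e.g. local contrast) survives even where the mask is zero.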

Probably we should provide a second copy of that diagram showing the logic of the “reverse” blending modes. The documentation is a bit vague on blending modes in general, passing the buck to the GIMP documentation. I think this was an attempt to keep the documentation from growing too much and becoming difficult to maintain as new blending modes are added, but it does make this aspect of darktable’s behaviour more opaque.

1 Like

Still, subtract and divide should show this and they don’t. The author admits they are more “inverse” than “reverse”, so it’s not perfectly clear…

Subtract works like this:

BLEND_SUBTRACT:
      final_output = input * (1.0f - opacity) + fmax(input - blend_parameter * module_out, 0.0f) * opacity

BLEND_SUBTRACT_REVERSE:
      final_output = input * (1.0f - opacity) + fmax(module_out - blend_parameter * input, 0.0f) * opacity

Divide works like this:

BLEND_DIVIDE:
      final_output = input * (1.0f - opacity) + input / fmax(module_out * blend_parameter, 1e-6f) * opacity

BLEND_DIVIDE_REVERSE:
      final_output = input * (1.0f - opacity) + module_out / fmax(input * blend_parameter, 1e-6f) * opacity

Like I said, I think this sort of information should properly be captured in the user manual.

1 Like

I was not disputing the formulas, merely pointing out that with reverse multiply you get an effect outside the mask due to the layer that is blended, while the other two “reverse” modes don’t behave that way; hence the author proposed that maybe they should be renamed, as the naming is somewhat misleading. Maybe the situation has been updated since.

I have found it.
I wanted to say that there is a bug with the masks, and I now know when it happens:

  • activate the local contrast module
  • create an RGB (scene) mask
  • deactivate the local contrast module
  • activate another module (velvia, for example)
  • reactivate local contrast
  • …the mask is lost, and if you try to paint, the mask GUI is automatically deactivated

I think it’s a bug, and I’m trying to find the origin, but maybe it’s too difficult a task for me.

2 Likes

The recent user manual has an entry titled “modules to avoid”. Quoting from there:

“There are a number of modules which are no longer recommended for use within a scene-referred workflow.”

I noticed that in your workflow a scene-referred operation was followed by a module labelled as one to avoid.

I have no idea about the fine technicalities of the issue, but I wonder if the cause of your problem has something to do with this mismatch.

“Not recommended to use” doesn’t imply that such a module messes up everything. You need to know what you’re doing: it can cause a mess for one image and be fine for another.
If something was messed up, no one can help without having the image and the XMP file :wink:

2 Likes

It just means that using these modules might introduce artifacts, as they require values to be in the range 0 to 1.

1 Like