[Play_Raw] Dynamic Range Management

In this case, yes, the EV shift is just a multiplier.
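For concreteness, on linear data a +1 EV shift is just a factor of 2. A trivial sketch (the function name and array handling are mine, purely for illustration):

```python
import numpy as np

def apply_ev_shift(img: np.ndarray, ev: float) -> np.ndarray:
    """Scale linear image data by 2**ev: +1 EV doubles the values, -1 EV halves them."""
    return img * (2.0 ** ev)
```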

However, fusion takes multiple images generated with different EV shifts and blends them together based on a set of calculated weights. What I’ve seen is that if things DO break, it happens when the generated images are “too far” apart. (In the “traditional” enfuse approach, this would be if you set your bracketing steps too far apart.)
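To make the weighting idea concrete, here’s a rough per-pixel sketch in the spirit of Mertens-style exposure fusion (not darktable’s actual code; the well-exposedness weight, `target`, and `sigma` are assumptions, and a real implementation blends through Gaussian/Laplacian pyramids rather than per pixel):

```python
import numpy as np

def exposedness_weight(img, target=0.5, sigma=0.2):
    """Weight each pixel by how close it sits to a mid-grey target (Mertens-style)."""
    return np.exp(-((img - target) ** 2) / (2.0 * sigma ** 2))

def naive_fuse(images):
    """Blend several EV-shifted renditions with normalized per-pixel weights.

    No pyramids here; this only shows the weighting/normalization step.
    """
    stack = np.stack(images)
    weights = exposedness_weight(stack)
    weights /= weights.sum(axis=0) + 1e-12   # weights sum to 1.0 at every pixel
    return (weights * stack).sum(axis=0)
```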

I haven’t abused the original enfuse implementation yet to see how it behaves in such situations - maybe this weekend I’ll get some time to poke at it again? Interestingly, Google has been cited as performing their algorithm slightly differently from the darktable fusion approach here - instead of generating 3 or more images and blending them all together with weighting, Google supposedly generates one image with a +EV shift, fuses it with the base image, then iteratively does the same thing a few times (so at all times only two images are being fused with each other, but the “base” image in successive iterations has already had its dynamic range compressed by the previous fusion steps).
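If that description is accurate, the iterative two-image variant might look roughly like this (a speculative sketch of the idea only, not Google’s or darktable’s pipeline; `ev_step`, the clipping, and the iteration count are assumptions):

```python
import numpy as np

def fuse_pair(base, pushed, sigma=0.2):
    """Blend two images using normalized well-exposedness weights."""
    w_base = np.exp(-((base - 0.5) ** 2) / (2.0 * sigma ** 2))
    w_push = np.exp(-((pushed - 0.5) ** 2) / (2.0 * sigma ** 2))
    total = w_base + w_push + 1e-12
    return (w_base * base + w_push * pushed) / total

def iterative_fusion(img, ev_step=1.0, iterations=3):
    """Fuse the running result with a +EV-pushed copy of itself, a few times over,
    so only two images are ever blended at once."""
    result = img
    for _ in range(iterations):
        pushed = np.clip(result * (2.0 ** ev_step), 0.0, 1.0)
        result = fuse_pair(result, pushed)
    return result
```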

I did find a potential difference between darktable’s blending and enfuse’s that may be a contributor to some of the oddities observed:

In enfuse, the mask weights are normalized (such that the total of the weights for any given pixel is 1.0) before the Gaussian pyramid of masks is generated.

In darktable, this appears to be happening after the GP is generated.
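To illustrate the ordering difference, here’s a toy comparison. The pyramid is just repeated blur-and-downsample and the second variant normalizes level by level; it’s not either codebase’s actual implementation, only a sketch of where the normalization sits:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_pyramid(mask, levels=4):
    """Toy Gaussian pyramid: repeatedly blur and downsample by 2."""
    pyr = [mask]
    for _ in range(levels - 1):
        pyr.append(gaussian_filter(pyr[-1], sigma=1.0)[::2, ::2])
    return pyr

def normalize(weight_maps):
    """Scale a list of weight maps so they sum to 1.0 at every pixel."""
    total = sum(weight_maps) + 1e-12
    return [w / total for w in weight_maps]

def enfuse_order(weights):
    """enfuse-style: normalize the per-pixel weights first, then build each mask pyramid."""
    return [gaussian_pyramid(w) for w in normalize(weights)]

def darktable_order(weights):
    """darktable-style (as observed): build the pyramids first, then normalize level by level."""
    pyramids = [gaussian_pyramid(w) for w in weights]
    for level in range(len(pyramids[0])):
        for p, w in zip(pyramids, normalize([p[level] for p in pyramids])):
            p[level] = w
    return pyramids
```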