Exposure Fusion and Intel Neo drivers

The new module I’m working on (remember, that was the topic of this thread) builds a dodging-and-burning mask from a guided filter in 50 ms. Every other method will ruin colours in unpredictable ways.
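For those who haven’t met it, here is a minimal sketch of the guided-filter idea (He et al. 2010) applied to a dodging-and-burning mask — this is my illustration, not the actual module code; `guided_filter`, `db_mask` and the parameter values are made-up names for the example:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-4):
    """Edge-aware smoothing of `src` steered by `guide` (2-D float arrays)."""
    size = 2 * radius + 1
    mean = lambda x: uniform_filter(x, size)
    mean_g, mean_s = mean(guide), mean(src)
    var_g = mean(guide * guide) - mean_g * mean_g   # local variance of the guide
    cov_gs = mean(guide * src) - mean_g * mean_s    # local guide/source covariance
    a = cov_gs / (var_g + eps)                      # local linear gain
    b = mean_s - a * mean_g                         # local linear offset
    return mean(a) * guide + mean(b)

def db_mask(lum, radius=8, eps=1e-4):
    """Dodging-and-burning mask: self-guided filtering of log-luminance,
    so the mask follows edges instead of blurring across them."""
    log_l = np.log2(np.maximum(lum, 2.0 ** -16))    # scene-referred EV scale
    return guided_filter(log_l, log_l, radius, eps)
```

The point of the guided filter here is that it is a local *linear* model of the guide, so it smooths without crossing edges — which is why it doesn’t leak across object boundaries the way a Gaussian blur does.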

Because there are 10 years of legacy to stay compatible with. You don’t have to use every single module in the software. That legacy can go away in the darktable 3.x branch if we decide to break compatibility with the 2.x branch (which would be sane, I guess).

You don’t get it, do you? Working on gamma-encoded stuff (aka display-referred) is what Photoshop, Lightroom and the whole legacy image-processing stack do. Guess what? It sucks at HDR! Why? Because gamma encoding breaks light-transport models, so it messes up colours and produces halos while blending. So, instead of patching rotten algos from another era, let’s embrace the future and get a chance to do things properly, in scene-referred space, with linear operators, where 18% or 50% are just regular luminances.
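If you want to see the damage with your own eyes, here is a toy blend of two pixels across an edge — once on linear (scene-referred) values, once on gamma-encoded (display-referred) values. I assume a pure 2.2 power law for simplicity rather than the exact piecewise sRGB curve, and the pixel values are made up:

```python
import numpy as np

encode = lambda x: x ** (1 / 2.2)   # linear -> display-referred
decode = lambda x: x ** 2.2         # display-referred -> linear

bright = np.array([0.90, 0.80, 0.70])   # bright, near-neutral pixel
dark   = np.array([0.20, 0.02, 0.01])   # dark, saturated red pixel

linear_blend  = 0.5 * (bright + dark)
display_blend = decode(0.5 * (encode(bright) + encode(dark)))

print(linear_blend)                  # [0.55  0.41  0.355]
print(display_blend)                 # ~[0.48 0.25 0.21]
print(display_blend / linear_blend)  # ~[0.88 0.62 0.58]
```

The display-referred blend is darker than the physically correct one (that darkening along edges is where halos come from), and the attenuation is different per channel, so the hue shifts too. A linear operator on scene-referred data has neither problem: averaging light intensities is exactly what a lens defocus or a longer exposure would do.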

Because, yes, I have spotted halos in https://mericam.github.io/papers/exposure_fusion_reduced.pdf (p. 6), and I told you they would happen even before looking at the paper (sure, they hide them to some extent by working on a multiscale pyramid, but that doesn’t make it right). Get your theory right first, then unroll the algo, not the other way around.