Exposure Fusion and Intel Neo drivers

Improper settings? OK, so please tell me which settings of exposure shift (EV) and exposure bias in the exposure fusion mode of basecurve will not cause this:

to turn into this:


(exposure fusion, three exposures, exposure shift +1.232, exposure bias +1.00, darktable master as of an hour or so ago - note that the highlights of the rocks on the upper right have been raised so high as to be nearly blown)

As opposed to this:


(same settings, but applying a 2.4 gamma curve immediately after the basecurve step of the exposure fusion pipeline, then returning to linear after fusion is completed; the brightness of the rocks is only raised slightly)
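In case anyone wants to try this outside darktable: a minimal sketch of that workaround in Python, assuming linear float images in [0, 1] and some Mertens-style fusion routine (like the one sketched further down). The function names are mine, not darktable's.

```python
import numpy as np

GAMMA = 2.4  # the gamma used for the comparison image above

def fuse_in_gamma(linear_exposures, fuse):
    """Encode to gamma, run exposure fusion there, decode back to linear.

    `fuse` is any Mertens-style fusion routine taking a list of float
    images in [0, 1] and returning one fused image in [0, 1].
    """
    encoded = [np.clip(img, 0.0, 1.0) ** (1.0 / GAMMA) for img in linear_exposures]
    fused = fuse(encoded)   # weights AND blending both happen in gamma space
    return fused ** GAMMA   # back to linear for the rest of the pipeline
```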

It isn’t exactly the best use case for exposure fusion (unfortunately, my current preferred test picture has clearly identifiable people who I don’t think would want to be used for demonstration purposes in a public forum). Even after my attempts to make the algorithm less likely to blow highlights, it still blows them to some degree in this test image, but not as severely as the current hardcoded target of 0.54 linear does.

For reference, the test image comes from a rather low-end 360-degree camera. As a result the input dynamic range isn’t as wide as it might be, so the potential benefit of exposure fusion is smaller, but it’s the test case that initially had me wondering why darktable’s exposure fusion implementation was so vastly inferior to feeding enfuse three separate sRGB JPEGs, despite claiming to be derived from the same algorithm.

I’ve never seen enfuse mess up colors when feeding it sRGB-encoded images. I assume you’re talking about what happens when you disable “preserve colors” in basecurve. Perhaps there’s some confusion resulting from the fact that the exposure fusion function in darktable has been “baked in” to basecurve?

Perhaps I should have been clearer that this is what I did. However, your assertion that every kind of masking/blending will fail is false for the algorithm implemented by enfuse (and implemented in darktable, based on the algorithm described by Mertens et al in Exposure Fusion). Calculating weights in gamma but masking/blending in linear gives horribly ugly results (haloing, etc.); the weights and the blending have to operate on the same encoding.
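To make that concrete, here’s a minimal sketch of the Mertens-style pipeline (well-exposedness weight only; the contrast and saturation terms from the paper are omitted, and OpenCV’s pyramid functions stand in for darktable’s internals, so this is an illustration, not darktable’s actual code). The point is that the weights and the Laplacian pyramid blending operate on the same gamma-encoded data:

```python
import cv2
import numpy as np

def fuse(images, levels=6):
    """Mertens-style fusion of a list of float32 images in [0, 1], HxWx3."""
    # Per-pixel well-exposedness weights (Gaussian around mid-grey),
    # computed on the gamma-encoded values, then normalised across exposures.
    weights = [np.exp(-((img.mean(axis=2) - 0.5) ** 2) / (2.0 * 0.2 ** 2))
               for img in images]
    total = np.sum(weights, axis=0) + 1e-12
    weights = [w / total for w in weights]

    fused_pyr = None
    for img, w in zip(images, weights):
        # Gaussian pyramids of the image and of its weight map
        gp_img, gp_w = [img], [w]
        for _ in range(levels):
            gp_img.append(cv2.pyrDown(gp_img[-1]))
            gp_w.append(cv2.pyrDown(gp_w[-1]))
        # Laplacian pyramid of the image; the coarsest level is the residual
        lp = [gp_img[i] - cv2.pyrUp(gp_img[i + 1],
                                    dstsize=gp_img[i].shape[1::-1])
              for i in range(levels)]
        lp.append(gp_img[levels])
        # Blend each level with the matching weight level -- same space throughout
        blended = [l * g[..., None] for l, g in zip(lp, gp_w)]
        fused_pyr = blended if fused_pyr is None else \
            [f + b for f, b in zip(fused_pyr, blended)]

    # Collapse the blended pyramid back into a single image
    out = fused_pyr[-1]
    for lvl in reversed(fused_pyr[:-1]):
        out = cv2.pyrUp(out, dstsize=lvl.shape[1::-1]) + lvl
    return np.clip(out, 0.0, 1.0)
```

OpenCV also ships the complete algorithm as cv2.createMergeMertens() if you want something to compare enfuse’s output against.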

What an average APS-C DSLR or MILC can do is irrelevant when you’re delivering to an sRGB display, other than making the exposure fusion approach described in compressing dynamic range with exposure fusion | darktable possible. In fact, if you expose for the highlights, the rest of the scene will be severely underexposed without some method of bringing up the shadows, and darktable’s exposure fusion attempts to be one such method. The problem is that by operating on linear data instead of gamma-encoded data, the results aren’t even remotely what the algorithm was designed to produce.

Darktable hardcodes a target brightness of 0.54 linear (where did this come from?) and a standard deviation in the weighting calculation of 0.5 (where did this come from? For reference, the original paper by Mertens et al uses a target of 0.5 and a std dev of 0.2). As shown above, this results in everything getting pulled way up into the highlights instead of towards the midtones.
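For reference, the well-exposedness term from the paper is just a Gaussian centred on the target brightness. A quick sketch shows how flat darktable’s parameterisation is compared to the paper’s (the parameter names here are mine):

```python
import numpy as np

def well_exposedness(v, target=0.5, sigma=0.2):
    # Gaussian preference for values near `target` (Mertens et al.)
    return np.exp(-((v - target) ** 2) / (2.0 * sigma ** 2))

v = np.linspace(0.0, 1.0, 5)
print(well_exposedness(v))                          # paper: ~0.04 at v=1, strongly
                                                    # penalises the extremes
print(well_exposedness(v, target=0.54, sigma=0.5))  # darktable: ~0.66 at v=1, an
                                                    # almost flat curve that barely
                                                    # penalises the highlights
```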

I’m going hiking tomorrow; hopefully I’ll come back with some better use case example images that don’t have people in them. (I found my modified algorithm to be highly suited to images from a family wedding a few weeks ago, but as mentioned above, I don’t feel comfortable posting images from that wedding here.)

Got hints on reliable and consistent content delivery of stills to HDR10/HLG displays?

I’ve found no reliable way to do this. It looks gorgeous when it works (and eliminates the need for dynamic range compression tricks), but for the majority of displays out there you need to encode your stills as a slideshow into a 10-bit HEVC video. Otherwise the display assumes it’s SDR content with an sRGB transfer curve and we’re back to square one. You’ve already established that modern cameras can capture much wider dynamic range in a single exposure than an sRGB JPEG can deliver to a typical user’s display.
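For what it’s worth, here’s an illustrative sketch of that slideshow trick, driving ffmpeg from Python. The input pattern, hold time, and output name are made up, and the stills are assumed to already be encoded with the HLG transfer (swap arib-std-b67 for smpte2084 if you’re targeting PQ/HDR10 instead):

```python
import subprocess

# Wrap a numbered sequence of HLG-encoded stills into a 10-bit HEVC
# "slideshow" so the display treats it as HDR rather than sRGB content.
subprocess.run([
    "ffmpeg",
    "-framerate", "1/5",          # hold each still for five seconds
    "-i", "still_%03d.png",       # assumed input naming pattern
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",    # 10-bit 4:2:0 output
    "-color_primaries", "bt2020",
    "-color_trc", "arib-std-b67", # tag as HLG (use smpte2084 for PQ/HDR10)
    "-colorspace", "bt2020nc",
    "slideshow_hlg.mp4",
], check=True)
```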