DT sharpening is questionable

Well, Aurélien said at one point that all modules should eventually be moved before the tone mapper (filmic, of course). The reasoning was that once that’s done, you’d be able to instantly switch the output between SDR and HDR, for example.

I think the ‘only’ problem with that is you’d also need THE perfect tone mapper, which ‘understands’ human vision perfectly, and is thus able to generate the optimal output, perfectly suited to the output device’s capabilities.


I don’t think that is sufficient for all purposes, at least not without a very roundabout module workflow in scene-referred.

Consider, for example, local contrast enhancement and similar operations. In the very flat parts of the tone curve, especially the highlights but also the shadows, you need a much more aggressive transformation than in the midtones to get the same effect. Now, I understand that you rarely want the same effect; the point is that you have to go out of your way to get any effect at all.

Of course one can do this with masks… but that takes a lot of manual fiddling around.
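To put rough numbers on that flatness argument, here is a toy sketch using a simple Reinhard-style curve y = x/(1+x) as a stand-in tone mapper (an assumption for illustration only; it is not darktable's filmic curve). The curve's local slope tells you how much of a scene-referred contrast tweak survives into the display-referred output:

```python
# Toy illustration: how strongly a display-referred tone curve compresses
# a scene-referred contrast tweak, depending on where you sit on the curve.
# Uses a Reinhard-style mapper y = x / (1 + x) as a stand-in for any
# S-shaped tone mapper (NOT darktable's actual filmic curve).

def tone(x):
    return x / (1.0 + x)

def local_slope(x, eps=1e-6):
    # Numerical derivative: how much a small scene-referred change
    # shows up in the display-referred output at this point.
    return (tone(x + eps) - tone(x - eps)) / (2 * eps)

midtone = 0.18    # middle grey, scene-referred
highlight = 4.0   # a bright highlight, scene-referred

s_mid = local_slope(midtone)
s_high = local_slope(highlight)

print(f"slope at midtones:   {s_mid:.3f}")
print(f"slope at highlights: {s_high:.3f}")
print(f"same tweak is roughly {s_mid / s_high:.0f}x weaker in the highlights")
```

With this toy curve, the same scene-referred adjustment comes out roughly 18× weaker at a highlight value of 4.0 than at middle grey, which is why flat regions need a much more aggressive transformation (or masks) before anything becomes visible.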

A (hypothetical) solution I could imagine is a 3-way tab for all modules to have varying effects depending on the tonal range. Currently only color balance rgb does this. But that could also lead to a very crowded GUI.

THE perfect tone mapper(:tm: in both senses) would understand human vision perfectly, so it’d be non-linear and know where and how to tweak the mapping. Of course I’m being a tiny bit sarcastic here.


Can the contrast equalizer be used instead of local contrast? When might LC be better? Does it have a simpler interface?

@tankist02 Yes, it can. Select the setting deblur:fine:strength 3. Some say this is not a very good way to sharpen, but it definitely works.

I feel the same about sharpening in my G9 vs. darktable. But mind you: in the darkroom, the diffuse or sharpen module works its miracles on the preview, not on the full-resolution image, so what you see during editing and what you export can be very different.

That’s why you have the button to turn on full-resolution processing (or you can zoom in). It is also important to enable high-quality resampling in the export module.


I know it’s there. But I also don’t have a quantum computer to keep it turned on all the time. :smiley: We had discussed this topic before.

It is still slower, but recent builds have had some code updates, and right now DT seems quite a bit faster at updating modules, on my system anyway…



Even prior to the recent improvements on master, I find D&S manageable on 20 MP images using the CPU only, on my not-so-powerful laptop.

I usually turn these modules on last, after making all other corrections.

My interest in sharpening is declining at the same rate as my eyesight.
I’ve been to some exhibitions recently where oversharpening spoiled the photos. (Looking at many wildlife photographers in particular, who seem obsessed with noise reduction and sharpening.)
Global and well-placed local contrast do more for me.

Not to mention good management of colour and tonality.
Here is a pic taken with my phone of a photo of polar bears that was exhibited recently in my area.


Is this a known fact? Has it been investigated? Has anyone posted screenshots vs. exports?
I’m quite disturbed by the idea that the darkroom view could differ wildly from the export…

Yes, it is known. You can:

  • zoom in to 100%
  • enable the high quality preview

Then, if you export at 100%, or at a lower resolution but with high-quality resampling enabled, you’d have (nearly) identical results.

It’s not only diffuse or sharpen, though.

Without zooming in to 100% / enabling full-resolution preview processing, the image is scaled down to fit the view, and is processed that way. Pixel-wise operations then no longer act on individual input pixels, but on pixels derived from several of them. Operations that affect an area (e.g. a blur of 3 units, to give a simple example) will try to shrink the ‘size of the operation’ (e.g. the blur radius) in proportion to the downscaling, but this cannot always be done exactly: if an operation only works with a whole-pixel radius and you downscale 1:3, a radius of 3 pixels becomes 1; 2 should become 2/3, but will probably be rounded to 1; and 1 should become 1/3, but may be rounded to 0 and omitted completely.
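The whole-pixel rounding in that 1:3 example can be sketched in a few lines (a toy model, not darktable’s actual code; effective_radius is a hypothetical helper):

```python
# Toy model of radius rounding when processing a downscaled preview:
# a module whose radius must be a whole number of pixels, applied to
# an image downscaled 1:3.

def effective_radius(radius_px, scale):
    # scale = preview size / full size, e.g. 1/3 for a 1:3 downscale
    return round(radius_px * scale)

scale = 1 / 3
for r in (3, 2, 1):
    print(f"full-res radius {r} px -> preview radius {effective_radius(r, scale)} px")
```

A full-resolution radius of 3 maps cleanly to 1 preview pixel, but 2 also rounds to 1 (too strong) and 1 rounds to 0 (the operation silently disappears), which is exactly why small-radius effects look different in the fit-to-view preview.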

RawTherapee and ART also have such operations, marked with the 1:1 icon.

There have been some recent improvements, e.g. in this PR for color equalizer:

Slightly reduce saturations dependency on roi->scale.
Makes sure there are almost no visible differences in zoomed-out darkroom between HQ mode toggled on or not.
Also reduces subtle artifacts while non-HQ exporting with strong downscaling.


I’m not sure if you consider that an example of over-sharpening, but to me it looks nice and sharp; the narrow clear band around the bears is due to the back-lighting on the fur.

The banding around the sun is another story, that is something I would have tried to avoid (then again, my images are not selected for publication!)

No problem with the sharpening here of course but the banding is so distracting.

That sun is just awful!
(If that is a friend of yours, I didn’t say that :wink: )

Not a friend, the pic was taken at an exhibition of wildlife photographers. This photographer seems to be quite well known, sells books and all…


It is known but not discussed enough. I actually created an issue on GitHub some time ago (should you want to see some examples of this quirk), but it seems unlikely that it will get fixed (reworked), as that’s just how the module is supposed to work.
Personally, I’ve returned to the contrast equalizer module. It does the job well enough, and I can see the final result immediately.

I wouldn’t complain about the quality of a picture on an outdoor print, without having seen the original…


Maybe this (the photo of the polar bears) is getting off-topic - or do we know for a fact it was processed using darktable?
