Darktable 5.0: Blur using JPG source files still locks CPU + LightTable file view issues

Hi,

I posted recently that Darktable was locking up my CPU at 100% for 45+ minutes when trying to save photos.

The kind folks here helped me narrow down my issue to the Blur function being unstable on 4.8.1, and that this should be resolved with version 5.0.

I’ve upgraded to 5.0, but I’m still getting the exact same issue:
the CPU locks up at 100% when trying to save files.

This time it’s only for about 3 minutes, but it’s still not working properly.

In fact, the first time I opened DT 5.0 and started editing a photo to test it out, after a few minutes of working with the Blur function, DT suddenly quit and closed down!

When I re-opened it, it refused to open that specific file I was working on.

I’ve tried deleting the xmp sidecar file, but it makes no difference.
It simply will not open that specific JPG any more.

I tried navigating to the relevant folder using the Import / Add to Library function in LightTable, but when I get to the folder location, although it’s full of JPGs, it says “No items match your search”…?!

Clearly the Import function isn’t for loading in photos…

Drag & drop (what I’ve always used) simply doesn’t work for this specific photo, although it does for any other photo around it in the same folder.

I wanted the photo I finally managed to save (after several minutes) to be slightly smaller.

So I went back to DT, but the photo had vanished from LightTable.
So I dragged it back again, altered the JPG compression settings and saved it again.

This time it took seconds to save, but then the photo was missing from LightTable again, and it had saved the xmp file into the finished photo folder, next to the saved photo…?!

And, again, when I try to drag & drop the photo back into LightTable, it simply will not load!

Wait a minute! Newsflash:
I just clicked on the filter tab in the top left of the folder window in LightTable, selected ‘Legacy’, and it showed no files at all.
I then re-selected ‘all images’, and ALL my photos reappeared!?!

So that’s not so bad, even though it appears to be a slight glitch…

Hmm…

Looks like DT 5.0 has a few bugs in it.
Any suggestions about what’s happening, or is it just bug testing for v5.0…?
Thanks. -D

Click on the image in lighttable view, then go to the ‘actions on selected images’ module and remove it. This removes the image from the database but doesn’t delete the file. Then re-import the image.

The image is gone from LightTable view!

Go to Issues · darktable-org/darktable · GitHub and raise an issue so the developers can investigate.

So, on a 2 MPx JPG, with a GPU:

19.3929 [dev_pixelpipe] took 2.806 secs (2.814 CPU) [full] processed `blurs' on GPU, blended on GPU

The problem has nothing to do with JPG, as this is what I get with a 10 MPx raw at full resolution:

63.2122 [dev_pixelpipe] took 4.867 secs (4.905 CPU) [full] processed `blurs' on GPU, blended on GPU

Without GPU (2294x1417 preview on a 4K screen):

34.8965 [dev_pixelpipe] took 12.818 secs (149.287 CPU) [full] processed `blurs' on CPU, blended on CPU

At full resolution (3198x1976, this is an old camera):

165.8676 [dev_pixelpipe] took 48.600 secs (570.674 CPU) [full] processed `blurs' on CPU, blended on CPU
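For anyone who wants to compare such timings on their own machine, darktable prints these lines when started with `-d perf`. Here is a minimal sketch that pulls the wall-clock and CPU seconds out of a log line, assuming the exact format shown above (the helper name is my own, not darktable's):

```python
import re

# Matches lines like:
# 165.8676 [dev_pixelpipe] took 48.600 secs (570.674 CPU) [full] processed `blurs' on CPU, blended on CPU
PATTERN = re.compile(
    r"\[dev_pixelpipe\] took ([\d.]+) secs \(([\d.]+) CPU\).*processed `([^']+)'"
)

def parse_pixelpipe(line):
    """Return (module, wall_secs, cpu_secs), or None if the line doesn't match."""
    m = PATTERN.search(line)
    if not m:
        return None
    return m.group(3), float(m.group(1)), float(m.group(2))

line = ("165.8676 [dev_pixelpipe] took 48.600 secs (570.674 CPU) [full] "
        "processed `blurs' on CPU, blended on CPU")
module, wall, cpu = parse_pixelpipe(line)
# CPU seconds much larger than wall seconds means many cores were busy in parallel:
print(module, wall, cpu, round(cpu / wall, 1))  # blurs 48.6 570.674 11.7
```

The CPU-to-wall ratio is a rough indicator of how many cores the module kept busy.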

The module is, indeed, slow.
You do use brush masks, which are also slow (it’s better to use a path):

Note: Rendering a complex brush shape can consume a significant number of CPU cycles. Consider using the circle, ellipse or path shapes instead where possible.
(darktable user manual - drawn masks)

However, the measurement for the JPG above was done with the masks removed (and they were left intact for the measurements on the raw).

The module being computationally expensive does not mean it has a bug. And OpenCL acceleration clearly works (48.6 s vs 4.9 s). My GPU is an old NVidia 1060, my CPU is an AMD Ryzen 5 5600X.

Its slowness with increasing blur radius is documented:

caveats
This module is implemented using a “naive” convolution, which is a slow algorithm. Faster approaches are available (using FFT) but not yet implemented. The GPU implementation, through OpenCL, should hide this issue somewhat. In any case, the runtime of the module will increase with the square of the blur radius.
(darktable user manual - blurs)

Note the part: the runtime of the module will increase with the square of the blur radius. This means that increasing the blur radius 10x will increase runtime 100x, so it should really only be used with lower blur radii (and/or with a GPU, to mitigate the slowness via OpenCL).
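To put numbers on that quadratic law, here is a toy back-of-the-envelope estimate (the baseline radius and time below are made-up illustrative values, not measurements from darktable):

```python
def estimated_runtime(radius, base_radius, base_time):
    """Naive-convolution cost grows with the square of the blur radius,
    so scale a measured baseline time by (radius / base_radius)**2."""
    return base_time * (radius / base_radius) ** 2

# Suppose a radius-10 blur took 0.5 s on some machine (hypothetical numbers):
print(estimated_runtime(20, 10, 0.5))   # 2.0   (2x radius -> 4x time)
print(estimated_runtime(100, 10, 0.5))  # 50.0  (10x radius -> 100x time)
```

So halving the radius buys you roughly a 4x speed-up, which is why keeping the radius low matters so much on CPU-only machines.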

You could try any of the following modules to blur:
[screenshot: list of blur-capable modules]

Plus, there is lowpass and contrast equalizer.

This is diffuse or sharpen (top) vs your blur settings (bottom):

Source image:

On that particular image, diffuse or sharpen was about 10x as fast as blurs:

32.9825 [dev_pixelpipe] took 0.564 secs (0.489 CPU) [full] processed `diffuse.2' on GPU, blended on GPU

83.6654 [dev_pixelpipe] took 5.560 secs (5.577 CPU) [full] processed `blurs' on GPU, blended on GPU

And it also looks better, for those particular settings at least :smiley:

And of course, tiling can further slow down the process (also slower with larger blur radius, as you need more tiles…)


And looking back, the OP was not using hardware that comes anywhere close to yours, so the times could be expected to be far higher…

There are lots of benchmarks, but from a quick and dirty search your CPU is quoted at a PassMark score of around 22K, while the OP’s CPU scores about 3.4K on the Windows 10 machine, and less on the other, older one…

See below from that post…

My PC tower is Windows 7 with 6 GB RAM and an AMD A6-3600 CPU @ 2.10 GHz.
It uses Darktable 3.8.1

I’ve tried installing 4.8.1 on my Windows 10 laptop, and it does exactly the same thing!
My laptop is Windows 10 with 16 GB RAM and an i5-4340M CPU @ 2.9 GHz.

@Daf-T : is it possible to add a GPU to your machine? Nothing fancy (I have an NVidia 1060, at least 6 years old). It would give you a huge boost, given the rather limited CPU.

Hi Kofa (and all),

Thanks for the suggestions!
My laptop has an ExpressCard slot, but I don’t know if anyone makes graphics cards for ExpressCard.

Using Diffuse or Sharpen could give me the same result, for sure!
As long as I can use a brush to select what does / does not get blurred.

The reason I don’t use the circle or ellipse tool is that I’m often trying to blur out the background behind a person, so I have to trace the edges of the person, which are never circular nor elliptical!

So I have to use multiple brushes to mask out the person, then blur everything else.

Your examples look very good, but you don’t have a foreground subject that you’re keeping in focus at the same time, which is what I do.

Maybe if I could cut out the subject and copy it to another layer on top, like I would in Photoshop, then ALL the photo could be blurred without any brushes.
But I don’t know if DT can do layers like PS, nor if it can erase away the parts I don’t want on the upper layer, to see the lower layer underneath…

The magic to masking is the refinement slider.

That has nothing to do with the blurs module.
As Mica mentioned, apply mask refinement.
See examples in recent threads, e.g.

Forget the brush (I mean, don’t rely on it as your only masking tool). I know you are familiar with it, but darktable can do so much more.


Hey Kofa,

I’m trying your Diffuse / Sharpen method.
If I wanted to have less blurring effect, what would I reduce?
What’s the difference between 1st / 2nd / 3rd / 4th order speed?
Should I just reduce all 4 equal amounts to get overall less blur, like reducing the Blur Radius in the Blur module…?

Thanks!

Start with reducing iterations.