darktable very slow

Hi, I am not sure why this happens - is it a bug?

I am using the newest darktable 3.1 git master from the openSUSE repo. After some time of editing, maybe 1 hour or so, or at least having darktable open, darktable gets incredibly slow, no matter what I do.
If I run “darktable-cltest” in a terminal window, OpenCL seems to be OK.
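For reference, this is roughly what I ran; darktable-cltest and the -d debug flags are documented in the darktable manual:

```
# Check that the OpenCL runtime is usable by darktable
darktable-cltest

# Start darktable with OpenCL and performance logging,
# to see what the logs say when the slowdown kicks in
darktable -d opencl -d perf
```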

Using darktable 3.1 on a Linux Mint 20 machine: Intel i7 quad-core processor, 16 GB RAM, dual graphics with Intel + NVIDIA GeForce MX250.

The problem can only be solved by restarting the whole system, and then it comes back after having dt open for about an hour.

I am not sure if this is an issue with my system (drivers?) or darktable.

Thanks in advance

b

Edit: maybe this is important: I am using NVIDIA PRIME in on-demand mode.
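In case it matters, this is how I check the active profile (on Mint/Ubuntu, prime-select comes with the nvidia-prime package, as far as I know):

```
# Query the active PRIME profile (nvidia, intel or on-demand)
prime-select query
```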

Wild guess: Is your computer heating up over time? It could be throttling the CPU/GPU. If it’s affecting the CPU, other apps would be slow too. If it is overheating that much, you’d probably hear the fan (if it’s working) and your computer would probably be quite warm to the touch.
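If you want to verify, something like this in a terminal should show whether temperatures climb and clocks drop when it gets slow (assuming the lm-sensors package is installed; exact output varies per machine):

```
# Watch temperatures every 2 seconds (needs lm-sensors)
watch -n 2 sensors

# Watch the current CPU clock speeds; a big drop under load suggests throttling
watch -n 2 "grep MHz /proc/cpuinfo"
```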

Does restarting just darktable help? (I’m assuming no, because you mention restarting your whole system. Which is why I’m guessing it might be throttling related.)

(Or it could be some memory leak, some bug in the code, or something else.)

hi, thanks for the quick reply.
Well, the fan is on all the time, the weather is quite warm, and the laptop is warm all the time.
However, I installed Linux Mint yesterday and I did not notice the issue before. I also have Debian testing on the same machine, where I am using Bumblebee since NVIDIA PRIME is not available on Debian. I used that system until yesterday. The weather was warm before yesterday too. It’s summer. I have no air conditioning.

Yeah, no air conditioning here either. Sometimes heat does funny things to electronic devices, especially if they’re not cooled enough.

(I had a cable modem that would make the Internet really flaky on hot days, for example. I also had a phone that would easily overheat and throttle until I cooled it down by placing it on my kitchen counter or a glass table for a few minutes. Really, any hardware can act up with too high or too low temperatures.)

I did see something about NVIDIA settings on Mint specifically that was related to overheating. It’s not a 1:1 match, but probably close enough to have some useful bits of information about configuring your system: (RESOLVED) overheating problem with Nvidia Card - Linux Mint Forums (It talks about X11, which I guess you’re using since you have an NVIDIA card; we all kind of need X11 for accurate color profiling anyway, as Wayland doesn’t seem to support monitor color profiling well enough for darktable yet.)
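(A quick way to confirm which session type you’re on:)

```
# Prints "x11" or "wayland" for the current session
echo $XDG_SESSION_TYPE
```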

FWIW: I’m currently on Fedora 32 with an Intel GPU + Neo for CL, but have been “doing computer stuff” for a few decades. And it does sound like overheating to me. :wink:

Hopefully this helps! Even if not, I hope you can find a solution soon.

Oh, right; we can talk about the hardware side of things too: make sure your computer’s fans are clean and that your computer is well ventilated (without much stuff around it). Some laptops need to be set flat on a desk. There are cooling stands as well (some passive and some even active, with additional fans).

I had similar problems with NVIDIA PRIME, like my GPU not being available anymore after a while.
I solved it by running the nvidia-persistenced service (I start it when I start a darktable session).
It prevents the driver from releasing the GPU state when the GPU is not in use.
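On a systemd distro it is usually available as a service; this is roughly how I start it (the unit name may vary with the driver packaging):

```
# Start the persistence daemon for the current session
sudo systemctl start nvidia-persistenced

# Or enable it permanently so it starts at every boot
sudo systemctl enable --now nvidia-persistenced
```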

I think I know what is causing this, @anon41087856: I am experimenting with the new highlights reconstruction in filmic. If I set filmic’s highlights reconstruction back to its default settings, the speed is back. It’s the same on Debian. Apparently it has nothing to do with heat or throttling.

Filmic’s highlights reconstruction basically performs one wavelet decomposition for each iteration of reconstruction, so it is indeed much heavier. However, the highlights threshold for reconstruction is a user parameter, and if no pixel is found above that threshold, the reconstruction is skipped entirely. So, to disable reconstruction and restore performance, you only need to set the highlights threshold to the maximum value. You can also display the highlight clipping mask to see which pixels are considered for reconstruction.

Result is good though…

Cool. So it’s a trade-off. Good things come at a price :wink:


@betazoid, it might still be worth delving into the slowness. I’ve been using filmic v4 quite a lot and never had the severe slowness you mention. The reconstruction clearly takes a while, but it is not a big problem for me. The system monitor shows memory use reaching 12.2 GB on my system, so is it possible a lot of swapping is happening on yours? (Though the 12.2 GB is with a 50 Mpx raw.)
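(Easy to check from a terminal while darktable is busy, e.g.:)

```
# Show RAM and swap usage
free -h

# The si/so columns show swap-in/swap-out activity per second
vmstat 1
```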

One thing I’ve been trying where there are burnt-out areas is to do most of the filmic adjustments with iterations set to zero, for speed, then finally set 2 iterations to get the best result, hopefully. Exporting with two iterations takes about twice as long as with none.

I’ve yet to try the “add noise in highlights” feature…