darktable speed (in general, and when using two monitors)

Yes

So I really need your help with removing the program.

Yesterday I downloaded the Phoronix Test Suite and ran some batch files. A cmd window popped up, did some things and closed down. I also installed Cygwin (some kind of Linux emulator?). I want to completely remove all of these components from my system, because I have zero faith in this and I’m unfamiliar with all of these things. The only reason I downloaded and messed around with those programs was to “help” you guys get some test results. Apparently I’m too stupid to figure this out, and the guidance in this thread is not enough to make it work.
I’m kind of fed up with this whole process right now, so I’d like to know how I can remove EVERYTHING that got installed yesterday without having to do a system restore (because that will really piss me off).

please help.

Greetings,

Rico

I tried one final time:
System cannot find the path.

It seems you didn’t put the quotes that @Claes used.


Unfortunately I tried both, and neither worked.

When I enter the \users\my name\desktop\bench path and then type the Program Files\darktable\bin etc. command, it just doesn’t work.

The same goes for just using cd and starting with C: etc. Neither option works. I can access the paths separately, so up to the bin folder, but then nothing works anymore.

I can run darktable-cli from inside the darktable bin folder itself in Windows, but I can’t run it in cmd. There is no way for me to try out that bench thing either.

Guess I need actual screenshots and/or a tutorial video from someone showing me the exact same steps. Otherwise I’ll stop trying.

Darktable is extremely slow while running OBS and editing a RAW file.
It’s pretty alright if I edit a JPG.
It’s more than fine if I close OBS and edit a RAW file, but then I don’t have anything to capture my screen with.

By quotes I mean actual quotation marks. Because the file path has a space in it, Windows thinks that ‘C:\Program’ is the application and ‘Files\darktable\bin\darktable-cli’ is the first argument to the app. To get around this you need to wrap the file path in quotes if it has spaces:

    "C:\Program Files\darktable\bin\darktable-cli" bench.SRW test.jpg --core -d perf -d opencl
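
For example, a full session could look like this (a sketch assuming the benchmark files sit in a bench folder on your desktop, as described earlier in the thread; replace <your name> with your actual user name):

    cd "C:\Users\<your name>\Desktop\bench"
    "C:\Program Files\darktable\bin\darktable-cli" bench.SRW test.jpg --core -d perf -d opencl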


I considered @aadm’s speedup suggestion and wondered if it was at least partially related to nuking darktable’s configs.

As it would take several days for me to reimport all of my photos, I really didn’t want to wipe out my database and start over.

I’m very happy to report that, at least in my case, I found a few major speedups that helped me immensely (listed most likely in order of speed wins; a combined sketch follows the list):

  • Back up your current library, then defrag and compact the database with vacuum (mine went from 201 MiB to 177 MiB):

    cp library.db library-`date -I`.db
    sqlite3 library.db 'VACUUM;'
    
  • Move darktable configs out of the way (while darktable isn’t running) and let darktable create new ones by running it again afterward (this moves it to a file with the current date, in ISO format):

    mv darktablerc darktablerc-`date -I`
    

    Having darktable recreate your configuration file means that you’ll lose a few settings. One of the most important is which modules are turned on and favorited. You can grep through your old configuration file and copy/paste the lines into your new darktablerc. (First make sure darktable is not running, of course.)

    Here’s a one-liner that will do this for you:

    grep -i -P "plugin.*(favorite|visible)=true" darktablerc-`date -I` >> darktablerc
    

    (If you want to preserve the disabled modules too, just remove the true part of the command above.)

    Pretty much everything else is a preference. You can compare the files and copy over the parts you want or just toggle it in the UI after starting darktable again.

  • Be selective about which Lua plugins you choose to enable.

    • In my case, I had image_path_in_ui turned on, and it seems to be extremely slow — probably checking the database and the file on every hover in lighttable mode. This was what made darktable almost unbearably slow when going through photos.
    • Instead, I found that there’s a relatively recently updated OpenInExplorer plugin which works not only on Windows but also on Linux (with Nautilus). As I wanted the path of the image to open it up in Nautilus anyway, this new one was a huge performance boost (it only runs when I want it to) and even made my life a little better too. :wink:
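
Putting the database and config steps together, here’s a minimal shell sketch (assuming the default config directory ~/.config/darktable on Linux; the date-stamped filenames are just the scheme used above, and darktable must be closed):

    # run with darktable closed, from the darktable config directory
    cd ~/.config/darktable

    # 1. back up, then defrag and compact the library database
    cp library.db "library-$(date -I).db"
    sqlite3 library.db 'VACUUM;'

    # 2. move the config aside; darktable recreates it on next start
    mv darktablerc "darktablerc-$(date -I)"

    # 3. after starting and quitting darktable once, carry over the
    #    module visibility/favorite settings from the old config
    grep -i -P "plugin.*(favorite|visible)=true" "darktablerc-$(date -I)" >> darktablerc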

Happy photo editing, everyone!


Good detective work, @garrettn!

I recently thought my darktable was running a bit slow. My editing machine is not connected to the network, and hasn’t been for over a year. I had plugged it in and updated a few weeks ago. Turns out my slowness was coming from the latest speculative execution patches to the kernel. :frowning:
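
If you want to check what your own kernel is mitigating, recent Linux kernels expose this through sysfs (a standard kernel interface, nothing darktable-specific):

    # each file names one vulnerability and the mitigation in effect
    grep . /sys/devices/system/cpu/vulnerabilities/*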

Figured I would add some results to this just because it is easy to do :smiley:

These results are from a clean install of everything, testing out Ubuntu 19.04 and running the darktable 2.6.2 PPA. No files imported yet, so I assume I’m running a clean database minus a few settings tweaks.

My results seem quite comparable to what others have found.

Ryzen 7 2700 @ 3.20 GHz (stock clock speed)
Nvidia 980 Ti using the latest 418 driver in Ubuntu
16 GB RAM running in dual channel, clocked at 3000 MHz

OpenCL:      3.307 secs
CPU Only:   13.110 secs

I know I’m late to the party on this thread, but this really caught my eye with DT. As background, I have been a long-time user of RawTherapee and have been using DT a lot of late because of the masking tools etc. that RT lacks.

While using DT generally isn’t too bad, one thing I really noticed was the lack of fluidity or general smoothness when using the curve tools in DT compared to RT. Having used DT quite a bit, I found that for many images I didn’t really need the mask tools, so I moved back to RT and found adjusting the tone curve there a sheer delight.

For what it’s worth, I’ve been using DT both with OpenCL and without it and found the same general “roughness” in the experience of adjusting with the curve modules.

Peter.


@plaven
Never too late. I am also keen on performance…

What is that “;” after VACUUM used for?

VACUUM is an SQL command; the semicolon (;) is simply the terminator that all SQL statements need.
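
The semicolon’s role is easier to see when several statements are passed to sqlite3 at once; for example (PRAGMA page_count is a real SQLite command, and this is best tried on a backup copy):

    # print the database page count before and after compacting
    sqlite3 library.db 'PRAGMA page_count; VACUUM; PRAGMA page_count;'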

GTX 1060 and Threadripper 1950X, 64 GB RAM, Ubuntu 18.04.

GPU: 5,886015 [dev_process_export] pixel pipeline processing took 5,206 secs (49,141 CPU)
CPU: 9,813531 [dev_process_export] pixel pipeline processing took 9,248 secs (242,660 CPU)

I changed opencl_memory_headroom from 300 to 1200 and got:
5,198685 [dev_process_export] pixel pipeline processing took 4,478 secs (22,712 CPU)
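
In case anyone wants to try the same tweak: opencl_memory_headroom is a key in darktablerc, which should be edited while darktable is closed. A one-liner sketch, assuming the default config location on Linux:

    # set the OpenCL memory headroom to 1200
    sed -i 's/^opencl_memory_headroom=.*/opencl_memory_headroom=1200/' ~/.config/darktable/darktablerc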


Digging up this old thread. Hope the benchmark is still relevant for DT 3.0.1.

I have a 3rd-gen i5, 16 GB RAM and a GTX 1050 Ti. I got 14.358 s with OpenCL and 36.189 s using the CPU only.

(Screenshots of the OpenCL and CPU-only runs were attached here.)

Is adding a faster GPU likely to make a significant difference? Something like a 2060, maybe? Currently I find the slowest part of DT is parametric masks.

Yes.

Last time I clocked darktable using a GTX-1050 and a Ryzen 2700X CPU, I got this result:

7.540 seconds with openCL
11.020 seconds without openCL

Since then I have upgraded the machine to a GTX-1660 Ti and a Ryzen 3900X.
Now I get this result:

2.781 seconds with openCL
8.521 seconds without openCL

Have fun!
Claes in Lund, Sweden

Since I have noticed a marked improvement in the results using the latest darktable 3.0.1, especially with regard to CPU-only processing, I will add these results to the thread.

I’m using the same laptop as in my original post up here (Dell XPS 15, i7-7700HQ @ 2.8 GHz, GeForce GTX 1050, 16 GB RAM, 512 GB SSD):

With GPU:

$ darktable-cli bench.SRW test.jpg --core -d perf -d opencl
[...]
12,562578 [dev_process_export] pixel pipeline processing took 11,683 secs (41,596 CPU)
12,958019 [opencl_summary_statistics] device 'GeForce GTX 1050' (0): 543 out of 544 events were successful and 1 events lost

Only CPU:

$ darktable-cli bench.SRW test.jpg --core -d perf --disable-opencl
[...]
23,373814 [dev_process_export] pixel pipeline processing took 22,616 secs (173,909 CPU)

In summary (old results in brackets):

  • CPU: 22.616 s (80 s)
  • GPU: 11.683 s (13 s)

That’s a sweet upgrade.

Has anyone checked the diff made by only changing the gpu?

Has anyone checked the diff made by only changing the gpu?

Of course!

Using GTX-1050 = 7.540/11.020 seconds (with OpenCL/without OpenCL)
Using GTX-1660 = 3.078/11.091 seconds (with OpenCL/without OpenCL)

I have no discrete GPU, only a 4th-gen i5 with Intel HD 500 graphics, and on Linux I have just seen a significant speedup by changing the kernel release from 5.3.x to 5.4.x. I also tried 5.5.x (better than 5.3) but found 5.4.x the fastest with darktable. If you use Linux, considering which kernel you use also seems worthwhile. Of course, this is just my experience, so it could be different on a different PC.
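
If you want to compare kernels the same way, you can confirm which release you are currently running with:

    uname -r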

I have the impression that most performance issues discussed here focus on the processing times of the pixel pipeline.
Does anyone experience UI lags during processing in the darkroom?
Especially on complex edits, as soon as a parameter of some module is changed, the whole GUI becomes very unresponsive or freezes until the processing is done. This makes editing images quite frustrating, because even simple adjustments like dragging a slider or curve nodes are next to impossible.
Is it only me who is experiencing this? I wonder if I’m doing something wrong.
(Using the current master build on Arch; tried with and without OpenCL.)
I’ve also created an issue in the bug tracker in case someone is interested: