DT 2.4.0 taking a really long time to export jpgs.

I’m sorry, but contrary to what people might think, we devs don’t possess mind-reading skills.
You did not specify which video card you have, so it is impossible to answer.
But even then, likely no; strapping a rocket engine to a bicycle will not magically make it faster.

Hi, sorry again, I thought it was in my lshw paste above, but it wasn’t. There is only the on-board integrated Intel graphics:

    *-display
         description: VGA compatible controller
         product: Haswell-ULT Integrated Graphics Controller
         vendor: Intel Corporation
         physical id: 2
         bus info: pci@0000:00:02.0
         version: 09
         width: 64 bits
         clock: 33MHz
         capabilities: vga_controller bus_master cap_list rom
         configuration: driver=i915 latency=0
         resources: irq:43 memory:e0000000-e03fffff memory:d0000000-dfffffff ioport:1800(size=64) memory:c0000-dffff

Is the ultimate answer simply that darktable 2.4.0 will be slow on this computer? What changed from previous versions to make this so? I can’t remember exactly which previous version I was using, but exports took maybe a minute or two for the same kinds of raw files on this same computer…

Then nope, I believe it will not be any faster.

It will depend on the exact history stack and images. Do follow the last portion of my first comment.

It isn’t helping that your computer is named peppy? :wink:

Ok, thank you! That gives me some things to go on for now. I will try these out in the next couple of days (have some other things to work on for the moment), and will report back about any improvements I am able to make.

@paperdigits Lol! “Peppy” is the code name for this venerable Acer C-720 Chromebook, which was really the first model onto which we were able to port a fully functioning native Linux installation (and not simply one run through the “crouton” chroot script). It’s still going after 5 years!! I guess it is time for an upgrade, though. Thinking very heavily of purchasing the Alpha Centurion Nano


Oh, sorry, I missed your post while I was writing other posts, and I did not see this! Yes, I am trying to use the AMaZE demosaicking routine. I guess I should stick to a simpler one on this machine, and will kill all other procs.

EDIT: it looks like the longest time and largest number of tiles was for “atrous,” which I don’t know anything about. It’s not something I believe I enabled…
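(For anyone wondering where those per-module times come from: darktable prints them to the terminal when launched with performance debugging enabled via `-d perf`. The log lines below are illustrative samples in the style of that output, not verbatim darktable logs, and the exact format varies between versions.)

```shell
# Launch darktable with performance debugging, export once, then quit:
#   darktable -d perf 2>perf.log
# Illustrative sample lines only (real format varies by version):
cat > perf.log <<'EOF'
[dev_pixelpipe] took 1.234 secs (2.345 CPU) processed `demosaic' on CPU
[dev_pixelpipe] took 98.765 secs (190.123 CPU) processed `atrous' on CPU with tiling
EOF
# List the slowest modules first (field 3 is the wall-clock seconds):
grep 'took' perf.log | sort -k3,3 -gr | head
```

Sorting on the elapsed-seconds field makes the heaviest module (here, atrous) stand out immediately.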

That is the equalizer. It’s one of the heaviest, hungriest modules, so that’s not surprising at all.

Aha!! That is really helpful! I have been using the equalizer recently (after reading some threads about it here on pixls), and I never used to use it before. I did not realize it was so intensive. I will go back to using “local contrast” and simpler noise removal and sharpening. Thanks!

Note that in local contrast, the mode = local laplacian filter is comparably heavy.


More than 7 minutes for a less-than-20-MPixel file? That’s really a lot of processing time…
Or did I miss something?

Thanks! This is all very helpful… For old hardware, I guess it’s very important to know which tools are heavy, and which are light.

@heckflosse Yes, more than 7 minutes for 20 MPixel. But I guess my hardware is barely adequate for this. :frowning:

I, too, have been pondering export times in darktable. Getting OpenCL working on my hardware made a significant difference, as did selecting “multiple GPUs” in the OpenCL options.

I, too, have recently started using Equalizer, after this discussion on goto modules.

What has puzzled me the most is watching a file manager window of the folder into which I am exporting, and seeing the same jpeg produced multiple times: a slow crawl up to 30MB or so, then starting again. I’m not sure whether this is due to:

  1. Using the equalizer module

  2. Setting “multiple GPUs” in my opencl options

  3. Or some other reason, like the jpeg export quality setting (which I have at 96%)

Is any darktable expert able to shed some light on multiple exports of the same jpeg?

A Chromebook isn’t exactly a powerhouse to begin with, either.

@darix Yes, I agree. But the initial reason I started this thread is that previously, export times were reasonable on exactly the same hardware. What I didn’t realize was the quite severe computational requirements of some of the tools that I have recently added to my processing pipeline due to the very excellent results I’ve seen them capable of (like in the thread linked by @martin.scharnke above). I do imagine, being in the FOSS world as we are, that there are probably a lot of people out there across the world on older hardware who would like to use a tool like darktable because of, among other things, their financial situation. While I am in the position to be able to upgrade (and indeed I likely will soon), many folks are not. So, it’s really good to know how to use darktable efficiently on underpowered hardware, and also good to know which tools one should avoid in those situations. This thread has been very informative for me in this regard.


I have never seen anything like that and am not aware of reports similar to yours.

  • What operating system are you using?
  • Could you please have a look at the terminal (or in the logs when on Windows) to see if darktable reports having exported the image several times?

Just a wild guess: it might be that the configured tile size is too big, so instead of using tiling to keep everything in RAM the system starts swapping.
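(To put rough numbers on that guess: darktable’s pixelpipe works on 4-channel float buffers, so a single full-resolution buffer for a ~20 MP image is already around 300 MB. This is back-of-the-envelope arithmetic, not darktable’s exact accounting:)

```shell
# One full-resolution pipeline buffer: ~20 MP x 4 channels x 4 bytes per float,
# converted to MiB (rough estimate only):
echo "$(( 20000000 * 4 * 4 / 1024 / 1024 )) MB"
# prints "305 MB"
```

A few intermediate buffers of that size easily exhaust a 2–4 GB machine, which is exactly when tiling (or, failing that, swap) kicks in.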

Ah - thanks; I have discovered that although I have assigned half my RAM (8 GB of 16) in “host memory limit in MB for tiling”, the “minimum amount of memory in MB for a single buffer in tiling” takes precedence - at least, that is what the tool top says if it is set to a positive non-zero number.

The trouble is, I cannot set this value to zero (or a negative value). Trying to overwrite it, or using the plus/minus buttons, I cannot get a value lower than 2. How can I override this?

Maybe I should just set it to a huge value - 1024 MB?

No, no greater than 64 MB allowed.
And no less than 2 MB.
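(For reference, both sliders correspond to entries in darktablerc; the key names below are my reading of darktable 2.4’s config file, so verify them against your own copy before editing, and only change them with darktable closed. The values shown are examples, not recommendations:)

```
# ~/.config/darktable/darktablerc -- tiling-related keys (names assumed from
# darktable 2.4; check your own file, as defaults and bounds vary by version)
host_memory_limit=8192
singlebuffer_limit=16
```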

Hmmm…

That is what I wild-guessed as well. Anyway, I don’t use dt much, but out of curiosity I have a copy handy on my Win7 system (4 GB RAM). I tried the equalizer module on DSCF1719.RAF (PlayRaw) using a preset with the “normal bounded” blend mode as the only history item. Sure enough, the fan whirls and the processing takes a long time to complete. My fan never turns on using other apps that we talk about on this forum. I just want to confirm that @Isaac isn’t the only one encountering this problem, though for reasons I mentioned in another thread, I won’t be able to investigate further any time soon.


Thanks @afre for the confirmation! I will be examining a good processing toolchain in darktable for underpowered machines. I might write a short blog post about it if I get time…