Memory usage during export

Hi.
While exporting 35 images, I closed Firefox and noticed that darktable didn’t profit from the newly freed-up memory.
Is this by design? Shouldn’t it try to use all available memory dynamically?

In the picture above, the green indicator at the top represents memory usage. The drop corresponds to the Firefox shutdown. After that, darktable didn’t try to use the available memory.

I’d venture a guess that exporting an image is CPU/GPU bound, not memory bound, especially if your system has 16 GB RAM or more.

If you want to informally check that, do something CPU-intensive, like transcoding a video with ffmpeg, while trying to export.
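
For example (just a rough sketch; `input.mp4` and the output path are placeholders for whatever clip you have at hand), a software x264 encode at a slow preset will keep all cores busy while the export runs:

```
# CPU-intensive on purpose: software x264 encode at a slow preset
ffmpeg -i input.mp4 -c:v libx264 -preset veryslow -crf 23 /tmp/stress-test.mp4
```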

I forgot to mention: my system has 12 GB of RAM, and in my case it’s just CPU bound, because OpenCL is not enabled.
I’ll try that when I can, thanks.

I’d think that 12 GB of RAM is probably enough. It looks like DT exports one image at a time, which would explain why it doesn’t fill your RAM. From the terminal, you could use GNU parallel with darktable-cli if you want to use all your system resources :stuck_out_tongue:
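
Something along these lines (a sketch, not tested here; the `*.RAF` pattern is just an example, and I’m assuming `--core --library :memory:` to keep the parallel instances from contending for the library database lock):

```
# one darktable-cli export per CPU core, output filename derived from the input
parallel darktable-cli {} {.}.jpg --core --library :memory: ::: *.RAF
```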

Why should it?

To maximize the usage of available resources?

Memory isn’t like CPU, where one wants to go as fast as one can. You only malloc the memory you need, and no more…


I know of some products that start by allocating all the memory available and then freeing a small amount for system use. The results are often not pretty.

Often that means reducing memory (RAM) usage in order to increase L1/L2/L3 cache usage :wink:

OK, thanks guys for showing me how it works :+1:

Is this a bookmark for the dumbest question on the forum, by chance? :blush: