Problem exporting

I’m on my cell phone this week due to work travel. Reading the log on a phone can be a challenge.

I see the exposure module taking 10 s and it is using tiling. This hints at a low-memory scenario: the image is broken down into multiple tiles, each tile is processed, and then the results from the tiles are stitched together.
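Conceptually, tiling looks something like this (a minimal sketch with a stand-in pixel operation and an arbitrary tile size, not darktable’s actual implementation):

```python
import numpy as np

def process(tile):
    # Stand-in for an expensive pixel operation (e.g. exposure).
    return tile * 0.5

def process_tiled(image, tile_size=256):
    """Split the image into tiles, process each, stitch results back."""
    out = np.empty_like(image)
    h, w = image.shape[:2]
    for y in range(0, h, tile_size):
        for x in range(0, w, tile_size):
            tile = image[y:y + tile_size, x:x + tile_size]
            out[y:y + tile_size, x:x + tile_size] = process(tile)
    return out

image = np.ones((1000, 1500), dtype=np.float32)
result = process_tiled(image)
# Same result as processing the whole image at once,
# but peak memory is bounded by the tile size.
assert np.array_equal(result, process(image))
```

The upside is a bounded memory footprint; the downside, as discussed below, is that modules with a large spatial radius must pad each tile, so work gets repeated.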

It also looks like the log stops once it starts processing the D&S (diffuse or sharpen) module. I assume the process is still running in the background. Try letting the system process for 10-15 minutes. You can also disable OpenCL.

Can you post the output of darktable -d opencl -d tiling?

Thank you for your help. I set it to export and left it running, and it was able to successfully create a JPG after 3 hours. I asked another photographer with a better computer (including 64 GB of RAM) and it took 15 minutes to export. Another photo took two hours on my computer, so I agree that memory is the limiting factor on my machine.

Even 15 minutes seems extreme… I’d be curious to see that XMP file to find out what might be taking so long…

I think I have a similar problem in a different environment.
I have been using dt 4.4.0 on Linux Mint for about a year. Under both Windows 10 and Linux I had (and still have) problems while exporting.
I select several raws and start exporting. It begins, and after a few exports dt stops exporting and quits. Not always after the same number of pictures. Sometimes I cannot continue: the system always hangs on the same raw. No error message or anything else, and a restart doesn’t change anything.
At first I would often redevelop the raw. Over the last few weeks I have the impression that this error occurs when cropping with free proportions. When I put a frame around the picture with the frame module, the export runs without problems.
But this is only my impression, not verified. I should keep observing this.

15 minutes?! My exports usually take seconds. With OpenCL disabled, that would mean a minute or two.

I’m going to assume you have a large raw file plus not enough video memory, thus forcing a lot of tiles.

In this case, you should try with OpenCL off (for example, by starting darktable with --disable-opencl) so that only the CPU path is used.

According to the log uploaded earlier, it’s a 24 MPx image:

30.4241 [dt_imageio_export] [export] imgid 14550, 6064x4040 --> 6064x4040 (scale 1.000000). upscale=yes, hq=yes
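For scale, a rough back-of-the-envelope calculation for that image (assuming darktable’s usual 4-channel, 32-bit float working buffers; the numbers are purely illustrative):

```python
# Dimensions from the export log line above.
width, height = 6064, 4040
channels, bytes_per_value = 4, 4  # 4 x 32-bit float per pixel

# Size of one full-resolution working buffer.
buffer_mb = width * height * channels * bytes_per_value / 2**20
print(f"{buffer_mb:.0f} MB per full-size buffer")  # ≈374 MB
```

A module that needs input and output buffers plus a few scratch buffers can approach a couple of GB, so a GPU with limited free memory starts tiling quickly even on a 24 MPx image.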

The pipeline steps are nothing extraordinary, based on the completed thumbnail pipeline in the log.


pic_1.NEF (28.1 MB)
pic_1.NEF.xmp (19.1 KB)

I have uploaded an example that took over two hours to finish exporting when the diffuse or sharpen module is used. I am curious if it is a result of how I edit the photo or if others have similar export times?

CC BY-NC-SA 4.0

Unfortunately, the raw image file hasn’t been uploaded correctly, so I’m unable to download it to check.
The xmp file is OK.

Thanks for letting me know, I have uploaded it again.
I also tried with the OpenCL setting disabled, and was able to export a photo with diffuse & sharpen module in 4 minutes, rather than multiple hours. Success!

Linux, optimised build from the master branch, Ryzen 5 5600, 64 GB RAM, NVidia 1060 / 6 GB:

With OpenCL:

63.9143 [dev_process_export] pixel pipeline processing took 5.796 secs (5.867 CPU)
...
 [opencl_summary_statistics] device 'NVIDIA CUDA NVIDIA GeForce GTX 1060 6GB' (0): peak memory usage 4422928000 bytes (4218.0 MB)

Without OpenCL:

49.0822 [dev_process_export] pixel pipeline processing took 16.135 secs (168.376 CPU)

Windows laptop, one of the pre-built binaries (either a ‘weekly’ build or the nightly snapshot from GitLab, I don’t remember). i5-10210U, 16 GB RAM, no dedicated GPU (only the ‘Intel UHD Graphics’ on the CPU). Note that the Linux / NVidia OpenCL numbers above were achieved with tuned settings (all operations on the GPU, lots of memory dedicated to darktable’s processing), while those below use default settings for everything, as I don’t really use darktable on the laptop.

With OpenCL (clearly not worth it):

328.2157 [dev_process_export] pixel pipeline processing took 153.791 secs (130.266 CPU)

Without OpenCL:

116.7420 [dev_process_export] pixel pipeline processing took 47.722 secs (312.781 CPU)

The large radius used in diffuse or sharpen means that if tiling (darktable’s solution for low memory situations) kicks in, a lot of calculations are repeated over and over. In such situations, the best performance is achieved by reducing tiling (using more memory), so if the GPU is starved for memory, but the CPU is not, the latter will win.
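To see why the radius matters so much: each tile has to be padded by the filter radius on every side, and those padded halos are recomputed for neighbouring tiles. A quick sketch of the redundant work (hypothetical tile sizes and radius, not measured from darktable):

```python
import math

def tiling_overhead(width, height, tile, radius):
    """Ratio of pixels actually processed (each tile padded by `radius`
    on all sides) to pixels in the image. 1.0 means no redundant work."""
    nx = math.ceil(width / tile)
    ny = math.ceil(height / tile)
    padded = (tile + 2 * radius) ** 2  # tile grows by the halo on every side
    return nx * ny * padded / (width * height)

# 6064x4040 image (as in the log above), filter radius 256:
# plenty of memory -> large tiles, modest overhead
print(tiling_overhead(6064, 4040, 2048, 256))  # ≈1.6
# starved GPU -> small tiles, the halo dominates
print(tiling_overhead(6064, 4040, 512, 256))   # ≈4.1
```

With small tiles the halo is as large as the tile itself, so the GPU ends up doing several times the work of an untiled CPU run, which is how the CPU path can come out ahead.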