Darktable hangs in lighttable > import > either option

It has been doing so since I first used it after updating Arch (currently kernel 6.2.7-arch1-1) on Saturday evening, so probably since Sunday or Monday.

There are lots of coincidences, so I’m not convinced it is a darktable problem; I’m simply trying here to narrow it down. I usually update Arch every Sunday but did so on Saturday this week, so whatever happened could have happened at any time during the past ten days or so. I note there was a DT release update 10 days ago.

But there have also been problems reported online regarding the GNOME Display Manager, my GUI handler of choice. Also, glitches can and do happen with a release update. So it could be Arch or the Linux kernel itself.

I waded through the system’s journal entries and couldn’t see anything obvious there, but then I’m no expert. There are hundreds of entries for a GDM glitch but nothing to say what it is affecting, other than being a nuisance.

Running darktable -d all printed all the entries for /usr/src/debug/darktable/darktable-4.2.1 without revealing any clues, other than this final line, which may or may not be useful:

66.065746 [sql] /usr/src/debug/darktable/darktable-4.2.1/src/common/database.c:3838, function dt_database_optimize(): exec "PRAGMA optimize"

Any useful pointers gratefully received. I’ve loaded thousands of images onto this, my newly built (end Nov - early Dec) midi tower PC and was happily pushing them into the library when this fault occurred.

The log line is your database getting optimized. Killing darktable while that is running isn’t great. Depending on the size of your db it could take quite a few minutes.

You can turn off or change the optimization in the preferences.
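If you suspect the database itself rather than the optimize step, sqlite can check it directly. A sketch, assuming the sqlite3 command-line tool is installed and darktable is closed; it is demonstrated here on a throwaway database, but pointed at ~/.config/darktable/library.db it performs the same check on the real one:

```shell
db=$(mktemp -u).db                               # stand-in for library.db
sqlite3 "$db" "CREATE TABLE demo(id INTEGER);"   # create a minimal valid db
sqlite3 "$db" "PRAGMA integrity_check;"          # prints "ok" when healthy
```

If the check reports errors rather than “ok”, a hang during “PRAGMA optimize” would make more sense.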

Thanks @paperdigits. I wanted to add images to the darktable library, not to kill darktable, so this still leaves me bemused. Nothing like it has ever occurred in the past with DT.

I have preferences > storage > database set to once a week, so it should be happening weekly, perhaps? And not timed to coincide with when I want to import another batch of images?

The system only hangs in that one instance: a click on the import facility. Clicking on the darktable icon at the top of the screen, then quit, brings the screen back to life, so I can do anything else in DT. I simply can’t use the import facility.

I have subsequently discovered that I can add them one at a time by using the system’s file manager to open them in DT, and they stay there, within the DT library, all in the same date-oriented folder. All the darkroom modules work perfectly well with images imported that way. But trying to open multiple images simultaneously that way crashes DT.

My suspicion remains that the problem is either with Gnome and its display manager or, less likely, with the compilation of DT v4.2.1 for inclusion in the Arch repos. Or even with GDM’s packaging for Arch.

Having said all that, I can’t find any evidence of recent similar problems being reported in any of those either.

It’s hard to tell from a single line of the log file whether the db optimization completed or not.

I’m not an arch nor GNOME nor gdm user, so I can’t comment on those.


Having had a test run on a couple of other machines, I am none the wiser.

The midi tower this machine replaced runs on Arch with Gnome as the GUI. DT would not start until I clicked on the choice given to update the database rather than to exit. I ran pacman to update Arch before attempting to start anything.

My laptop also runs Arch but has Xfce as its GUI. It gets updated at the same time as this PC, i.e. every Sunday. The “import” facility runs flawlessly.

I use Rapid Photo Downloader to take images from the camera, but I can’t see that it would have any bearing on this problem. Most of the images I am trying to “import” - enter into the DT library without moving them - have been copied from the old PC, including their sidecar files.

My suspicion is turning to a corrupted DT database on this PC. Does anybody here have any pointers as to where I can look to find out how to fix it please?

You should have snapshots of the database saved in the config directory. Create copies and rename them to .db to test.
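In shell terms that looks roughly like this. It is a sketch run in a throwaway directory so nothing real is touched; in practice you would do the copies inside ~/.config/darktable with darktable closed, and 1685000000 is a placeholder for the timestamp in your own snapshot filenames:

```shell
cfg=$(mktemp -d)                                  # stand-in for ~/.config/darktable
echo "current"  > "$cfg/library.db"
echo "snapshot" > "$cfg/library.db-snp-1685000000"
cp "$cfg/library.db" "$cfg/library.db.broken"     # keep the suspect copy
cp "$cfg/library.db-snp-1685000000" "$cfg/library.db"
cat "$cfg/library.db"                             # prints "snapshot"
```

Do the same for data.db and its snapshots.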


Thanks @g-man. On listing ~/.config/darktable the contents looked a bit complicated, so I moved that directory to ~/.config/darktable.orig and then opened DT in the GUI. It’s all working again.

I need to re-import everything, as I am not sure which of the files I could safely move back into the newly system-created ~/.config/darktable/ - would that be everything showing as created before 19th March, or is there some way of identifying which file caused the problem in the first place?

This is recurring far too often. The only way I have found to recover DT is to delete the .config/darktable directory. So on each occasion I have to manually re-import each reel, one folder at a time. Time-consuming, frustrating, and with no sign of a solution in sight.

Meanwhile, I’d love to continue with DT if possible. There are already data.db and library.db files in .config/darktable, but it is not clear when snapshots are taken. Clearly two snapshot files are written at the same time, data.db-snp-{unix timestamp} and snapshot.db-snp-{unix timestamp}, the latest entry for each being on 4th June. There is one entry for each in May and two in April.
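Decoding the number after -snp- at least dates the snapshots, since it is a unix timestamp. Assuming GNU date (standard on Arch); 1685000000 is a placeholder, substitute the number from your own files:

```shell
date -u -d @1685000000    # prints the date and time the snapshot was taken (UTC)
```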

Other than that, I can’t make much sense of it. Any pointers? I need to process and to revisit raw files far too often to put up with this any longer. A method of backing up which is easy to restore when DT crashes would help.

The files including sidecars are all there. It is linking them back into DT which burns the hours away.
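In the meantime, the bluntest backup I can think of is tarring the whole config directory before each session, so a crash becomes a one-command restore. A sketch on a throwaway tree standing in for ~/.config; for real use, replace $root/.config with ~/.config and run it with darktable closed:

```shell
root=$(mktemp -d)
mkdir -p "$root/.config/darktable"                # stand-in for ~/.config/darktable
echo "db" > "$root/.config/darktable/library.db"
tar czf "$root/dt-config-$(date +%F).tar.gz" -C "$root/.config" darktable
tar tzf "$root"/dt-config-*.tar.gz                # archive contains darktable/library.db
# restore later with: tar xzf dt-config-<date>.tar.gz -C ~/.config
```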

Are you sure your disk isn’t dying, or that you don’t have bad RAM?

What is recurring far too often?

Darktable hangs in lighttable > import > either option


You should try to narrow down which file is causing your issues. Is it the configuration file darktablerc or one of the databases (data.db, library.db), or …?
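One way to do that, since the broken config was moved aside earlier (to ~/.config/darktable.orig): restore one file at a time into the fresh config, starting darktable between each restore. Sketched here with throwaway directories standing in for the real paths:

```shell
orig=$(mktemp -d)                 # stand-in for ~/.config/darktable.orig
fresh=$(mktemp -d)                # stand-in for the regenerated ~/.config/darktable
echo "settings" > "$orig/darktablerc"
cp "$orig/darktablerc" "$fresh/"  # settings first; databases on later rounds
ls "$fresh"                       # prints "darktablerc"
# now start darktable: if import still works, copy back data.db and retest,
# then library.db, until the culprit shows itself
```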

See darktable 4.6 user manual - storage (create database snapshot). For me it works exactly as described.

Thought I’d give an update. Yes, it’s been a long time but I have not been well and it took a lot of time and research.

DT now runs cleanly with OpenCL since this morning. The problem was that I misread the guidance on the Arch GPGPU wiki page, which is very clear; my own fault. The crux of how it was done is in this post, which may help anybody intending to run DT with an AMD GPU on Arch or any other distro.


Sorry folks, it’s back to square one. After making the above post I went back in to give the OpenCL version a try. The first film roll opened with some blacked out images and they were all blacked out in Darkroom. So I tried a few more and ditto.

The images are intact but they don’t show in the Darkroom workspace so I can’t do anything with them. Back to running DT with the --disable-opencl flag and even then it is flaky at best.

I really don’t have time for any more pain. Problems running DT on an AMD GPU are splattered all over the web, across several distros. AMD are obviously, and contrary to their public statements, not at all interested in whether their hardware works with Linux.

Without OpenCL, DT is the best. Is there a fork which has not jumped the gun by introducing OpenCL long before it is fit for purpose? Otherwise I’ll give RT a try.

Thanks,

Mike

Something doesn’t quite add up here…

Indeed it doesn’t. I have images to process and I need to keep processing them whilst looking for another application and testing what I find. Unless it is a direct clone of DT but without the klutz that is DT with OpenCL on an AMD GPU.

The problem started when the new DT release came out tweaked for OpenCL. IIRC that was very early this year or very late last year. No options. The changes were there whether or not the user installed OpenCL. It broke my system and I have lost many hours, adding up to days, trying to sort it all out. Too much time in fact. The Arch devs and support staff don’t know what to do. I can’t see anything coming from DT. AMD support is totally absent so far as I can tell.

It does indeed seem to be an AMD problem. But that’s not a problem darktable can solve (nor is it “using openCL before it’s ready”). NVidia has worked (for me at least) for several years with very few problems (and those were of my own making…)

Second, not many others report problems when running without openCL.

The easiest solution (if that doesn’t break other programs) would be to uninstall the (AMD) openCL libraries, perhaps? Not the drivers, just the openCL part.

And remove the openCL kernels dt compiled (just to be sure). Those should be under ~/.cache/darktable, in directories named “cached_kernels…” or “cached_V1_kernels…” (note: ~/.cache is normally hidden!).
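For the kernel cache, something like this; demonstrated on a throwaway directory rather than the real ~/.cache/darktable (dt recompiles the kernels at the next start, so nothing is lost):

```shell
cache=$(mktemp -d)                        # stand-in for ~/.cache/darktable
mkdir -p "$cache/cached_v1_kernels_demo"  # example compiled-kernel directory
rm -rf "$cache"/cached_*kernels*          # matches both naming variants
ls "$cache" | wc -l                       # prints 0
```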

As for a DT version without openCL support, you may have to compile that yourself.


If only the manual had a section about opencl…

You can start with
darktable --disable-opencl

Or just turn it off in preferences

https://darktable-org.github.io/dtdocs/en/special-topics/opencl/still-doesnt-work/


Thanks for the response. Correct, the DT devs can do nothing about it. What surprised me was the introduction into the code without first testing on what is one of the most widely used GPUs in the world today, and has been for years.

Removing rusticl, which made things worse, not better, is an immediate priority. That should give me enough of an improvement while I look for an alternative.

I’ve become all too familiar with the hidden user cache and config directories in Linux over many years, too. The problem with fiddling with those is that Arch, my distro of choice, was chosen by me precisely because it is a rolling distro and therefore permanently up to date, so my weekly updates would overwrite at least some of my changes.

@g-man over the past few months I’ve also become all too familiar with the CL flags available for the darktable command.

Thanks for the reminder of the docs pages on github. I’ll give that one a more thorough read when I get time. It’s a while since I first read it and didn’t find much of use then. Once again though, I don’t want to spend too much time keeping software running instead of processing images. I’ve done far too much of that already.

I was under the impression AP had started a branch some time ago. How is that progressing? It may be worth a look if it’s still going. I’ve also seen good reports of RawTherapee so will take a look at that.