Darktable hangs in lighttable > import > either option

Thanks @g-man. On listing ~/.config/darktable the contents looked a bit complicated, so I moved that directory to ~/.config/darktable.orig and then opened DT from the GUI. It's all working again.

I now need to re-import everything, as I'm not sure which of the files I can safely move back into the newly created ~/.config/darktable/. Would that be everything showing as created before 19th March, or is there some way of identifying which file caused the problem in the first place?

This is recurring far too often. The only way I have found to recover DT is to delete the .config/darktable directory, so on each occasion I have to manually re-import each roll, one folder at a time. Time-consuming, frustrating, and with no sign of a solution in sight.

Meanwhile, I'd love to continue with DT if possible. There are already data.db and library.db files in .config/darktable, but it is not clear when snapshots are taken. A snapshot of each seems to be taken at the same time, data.db-snp-{unix timestamp} and snapshot.db-snp-{unix timestamp}, the latest entry for each being 4th June, with one entry for each in May and two in April.

Other than that, I can't make much sense of it. Any pointers? I process and revisit raw files far too often to put up with this any longer. A backup method that is easy to restore when DT crashes would help.

The files, including sidecars, are all there. It is linking them back into DT that burns the hours away.

Are you sure your disk isn't dying or that you don't have bad RAM?

What is recurring far too often?

Darktable hangs in lighttable > import > either option

You should try to narrow down which file is causing your issues. Is it the configuration file darktablerc or one of the databases (data.db, library.db), or …?
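
If the old directory is still sitting at ~/.config/darktable.orig, one way to narrow it down (just a sketch, assuming that path from the earlier post) is to copy the candidates back one at a time and relaunch dt between each step:

# with darktable closed, restore one file at a time and test
cp ~/.config/darktable.orig/darktablerc ~/.config/darktable/ && darktable
# if that still works, try the databases next, one by one
cp ~/.config/darktable.orig/data.db ~/.config/darktable/ && darktable
cp ~/.config/darktable.orig/library.db ~/.config/darktable/ && darktable

Whichever copy brings the hang back points at the culprit.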

See darktable 4.6 user manual - storage (create database snapshot). For me it works exactly as described.
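
If you want something independent of dt's own snapshots, a plain copy of the whole config directory before each session is easy to restore (a generic sketch, not a darktable feature; run it with darktable closed):

# back up the config directory before a session
cp -a ~/.config/darktable ~/.config/darktable.bak
# restore it after a crash
rm -rf ~/.config/darktable
cp -a ~/.config/darktable.bak ~/.config/darktable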

Thought I’d give an update. Yes, it’s been a long time but I have not been well and it took a lot of time and research.

As of this morning, DT runs cleanly with OpenCL. The problem was that I misread the guidance on the Arch GPGPU wiki page, which is very clear; my own fault. The crux of how it was done is in this post, which may help anybody intending to run DT with an AMD GPU on Arch or any other distro.

Sorry folks, it's back to square one. After making the above post I went back in to give the OpenCL version a try. The first film roll opened with some blacked-out images, and they were all blacked out in the darkroom view. I tried a few more rolls, and the same thing happened.

The images are intact, but they don't show in the darkroom view, so I can't do anything with them. I'm back to running DT with the --disable-opencl flag, and even then it is flaky at best.

I really don't have time for any more pain. Problems running DT on an AMD GPU are splattered all over the web, across several distros. AMD are obviously, and contrary to their public statements, not at all interested in whether their hardware works with Linux.

Without OpenCL, DT is the best. Is there a fork which has not jumped the gun by introducing OpenCL long before it is fit for purpose? Otherwise I'll give RT a try.

Thanks,

Mike

Something doesn’t quite add up here…

Indeed it doesn't. I have images to process, and I need to keep processing them whilst looking for another application and testing what I find, unless there is a direct clone of DT without the mess that is DT with OpenCL on an AMD GPU.

The problem started when the new DT release came out tweaked for OpenCL. IIRC that was very early this year or very late last year. No options. The changes were there whether or not the user installed OpenCL. It broke my system and I have lost many hours, adding up to days, trying to sort it all out. Too much time in fact. The Arch devs and support staff don’t know what to do. I can’t see anything coming from DT. AMD support is totally absent so far as I can tell.

It does indeed seem to be an AMD problem. But that's not a problem darktable can solve (nor is it a case of "using OpenCL before it's ready"). Nvidia has worked for me for several years with very few problems (and those were of my own making…).

Second, not many others report problems when running without OpenCL.

The easiest solution (if that doesn't break other programs) would perhaps be to uninstall the (AMD) OpenCL libraries? Not the drivers, just the OpenCL part.
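
To see what is actually installed before removing anything (a sketch; package names vary by distro):

# list the registered OpenCL ICDs
ls /etc/OpenCL/vendors/
# on Arch, search installed packages for OpenCL bits
pacman -Qs opencl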

And remove the OpenCL kernels dt compiled (just to be sure). Those should be under ~/.cache/darktable, in directories named "cached_kernels…" or "cached_V1_kernels…" (note: ~/.cache is normally hidden!).
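
For the cached kernels, something like this should do it (a sketch based on the directory names above; dt rebuilds the kernels on the next start):

# with darktable closed, remove the compiled OpenCL kernel caches
rm -rf ~/.cache/darktable/cached_*kernels*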

As for a DT version without OpenCL support, you may have to compile that yourself.

If only the manual had a section about opencl…

You can start with
darktable --disable-opencl

Or just turn it off in preferences.

https://darktable-org.github.io/dtdocs/en/special-topics/opencl/still-doesnt-work/
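
If the GUI is not reachable, the same switch can, as far as I know, also be flipped in the config file while darktable is closed:

# in ~/.config/darktable/darktablerc
opencl=FALSE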

Thanks for the response. Correct; the DT devs can do nothing about it. What surprised me was that the change was introduced into the code without first being tested on what is, and has been for years, one of the most widely used GPU families in the world.

Removing rusticl, which made things worse, not better, is an immediate priority. That should improve things enough while I look for an alternative.
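
The plan, roughly (a sketch; I still need to confirm the exact package name on my system):

# find out which package provides rusticl, then remove only that
pacman -Qs rusticl
sudo pacman -Rns opencl-rusticl-mesa   # name as reported above; may differ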

I've become all too familiar with the hidden user cache and config directories in Linux over many years, too. The problem with fiddling with those is that Arch, my distro of choice, is a rolling release and therefore permanently up to date, so my weekly updates would undo at least some of those changes.

@g-man, over the past few months I've also become all too familiar with the command-line flags available for darktable.

Thanks for the reminder of the docs pages on GitHub. I'll give that one a more thorough read when I get time. It's been a while since I first read it, and I didn't find much of use then. Once again though, I don't want to spend too much time keeping software running instead of processing images. I've done far too much of that already.

I was under the impression AP had started a fork some time ago. How is that progressing? It may be worth a look if it's still going. I've also seen good reports of RawTherapee, so I will take a look at that.

What? You are using drivers that are not production ready and then blame the developers for your choice?

If you know the flags, or how to disable OpenCL, why are you asking for a darktable without OpenCL?

Well, let me give my point of view. OpenCL has been in darktable for many years, so nothing was introduced in the timeframe you mention. In recent years we have identified a number of bugs in darktable's OpenCL code that were only evident with AMD drivers - but dt bugs, sure.

What has become a nuisance over the last year or so is that the AMD drivers are pretty "flaky", and some distros seem to be more of a problem than others (just search this forum; for some reason Arch and its derivatives are not the best choice at the moment).

Also - just as a reminder - current dt master has runtime selection of common drivers, so you can better test what works.

Last one: in more than 6 years of darktable usage and 4 years of dt development, I have had exactly 1 week with a malfunctioning system and 1 week with a problematic driver. Fedora and Nvidia, by the way.

Find him at https://ansel.photos/. But he also uses OpenCL. As others have said, darktable has used OpenCL for a very long time.

You can disable OpenCL in darktable or in your OS, switch to better drivers, or move to another app like RawTherapee or ART (both supported on this same forum).

I don’t even see where a specific driver has been mentioned.

Oh well, that's obviously the problem. Alpha drivers 🙂