Darktable Database Backup Best Practice

Unfortunately, I don’t back up my darktable database (including the whole config folder) very often. I want to improve on that, but have no good idea how to handle the topic. File system backups (e.g. with rsync) would work of course, but I don’t think that is very efficient for the SQLite part, and Google hasn’t turned up a best practice for it so far. So I wonder how you handle this, and how you automate it. In particular: do you define a threshold of database changes (and how) before a backup is triggered, or do you just use timed backups? Do you check that darktable is not using the database during the backup? And how do you do the actual backup itself? Incrementally?

P.S.: I regularly backup the sidecar files and have them versioned in a git repository as well, but some features rely on database only, and the amount of pictures approaches the critical mass quickly …
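
One option for the SQLite part would be the sqlite3 CLI’s .backup command (it uses SQLite’s online backup API, so the copy stays consistent even if the file is open). A minimal sketch, assuming the default Linux config location and that sqlite3 is installed:

    # create a timestamped backup directory and copy both databases into it
    BACKUP_DIR=~/backups/darktable/$(date +%Y%m%d-%H%M%S)
    mkdir -p "$BACKUP_DIR"
    sqlite3 ~/.config/darktable/library.db ".backup '$BACKUP_DIR/library.db'"
    sqlite3 ~/.config/darktable/data.db    ".backup '$BACKUP_DIR/data.db'"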

I have a rolling backup of my whole home folder, which includes darktable configs.

Additionally when I see there has been an application update, I make manual backup.

Hm, I do full backups of my home folder as well, incremental rsync to external hdd, but I think it would not be bad to have something that happens more often, probably triggered by the amount of changes done rather than an arbitrary time schedule.

Maybe set up your darktable config folder as a git repository (git init; git add *; git commit -a should be enough for an initial run) and then have a recurring job which runs git commit? If you’re paranoid about losing the database/config, you could also push it to a private gitlab repo.
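
A minimal sketch of such a recurring job, assuming the config lives in ~/.config/darktable and the repository has already been initialised (script name, schedule and commit message are just examples):

    #!/bin/sh
    # run e.g. hourly from cron or a systemd timer
    cd ~/.config/darktable || exit 1
    git add -A
    # only create a commit when something actually changed
    git diff --cached --quiet || git commit -m "config snapshot $(date -Iseconds)"
    # optional: push to a private remote, if one is configured
    # git push origin main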

And how do we do this on Windows?

Pretty much the same thing, @joccha. Just the database location is different.

Is there much value in backing up more than the sidecar files?

Yes, data.db has all your presets and what not.

On Windows it is different as far as I can see; data.db is not in Program Files.

@joccha Type %LOCALAPPDATA%\darktable into File Explorer. That works in Windows 10.

Thanks afre
I save my pictures from my main computer to an external hard drive.

If my laptop has the same version of darktable as my main computer, may I copy the db from my main computer to my laptop so that both machines are alike?

If yes, do I have to remove the existing db from the laptop first?
This is for Windows 10 / 64-bit.

This is what I am doing all the time:

  • sync the image folders (images and xmp) between home and mobile computer
  • sync the mentioned profile folder between the two
  • optional: sync the cache folder between the two

In your case:

  • step 1 may be obsolete as you have those files on an external drive
  • but then: the path to the images on the external drive must be the same on both computers (in Windows: same drive letter etc.)

… and as you mentioned … exact same version of dt on both computers …
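
Roughly, assuming both machines run Linux with the default locations and the mobile computer is reachable over SSH as “laptop” (close darktable on both sides first, so the databases are not written to mid-sync):

    # 1. images and XMP sidecars
    rsync -av --delete ~/Pictures/ laptop:Pictures/
    # 2. the profile (config) folder, including library.db and data.db
    rsync -av --delete ~/.config/darktable/ laptop:.config/darktable/
    # 3. optional: the cache folder
    rsync -av --delete ~/.cache/darktable/ laptop:.cache/darktable/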

The topic is old, but I can strongly recommend symbolic and junction links for convenient backups. I am using this under Windows.

A junction or symbolic link to the darktable config directory is placed in one of my standard backup source directories. From this source I make manual, automated, and versioned backups to multiple backup targets (USB, NAS, other PCs …). With the junction/symbolic links it is easy to “collect” important directories and files of various software tools (not only DT) that are scattered across different Windows subdirectories.

Here is a good explanation about the various links:
difference-between-ntfs-junction-points-and-symbolic-links

Good backup tools allow the enabling/disabling of junction and symbolic links.

I also make heavy use of the DOS subst command to map different subdirectories to drive letters. That is a good method for flexible file locations with constant drive letters. A different topic, but also quite useful for well-defined backup strategies.
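
A rough Windows command-prompt sketch (the backup-source path and the P: drive letter are just examples; junctions work without elevation, true symbolic links may need administrator rights or developer mode):

    :: place a junction to the darktable config dir inside the backup source folder
    mklink /J "D:\BackupSources\darktable" "%LOCALAPPDATA%\darktable"
    :: map a photo folder to a constant drive letter
    subst P: "D:\Photos"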

DT also has its own means to redirect most if not all of those files… I do something similar, i.e. syncing that directory, but I just used the DT command-line parameters to point to folders I created…

https://docs.darktable.org/usermanual/4.0/en/special-topics/program-invocation/darktable/
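
For example (paths are made up; see the linked manual page for the options available in your version):

    # start darktable with config, cache and library in custom locations
    darktable --configdir ~/sync/darktable-config \
              --cachedir  ~/sync/darktable-cache \
              --library   ~/sync/darktable-config/library.db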

Came across this thread and wanted to share my methodology.

I use Ubuntu as my OS of choice, which includes the nice “dar” backup tool. Think of dar as a supercharged tar.

I have built some tooling around dar:

  1. Full & differential backups of both my $HOME and media files (details on my dar-backup github project; rough sketch below)
  2. On every login a script does a backup of $HOME/.config/darktable/ to a cloud service (details on a github gist)
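
Not the actual dar-backup tooling, just a rough sketch of what a full run followed by a differential run looks like with dar (archive paths and compression flag are assumptions):

    # full backup of $HOME, compressed
    dar -c /backups/home-full -R "$HOME" -z
    # differential backup: only what changed relative to the full archive
    dar -c /backups/home-diff -R "$HOME" -z -A /backups/home-full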

With this in place I believe I can restore my darktable config when something goes very bad, as it inevitably does.

My 2 cents on this :)

Thanks for sharing, I was not aware of dar.

A couple of months ago I changed half of my backup toolchain from incremental rsync to borg, which was eye-opening. The chunk-wise deduplication makes a tremendous difference and I am convinced that this is how backup should work. Since I can always mount and check my backups via a FUSE file system, I no longer feel the need for the current state of my files to be directly accessible in the backup (as it was with rsync). I tested this with more than 20 TB of backup data (8 TB with incremental rsync, 2 TB deduplicated) and it worked flawlessly (comparing every bit between the original files and the backup). A big benefit of borg is that it is very simple to get old backups into borg, and also to get stuff out of the chunk store again.
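
For anyone curious, the basic borg workflow is roughly this (repository path, archive name and mount point are placeholders):

    borg init --encryption=repokey /mnt/backup/borg-repo         # one-time repo setup
    borg create --stats /mnt/backup/borg-repo::home-{now} ~/     # deduplicated snapshot
    borg mount  /mnt/backup/borg-repo /mnt/borg-view             # browse/verify via FUSE
    borg umount /mnt/borg-view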

The other half of my strategy is still git-annex (for media where chunk-wise deduplication is less important).

I just stick my whole git-annex repo in my backup tool, which is restic in this case, but Borg is similar.
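
A rough restic equivalent of the same idea (repository path is a placeholder):

    restic -r /mnt/backup/restic-repo init
    restic -r /mnt/backup/restic-repo backup ~/annex
    restic -r /mnt/backup/restic-repo snapshots   # list what is stored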

Should DT purge old data and library snapshots automatically (like keeping just 2 of each, for example), as keeping them all will make ~/.config/darktable/ grow in size indefinitely?

I’ve just checked mine after seeing this thread and it’s a whole GB already :)
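
To see what is actually eating the space (snapshot file names may differ between darktable versions):

    # list the largest files in the darktable config folder
    du -ah ~/.config/darktable | sort -h | tail -n 20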

That’s what I would do if I had to start from scratch or had to change disks. But as the disks are way more than 50% full, I cannot easily convert the media backup to borg.

Also, I am currently using czkawka to compare my full SD cards with the remote backup, to ensure the photographs are backed up before I remove them from the SD card. This is a bit more complicated and probably slower with a borg FUSE mount (but not impossible). Other methods may be possible, such as adding the contents of the card entirely to the borg repo, checking whether there are files with the exact same chunk list, and removing those files from both the card and the snapshot. But I would probably stick to option 1.
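
If one did want to try the borg route instead of czkawka, a rough sketch of the comparison via a FUSE mount might look like this (archive name, mount point and card path are made up):

    # mount the relevant snapshot read-only, then dry-run a checksum compare
    borg mount /mnt/backup/borg-repo::photos-2024 /mnt/borg-view
    rsync -rvnc /media/sdcard/DCIM/ /mnt/borg-view/photos/   # -n: dry run, -c: checksums
    borg umount /mnt/borg-view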

Interesting tool. Thanks for the hint.

For darktable, or to be more precise, any application configuration, I go the complete opposite way:

I don’t back up configuration.

I have systems in place to recreate the configuration that is needed.

I can go from a blank disk to a production system, with current project data available where it should be, in about two hours. And yes, this is tested. Not everything in every application will be totally perfect, but it will work. And that is what I care about. The tuning for the pretty things is a constant game anyway.

For me this is a much more transparent approach.