Darktable Database Backup Best Practice

backup

#1

Unfortunately, I don’t do database backups (including the whole config folder) of darktable as often as I should. I want to improve, but have no good idea how to handle the topic. Doing file system backups (e.g. with rsync) would work, of course, but I don’t think that is very efficient for the sqlite part, and for that my searching has not turned up a best practice so far. So I wonder how you are doing this, and how you handle the automation. In particular: do you define a threshold of database changes (and how) before a backup is triggered, or do you just use timed backups? Do you make sure the database is not in use by darktable during the backup? And how do you do the actual backup? Incrementally?
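
For reference, one way to get a consistent snapshot of the sqlite part is SQLite’s online backup API, which the sqlite3 command-line shell exposes as the .backup command. A minimal sketch, assuming the default config folder ~/.config/darktable with library.db and data.db; the backup location is just a placeholder:

```sh
#!/bin/sh
# Snapshot darktable's databases via SQLite's online backup API,
# then copy the rest of the config folder with rsync.
CONFIG_DIR="$HOME/.config/darktable"
BACKUP_DIR="$HOME/backups/darktable/$(date +%Y-%m-%d_%H%M%S)"
mkdir -p "$BACKUP_DIR"

# .backup produces a consistent copy even if the file is being written to,
# though it is still safest to run this while darktable is closed.
for db in library.db data.db; do
    sqlite3 "$CONFIG_DIR/$db" ".backup '$BACKUP_DIR/$db'"
done

# Everything else in the config folder is plain files, so rsync is fine.
rsync -a --exclude='*.db*' "$CONFIG_DIR/" "$BACKUP_DIR/config/"
```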

P.S.: I regularly back up the sidecar files and have them versioned in a git repository as well, but some features rely on the database only, and the number of pictures is quickly approaching critical mass …


(Mica) #2

I have a rolling backup of my whole home folder, which includes darktable configs.

Additionally, when I see there has been an application update, I make a manual backup.
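
A rolling backup like that can be done, for example, with rsync’s --link-dest, so each run creates a dated snapshot and unchanged files are hard-linked to the previous one instead of copied again. Just an illustrative sketch, not necessarily how it is set up here; the destination path is a placeholder:

```sh
#!/bin/sh
# Rolling home-folder snapshots: each run creates a new dated directory;
# unchanged files are hard-linked against the most recent snapshot.
SRC="$HOME/"
DEST="/mnt/backup/home"
NEW="$DEST/$(date +%Y-%m-%d_%H%M%S)"
LAST=$(ls -1d "$DEST"/*/ 2>/dev/null | tail -n 1)

mkdir -p "$NEW"
if [ -n "$LAST" ]; then
    rsync -a --delete --link-dest="$LAST" "$SRC" "$NEW/"
else
    rsync -a --delete "$SRC" "$NEW/"
fi
```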


#3

Hm, I do full backups of my home folder as well (incremental rsync to an external HDD), but I think it would be good to have something that happens more often, probably triggered by the amount of changes made rather than an arbitrary time schedule.
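
One way to approximate “triggered by changes” without any help from darktable is to check whether library.db has actually changed since the last backup and only snapshot it then. A rough sketch, assuming the same default paths as above; note it only detects that something changed, not how many edits were made:

```sh
#!/bin/sh
# Back up library.db only when its checksum differs from the last run.
CONFIG_DIR="$HOME/.config/darktable"
STATE_FILE="$HOME/backups/darktable/.last-library-checksum"
BACKUP_DIR="$HOME/backups/darktable/$(date +%Y-%m-%d_%H%M%S)"

current=$(sha256sum "$CONFIG_DIR/library.db" | cut -d' ' -f1)
previous=$(cat "$STATE_FILE" 2>/dev/null)

if [ "$current" != "$previous" ]; then
    mkdir -p "$BACKUP_DIR"
    # Consistent snapshot via SQLite's online backup API.
    sqlite3 "$CONFIG_DIR/library.db" ".backup '$BACKUP_DIR/library.db'"
    echo "$current" > "$STATE_FILE"
fi
```

Run from cron or a systemd timer as often as you like; it is cheap when nothing has changed.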


(ಚಿರಾಗ್ ನಟರಾಜ್) #4

Maybe set up your darktable config folder as a git repository (git init; git add *; git commit -a should be enough for an initial run) and then have a recurring job which runs git commit? If you’re paranoid about losing the database/config, you could also push it to a private GitLab repo.
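
For the recurring job, something along these lines could run from cron or a systemd timer. The commit message and remote are just placeholders, and since the database is a live sqlite file it is safer to run this while darktable is closed (or to commit a .backup snapshot instead of the file itself):

```sh
#!/bin/sh
# Recurring commit of the darktable config folder (assumes it has
# already been initialised as a git repository, as suggested above).
cd "$HOME/.config/darktable" || exit 1

git add -A
# Only create a commit when something actually changed.
if ! git diff --cached --quiet; then
    git commit -m "darktable config snapshot $(date '+%Y-%m-%d %H:%M')"
    # Optional off-site copy, e.g. to a private GitLab repo:
    # git push origin main
fi
```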