How well do darktable and git-annex integrate?

I am running out of disk space on my mobile computer, and the industry has failed to deliver 9.5 mm 2.5 in hard disks with more than 2 TB. I therefore wonder whether git-annex is the solution: get rid of all the 1-star images locally and keep them on the NAS only. However, since git-annex will leave behind links to images that are not available while travelling, I wonder whether that causes issues with darktable, given that I want to keep all images generally accessible in darktable. Of course I could test with some sample images, but maybe there are issues that only show up with large collections or in other circumstances I am not aware of. So I am asking for any advice, help, or real-life experience.

If you choose to use git-annex and a file is not present, you’ll still have the thumbnail in darktable, but when you click to open the image, darktable won’t find the photo.

I love git-annex and it is working well for me, but another alternative is to just use darktable’s local copies feature: https://www.darktable.org/usermanual/en/local_copies.html


I have a tendency to distro-hop every now and then, and I also want the location where my photos live to be running 24/7 for backup syncing. The device I use for running darktable used to be a laptop, and is now a self-built tower which is mostly on. I have all my pics on my NAS machine. Yes, sometimes I local sync, but rarely. I find two Gigabit connections plugged into the same switch give me good performance.

My NAS is a Tonidoplug (think Pogoplug) running Arch ARM with a USB 2 connected 2 TB HDD holding 15,000 raw files. A plain-vanilla CIFS share is mounted as a subdirectory of my home folder on my processing machine.

The only caveat I have found (in addition to that which Mica raised) is that thumbnail generation is slow. I have all my thumbnails cached at 5-across zoom level, and I very seldom use a different zoom level.

your mileage may vary …

I use git-annex with darktable and it generally works well. One gotcha is that if you git annex add the sidecar files (.xmp) you will have trouble when you reimport images. The problem, if I understand things correctly, is that darktable normally writes its edits to the XMP files; when an annexed sidecar is a read-only symlink it can’t write, it doesn’t warn you, and when it later reloads the stale file, that overwrites the database settings. I added this line to the .gitattributes file, so the sidecars stay in regular git rather than the annex, to make sure this problem doesn’t occur:

*.xmp annex.largefiles=nothing

I add files into git-annex before I start working on them in darktable. That way I feel safer deleting and processing images in DT: I know that I can always roll back in git-annex. Even if a file is deleted, its content is actually still present in git-annex. Every once in a while, I run git annex unused to look at how much removed content is still taking up space and clean it up with git annex dropunused.

I generally sync the entire collection to all my devices. Failure to do so will mean thumbnails will fail to load in Darktable, for example. I would love it if Darktable could dynamically load/drop files from the collection. There’s a plugin that helps with that, but I haven’t tried it out.

Finally, the biggest problem I have syncing darktable data around is that the database itself lives outside the photo directory, so naturally it doesn’t get synced. Since most data can be stored in the XMP files, it has generally worked anyway, but I still have to copy settings between my different darktable instances by hand, for example. Of course, the same version of DT needs to be running everywhere, otherwise edits using new modules from one instance will not load on another that lacks them.

Overall, I’d say it works fairly well, but integration could be improved, especially with “local file copies” which could use git-annex to sync files around.

I don’t sync my database with git annex (but I should!), but DT will follow a symlink for the database. On my edit machine, the symlink points to the file on my mirrored ZFS drives.
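A minimal sketch of that symlink setup (all paths here are made up; substitute your real darktable config directory and mirrored drive):

```shell
# Hypothetical paths: a mirrored pool at /tmp/demo/pool and a stand-in for
# darktable's config dir at /tmp/demo/config. Keep the database on the
# mirror and point a symlink at it.
mkdir -p /tmp/demo/pool/darktable /tmp/demo/config/darktable
touch /tmp/demo/pool/darktable/library.db        # stand-in for the real database
ln -sf /tmp/demo/pool/darktable/library.db /tmp/demo/config/darktable/library.db
ls -l /tmp/demo/config/darktable/library.db      # now a symlink onto the mirror
```

darktable opens the symlinked path as usual; the actual file lives on the redundant storage.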

my experience syncing database-like content with git-annex (specifically the calibre database) hasn’t been great so far, mainly because of problems with the “read-only symlink” workflow. the new “unlocked files” mode does help quite a bit however…
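for reference, switching a repo to unlocked files is roughly this (assumes a reasonably recent git-annex; repo path and file name are illustrative):

```shell
# Illustrative: keep annexed files writable in place instead of as
# read-only symlinks. /tmp/unlock-demo and metadata.db are made-up names.
mkdir -p /tmp/unlock-demo && cd /tmp/unlock-demo
git init -q .
git config user.name demo && git config user.email demo@example.com
git annex init "demo"
echo "db" > metadata.db
git annex add metadata.db
git commit -qm "add db"
git annex unlock metadata.db          # replace the symlink with a writable copy
git config annex.addunlocked true     # add future files unlocked by default
```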

I know little of sqlite, but you could check the database file directly into git.
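A sketch of that with plain git, so the database stays an ordinary writable file rather than an annexed symlink (the repo path is made up; library.db and data.db are darktable’s usual database file names):

```shell
# Illustrative: snapshot the database files with plain git, not the annex.
# /tmp/db-demo is a hypothetical stand-in for darktable's config directory.
mkdir -p /tmp/db-demo && cd /tmp/db-demo
git init -q .
git config user.name demo && git config user.email demo@example.com
touch library.db data.db                 # stand-ins for darktable's databases
git add library.db data.db
git commit -qm "snapshot darktable databases"
```

In a repo that also uses git-annex, an annex.largefiles rule in .gitattributes (like the *.xmp one above) can keep these files in plain git automatically.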