Centralized XMP database?

I have my photo library backed up to several external hard drives (independent copies). When editing a file in darktable, an XMP sidecar file is written. However, since I use different hard drives at different times, each drive ends up with different edits. I would like some way to sync edits between these drives, so that if I edit a file on hard drive A and later load the identical file from hard drive C, I can see those edits. One way would be to write the edits to XMP and then copy them to each drive, but the main shortcoming of that method is that it doesn't play very well with local copies.

What I would like to figure out is a way to coordinate the edits across the different hard drives. One idea that came to mind was to create a master XMP repository: the sidecar for each edited raw would live in a central directory, named with a hash of the file it belongs to. When loading a file in darktable, hash the raw file, look up the sidecar by that hash, and apply it (if it exists) to the raw file imported into darktable.
While looking into how to load XMP sidecars into darktable, I found that darktable lets me load an XMP sidecar file and apply it to selected images. However, this takes a single XMP file and applies it to every selected file, so doing it for more than a few raws would be quite tedious. I thought I might solve the problem with Lua scripting, but it seems the Lua API has no mechanism to load and apply an XMP to a file in darktable.

Does the Lua API have the ability to load and apply an XMP to a raw file? If so, I could write a Lua script to map each file to its sidecar and load the appropriate one.
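To make the idea a bit more concrete, the lookup step I have in mind would run outside of darktable, before import, roughly like this (just a sketch; the repository path and the one-sidecar-per-hash layout are things I made up, not anything darktable provides):

```bash
#!/usr/bin/env bash
# Sketch: fetch the stored edits for a raw file from a central, hash-keyed
# sidecar repository and place them next to the raw before importing it.
# XMP_REPO and the <sha256>.xmp naming scheme are assumptions.

XMP_REPO="$HOME/xmp-repo"   # central store, one sidecar per raw-file hash
RAW="$1"                    # path to the raw on whichever disk is mounted

hash=$(sha256sum "$RAW" | cut -d' ' -f1)
sidecar="$XMP_REPO/$hash.xmp"

# darktable's default sidecar name is <filename>.<ext>.xmp, so copying the
# stored sidecar there lets darktable pick up the edits on (re)import.
if [ -f "$sidecar" ]; then
    cp "$sidecar" "$RAW.xmp"
fi
```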

You should take a look at git-annex.


Do you use git-annex? Could you share your workflow?

I do. I add all raw files into the annex and the XMP files into regular git.

I work in editing sessions; after a session I git add all the XMPs and sync everything up.

I came across this a number of years ago but never got around to trying it out…

Here is one account of a person using it…

https://tylercipriani.com/blog/2016/09/28/git-annex-metadata-filtered-views/

I don’t generally rebase my photos repo as that article suggests. Rather, my very standard workflow is:

  1. Plug the SD card into the computer and run a script that uses exiftool to move all photos off the card into my library, where the files are renamed and sorted into folders by date (a rough sketch of this is below, after the list).
  2. cd /path/to/photos/folder
  3. git annex add *.nef
  4. git commit -m "Photos from today".
  5. git annex sync --content will sync my content to all git remotes (external disks and NAS)
  6. Cull and edit in darktable
  7. Back on the terminal, git add *.xmp
  8. git commit -m "edits for the day" to commit all the xmp files to git
  9. git annex sync --content will sync up those xmps with my git remotes
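For reference, a whole session boils down to something like this small script (a sketch only; the paths, the .nef extension and the exiftool rename format are my own conventions and would need adapting):

```bash
#!/usr/bin/env bash
# Sketch of one editing session, following the numbered steps above.
set -e

SD=/media/sdcard/DCIM        # mounted SD card
REPO=~/photos                # root of the git-annex repository

# Step 1: move raws off the card into $REPO/<year>/<date>/, renamed by capture time
exiftool -r -ext nef \
    '-FileName<DateTimeOriginal' \
    -d "$REPO/%Y/%Y-%m-%d/%Y%m%d-%H%M%S%%-c.%%e" \
    "$SD"

# Steps 2-5: annex the new raws, commit, and sync content to all remotes
cd "$REPO"/path/to/todays/folder   # whichever date folder(s) exiftool just created
git annex add *.nef
git commit -m "Photos from today"
git annex sync --content

# Step 6: cull and edit in darktable, then...

# Steps 7-9: commit the resulting sidecars to plain git and sync again
git add *.xmp
git commit -m "edits for the day"
git annex sync --content
```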

Hi,

Is git-annex working nicely with Windows/NTFS? Right now I am using Syncthing to sync files across Windows, Linux and Mac, but having a central git repo could have some advantages.

Best regards
Till

You can see the state of Windows support here: Windows

Thanks for sharing your workflow. I can see where this could be a different, and perhaps superior, approach to what I was trying to do.

How are your external disk remotes configured? Are they simply directory special remotes?

All my current remotes are external disks that I mount locally. I used to have special remotes for some S3 buckets, but stopped that in favor of a restic backup, because restic's deduplication saves me a lot of space and I pay per GB for S3 storage.
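In case it helps, this is roughly how a locally mounted disk ends up as an ordinary git remote rather than a special remote (the paths and remote name here are just placeholders):

```bash
# One-time setup: put a clone of the repo on the external disk
git clone ~/photos /mnt/diskA/photos
(cd /mnt/diskA/photos && git annex init "diskA")

# In the main repo, register the disk as a normal remote
cd ~/photos
git remote add diskA /mnt/diskA/photos

# From then on, a regular sync pushes both the git history (XMPs)
# and the annexed raw file content to the disk
git annex sync --content diskA
```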