Ideas for "Darktable" Cloud and Sync (local and over the internet)

So, I’m having a growing need for a solution that will sync selected raw files and sidecars between devices.
Basically, I switch constantly between my main desktop machine and my laptop, both at home on the same Wi-Fi.
It's a real pain to always have to transfer my raw files and sidecars via USB sticks and the like.
I know I can do it over Samba, but that is also a manual process: find the raw and its sidecar, transfer them, put them in the correct directory on the second machine, and so on.

Basically, I’m making this topic for all of us to collect our thoughts, share our experiences, try to find out about existing solutions and/or possibly develop something that would serve that purpose. But the main thing is to see if there is even a need for a solution to this problem.

As some of you might know, I mainly use Darktable on Linux, but the same could apply to any other raw editor.

In my case, I keep thinking about how I could connect Darktable with Nextcloud. Maybe a lua script plus a Nextcloud plugin.
But then again, I'm still a long way from being able to develop a lua add-on for Darktable.

  1. Have a central repo, and use the “sync local” feature of darktable

  2. Put everything on an external drive, including the cache and database files. Make a shell script to start darktable and point it at the external drive (a sketch of such a script follows this list).

  3. Use a CLI solution such as rsync/unison, or a more friendly program like Syncthing, to sync files and sidecars around. Use something like s3cmd or rclone to extend these to cloud storage.

  4. I use git for my sidecar files and git annex for the raw files, then sync things up using a sneaker net external drive. There is already a lua script for git annex (but I don’t use it).
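
For option 2 above, a minimal launcher sketch, assuming the external drive mounts at /media/$USER/photodrive and that darktable's config, cache and database all live on it (the paths are illustrative, not from this thread):

    #!/bin/sh
    # Run darktable entirely from the external drive; the mount point below is an assumption.
    DRIVE="/media/$USER/photodrive"
    exec darktable \
        --configdir "$DRIVE/darktable/config" \
        --cachedir  "$DRIVE/darktable/cache" \
        --library   "$DRIVE/darktable/library.db"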


How do you plan to manage the differences between the desktop’s monitor and the laptop’s screen?

@paperdigits that’s a pretty neat solution! I will try to test it and see if I can integrate it into my workflow :smiley:

@elGordo well, 1. I’ll buy a better laptop, since my current one only covers about 50% of sRGB. As for color management, I’m proud to say I finally got a spectrometer a few days ago. Too bad I’ve only used it to tell me that I need a better screen :crazy_face: :crazy_face:


I set up a Nextcloud system at home, with an old laptop running Linux and a couple of USB drives configured as a RAID1 disk array (to protect against a disk failure).
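
For reference, the RAID1 part can be done with mdadm; a rough sketch, assuming the two USB drives show up as /dev/sdb and /dev/sdc (device names, filesystem and mount point are my assumptions, not from this post):

    # Mirror the two USB drives, then put the Nextcloud data directory on the array.
    sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc
    sudo mkfs.ext4 /dev/md0
    sudo mount /dev/md0 /srv/nextcloud-data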

I have all my laptops sync up to that Nextcloud instance, and I can select which directories I want to sync to each laptop, in case it doesn’t have enough space to hold my whole library.

Pros: each laptop has a local copy of the files, so darktable runs very quickly.
Cons: no offsite backup, and editing files on one machine means the other machines need to update their local darktable DB with the new information from the XMP files. Unfortunately, darktable 3.2.1 is buggy in this regard.

To deal with the offsite backup issue, I’ve tried some different approaches:

  • set up a Nextcloud instance in AWS. This is quite expensive in terms of data transfer costs, and possibly S3 API calls if you don’t set up your Nextcloud correctly.
  • set up an AWS S3 File Gateway, which provides a caching NFS service in my home, backed by the files stored in S3. It means my laptops access the images over an NFS share, so performance is not as good as having local files.
  • use Backblaze to back up the local drive of one of my laptops, which contains my complete raw library. Cheap, but if I need to do a restore I’ll have to send away to get the files dumped to a USB drive. (A plainer rclone-based variant is sketched after this list.)
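
A lighter-weight variant of the offsite copy (not what the poster uses, just a sketch): push the archive straight to S3 or Backblaze B2 with rclone. This assumes a remote named "b2" has already been configured with rclone config, and the bucket name is purely illustrative:

    # One-way offsite copy of the local archive to a B2 bucket.
    rclone sync ~/Pictures b2:my-photo-backup --progress --transfers 4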

I use either sftp or sshfs to quickly transfer files among my Linux hosts. Samba is usually a nightmare.

I prefer rsync for that.
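
A minimal rsync sketch for this kind of push; the host name "laptop" and the folder layout are assumptions, not from the thread (raws and .xmp sidecars travel together because they sit in the same directory):

    # Push one shoot, with its sidecars, from the desktop to the laptop.
    rsync -avh --progress ~/Pictures/2021/2021-05-01_hike/ laptop:Pictures/2021/2021-05-01_hike/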


I keep my raw files with their XMPs (I don’t use a database) in a folder that is watched by Resilio Sync, which synchronises the files with a second computer. I could edit them on the second computer if I wanted, and Resilio would sync the edits back to my first machine.

I have darktable installed on an external 1 TB USB HDD, along with all my raws on the same HDD. Works fine; you just need to start it with a specific BAT script so that the database also lives on the external drive.

I do frequent backups, so my slowly dying HDD isn’t a big issue.

You are still using the DT database, though; it syncs with the XMP files each time you open it, no? Or at least when you import a folder or file? I don’t think you can totally get around using it: library.db for the images and data.db for the rest? Maybe I am missing something.

I have my library and data DBs on OneDrive and I use a symbolic link, as the OneDrive local folders are not in the same location on each PC; I use --configdir to point DT there. I have to be sure to let OneDrive sync if I am moving to a new computer. Usually I am back and forth, but it works for the most part. I do this with images and XMPs as well, keeping what I can. I have a TB, so I can keep enough of my recent stuff in sync and available. At home I back up the local OneDrive folder to a NAS at 4 AM. Really, I just need to be sure to let OneDrive sync; otherwise it can cause a crash from time to time.
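
A sketch of that symlink + --configdir idea in shell form (the OneDrive path and link name are assumptions; on Windows the link would be created with mklink instead):

    # Point darktable at a config dir whose real location is the synced OneDrive folder.
    ln -s "$HOME/OneDrive/darktable-config" "$HOME/.config/darktable-synced"
    darktable --configdir "$HOME/.config/darktable-synced"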

<shamelessSelfPromotion>
Did you check out Syncthing? It has the same use case and is free and open source :slight_smile:
</shamelessSelfPromotion>


Yes, I know. Syncthing is still not sophisticated enough for me; if it were, I would switch today. At least I can use Resilio without a fee.

You can start darktable with this command:

    darktable --library :memory:

darktable then creates the image library in RAM, and it gets discarded when you exit.

Ah yes, I had tried that in the past, but it seemed to slow things down and I had crashes, so I abandoned it.

Same problem here. I have a >3 TB archive of photos, which is obviously too much for a laptop. However, I frequently use my laptop to work with images. What I really wish for is a cloud-hosted database on my NAS that I can use to sync photos and a library on demand. Essentially, cloud storage for Darktable that is synced among devices: a solution where I can decide which folders I want to work on locally, with the changes uploaded later on.

It could be something similar to Zotero’s self-hosted feature over WebDAV.

I do just this at home, simply with a desktop computer and an SMB-shared directory. And I didn’t plan this out; it just happens to work that way…

A long time ago, I decided to keep all our digital pictures on my Ubuntu desktop computer. I made a directory called Pictures, and I store them there by date, with subdirectories for each year and directories within those for each “event”, be it a vacation, birthday, whatever. Knowing that backing up is important, I made a mirror of that Pictures directory on a Ubuntu pc I’d built for the living room TV, and used rsync to keep the two in synchronization.

Recently, I’ve been using a Windows tablet to do a lot of dev work, but I usually reserved image work for the Ubuntu desktop, because that’s where the images were. On vacations, I’d make proofs of my daily shooting on the tablet and then copy everything to the desktop when I got home. Not too long ago, wanting to work a particular image on the tablet, I remembered that I had also long ago shared the Ubuntu home directory over Samba. I opened Windows Explorer, pointed the path to the network share, \\caliente.local\glenn, provided the requested password, and Poof! Magic! There was my desktop home directory in all its messy glory. I drilled down into Pictures, found the image, and opened it on my tablet just as if it were in a local directory. Well, a bit slower, but it opened all the same.

I know, a long story for a simple thing, but that’s the deal: I don’t think it has to be a cloud thing; the tools are already on the computers, if you spend a bit of time configuring them…
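
For the Linux machines in this thread, the same share can be mounted directly; a rough sketch reusing the share name from the post above, with a mount point and options that are my assumptions:

    # Mount the desktop's Samba-shared home directory on another Linux box.
    sudo mkdir -p /mnt/caliente-home
    sudo mount -t cifs //caliente.local/glenn /mnt/caliente-home -o username=glenn,uid=$(id -u)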

I use Nextcloud. I have an old laptop with a couple of drives in a RAID1 setup, and this allows me to share photos between different laptops. The beauty of it is that the photos are stored locally on each machine for good performance, and I can be selective about what I want to sync.

I also set up a Nextcloud instance in AWS, using S3 as external storage, but this is quite expensive, so I normally just use it as an off-site safety backup. It’s also good when travelling, but I haven’t done much of that this year…

Because I have a very similar setup (a powerful desktop for >80% of my editing, plus a laptop), I started to test the following setup (using DT 3.2.1 on Ubuntu):

  • Created symlinks from my local picture folders (e.g. in /media/$USER/some_drive/images) to $HOME/Pictures on both PC & laptop
  • Put library.db and darktablerc on my Nextcloud instance for sync; this has the nice property that those files are included in my regular Nextcloud backup, and several versions are even kept
  • Created symlinks from the local Nextcloud copies of darktablerc and library.db back to their original locations in $HOME/.config/darktable (a sketch of these symlinks follows this list)
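
A sketch of those symlinks; the Nextcloud folder layout is my assumption, and the existing $HOME/Pictures directory and the original files in $HOME/.config/darktable would have to be moved aside first:

    # Same picture path on both machines, plus shared config/library via Nextcloud.
    ln -s "/media/$USER/some_drive/images" "$HOME/Pictures"
    ln -s "$HOME/Nextcloud/darktable/library.db"  "$HOME/.config/darktable/library.db"
    ln -s "$HOME/Nextcloud/darktable/darktablerc" "$HOME/.config/darktable/darktablerc"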

This has been working well: because the image paths are the same on my PC & laptop, the database entries work on both machines, and I immediately see when files are not (yet) copied to my laptop, because their folder appears struck through in the lighttable folder view.
As I never run DT on my desktop & laptop in parallel, this should not lead to concurrent changes to the library.
However, I frequently get a warning that the image metadata in the sidecar files and in the database differ by date, but those warnings can be dealt with quite easily.
Let’s see how well the sync goes once library.db gets bigger; right now it is only ~16 MB.

Syncing the SQLite library.db also had me thinking about a decentralized SQL database, running on a NAS / server instance, as an alternative to the file-based SQLite DB.

It will work more smoothly in darktable 3.4, where they cleaned up how timestamps are updated in the XMPs.
