What is the best way to import thousands of processing histories into my primary darktable database?

The issue is that when I go on extended trips I do some editing and processing on a different computer (e.g. a laptop) than the home desktop. How best to merge the laptop's darktable database into the home desktop's darktable database when I get back home?

Earlier this month I had a few weeks of traveling and took roughly 3,000 photos. During evenings, downtime, or while waiting for transport I processed selected photos on a laptop. Over the fortnight I probably edited and processed about half of the images. Because of limited space on the laptop, I only copy onto it the specific photos I work on; all other photos remain on the SD cards.

When I got back home, I could follow my normal process: the NAS copies the photos directly from the camera's SD cards to the NAS share, and I use "add to library" to get the new photos recognized by the home desktop's primary, authoritative darktable database.

However, I couldn't find an expedient way to import the processing history of those laptop-processed images into the primary, authoritative home desktop darktable database. The only way I found was to write out the XMP sidecars and then, one by one, use "load sidecar file" to bring each processing history into the desktop database. Doing over a thousand individual loads took many hours spread over multiple days.

The next trip might be longer, with thousands more photos to import. Done individually, one by one, might the import take weeks?

Is there a better way to do this, i.e. importing image processing history from the laptop to the desktop?

Is there a way to merge the databases directly? From googling, it seems darktable uses sqlite3. Does someone perhaps already have a script that does this?

Thanks

Sorry, so you have a folder of images with XMP files from your laptop… can you not just copy these files into your main image collection and import them? Just let it run overnight or something? Am I missing something? It should go fairly fast if your hardware is not too limiting…

Thanks for the pointer. I previously had a go at using local copies, but it seems they are intended for a different scenario.

Darktable's "local copies" feature seems to be for the specific case where all files are on a central NAS at all times and only the processing is distributed. What I thought when I first read the darktable manual was that local copies would work well for studios: after the shoot, all originals are on the central NAS, and the various folks responsible for processing can make local copies, take their laptops away, and edit and process as required. When they come back, they sync the local copies back to the primary, authoritative darktable database.

What I've used "local copies" for is speeding up editing and processing, since reads then come off the local NVMe SSD rather than across the network.

If I could VPN into home and upload all photos to the NAS even while traveling, I think local copies would work. But with effectively two different libraries of photos while traveling, I don't think local copies would do what is needed.

I think I also misunderstood you… you have, say, 3,000 pics on SD cards and you are selectively editing some of these on the laptop, but not all. So I am thinking: if, when you get home, you import the ones from the laptop first and don't allow any duplicates when you update from the SD cards, won't you be fine?

Yes, that's right: the XMP files exist on the laptop, but only for the photos edited and processed while traveling.

All the photos taken during the trip are downloaded directly from the SD cards to the NAS share by the home NAS when I get back after the trip. There are no XMP files on the camera's SD cards.

The XMP files are on the laptop, not on the NAS, because the photo files from the laptop are never copied to the NAS. I can copy the XMP files to the desktop…

There is a darktable config option, "look for updated xmp files on startup", but it seems to expect the XMP files to be in the same directory as the raw photos, and to have a changed timestamp.

I've played around with a local filmroll on the desktop, and from what I could make out, darktable expects the XMP to already exist and only brings in new edit and processing history if the XMP file's timestamp is newer than the database's timestamp. That would not be the case here: the editing and processing on the laptop happened a few weeks before the photos were added to the home desktop darktable database, so by timestamp the desktop database is the newer one.

This is something I will have to try.

So would you not be safe just making sure to do the laptop images first? Your main database has never seen them, and the XMPs are with them, so the edits get imported… all good with those. Now, can you not set it so that your NAS import will not import any duplicate files? Then none of the laptop files will be re-imported. Or are you changing the names when you import? Even then, wouldn't you be okay?


No, I don't change file names.

The import is just an import into the database; I don't use darktable to copy images around (at least not for the initial copy to the NAS).

I’ll need to take some photos tomorrow and see…

Ya, in my head I was thinking your moving of the images into DT was two steps: moving them off the SD cards and then importing with DT's import. I know DT can pull from the card, but I never use that, so there might be a nuance there I am missing if you do. But I think if you were to import the laptop images first, just to be safe, and then import either directly or in two steps with DT set to only import new files, it would simply ignore the laptop files as already being in the database. DT is a bit weird here, possibly, as you have to look at the bottom: what is selected for import doesn't actually look like it's selected; rather, the images already imported sort of look like they are the ones selected. But this setting should then just import the non-edited ones, I think. A bit of experimenting to confirm should sort it for you…


Yes - this is what I do.

Step one: I have a script that I run on the NAS which copies (rsyncs, actually) the photos from the SD card(s) to the NAS.
Step two: I then point darktable at the filmroll that contains the NAS folder, select "import - add to library", make sure that "select only new pictures", "recursive directory" and "ignore jpeg images" are ticked, and click "add to library".

I do this for both the short trips and the long trips. This gets the photos onto the NAS and recognized by darktable.
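For reference, step one is roughly the following, with made-up paths standing in for my actual mount points:

```bash
#!/bin/sh
# Copy new photos from the mounted SD card to the NAS share.
# -a preserves timestamps and permissions, --ignore-existing skips
# anything already copied, so the script can be re-run per card.
rsync -av --ignore-existing /mnt/sdcard/DCIM/ /volume1/photos/incoming/
```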

Tomorrow I will have to test importing from the laptop first and then from the SD cards, to see whether that is a faster way of getting the editing and processing history recognized for the photos processed on the laptop.

Thanks for the suggestions.


If you can stand a pretty technical solution: I check my XMP files into git and my raw files into git-annex, then use the facilities of git to move things around.
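Roughly, the setup looks like this; the repository layout, remote name, and file extensions are just an example:

```bash
# One repository per photo collection; raw files are annexed,
# the small text sidecars are versioned directly in git.
cd ~/photos/collection
git init
git annex init "laptop"

git annex add -- *.RAF        # raw files (whatever extension you shoot)
git add -- *.xmp              # sidecars go straight into git
git commit -m "edits from the trip"

# Back home, the desktop pulls the laptop's commits and, if wanted,
# the annexed raw file contents too:
#   git remote add laptop ssh://laptop/~/photos/collection
#   git fetch laptop && git merge laptop/main
#   git annex get .
```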

If you need a nontechnical solution, then:

  1. Set darktable to look for updated XMP files on startup.
  2. Import the raw files into darktable.
  3. Close darktable.
  4. Copy the XMP files over from your laptop.
  5. Reopen darktable; it should prompt you about the updated XMP files.

It seems you have to merge two directory trees before importing into the destination darktable database:

  1. The laptop directory/ies: you copy images from your camera to your laptop for an initial edit with darktable, so these also contain the XMP files with the history stacks.
  2. The NAS directory/ies: you copy the images from the camera/SD cards to their final destination on your NAS.

If the directory structures don't differ, you can use rsync or a GUI tool (e.g. Beyond Compare) to copy the XMP files from 1. to 2.
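For example, something like this, assuming tree 1 is mounted from the laptop at /mnt/laptop/photos and tree 2 is /mnt/nas/photos (add -n first for a dry run):

```bash
# Copy only the .xmp sidecars, recreating the directory structure.
rsync -av \
      --include='*/' --include='*.xmp' --exclude='*' \
      /mnt/laptop/photos/ /mnt/nas/photos/
```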

Importing from 2. into the primary darktable database should be done after the XMPs are copied.

If you have already imported, you need to update the file date on your laptop XMPs before copying, so the new XMP files can be detected on the next darktable start.
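A quick way to do that, again assuming the NAS path:

```bash
# darktable only re-reads a sidecar whose timestamp is newer than the
# database entry, so bump every copied XMP's mtime to "now":
find /mnt/nas/photos -name '*.xmp' -exec touch {} +
```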


Yes, any solution is fine; technical, scripting and git are all fine.
Though to be honest, apart from pulling source from git from time to time, I haven't spent a lot of time with it. In a previous life I used CVS, and still do for Linux configs. I'll read up on git.