What is the best way to import thousands of processing histories into my primary darktable database?
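
For background: darktable keeps each image's full processing history in an XMP sidecar next to the original file, so one possible route is to merge the sidecars from the travel library into the matching folders of the primary library and let darktable re-read them (with “look for updated XMP files on startup” enabled under preferences > storage). A minimal sketch of that merge, assuming hypothetical paths and that both libraries share the same folder structure:

```python
from pathlib import Path
import shutil

# Hypothetical paths: adjust to your own layout.
TRAVEL_LIB = Path("/media/travel-ssd/photos")   # library edited while traveling
PRIMARY_LIB = Path("/mnt/nas/photos")           # authoritative library on the NAS

copied = skipped = 0
for xmp in TRAVEL_LIB.rglob("*.xmp"):
    # Mirror the sidecar into the same relative location in the primary library.
    dest = PRIMARY_LIB / xmp.relative_to(TRAVEL_LIB)
    if not dest.parent.is_dir():
        skipped += 1            # original folder missing on the NAS; check manually
        continue
    # Only overwrite when the travel-side sidecar is newer than what the NAS has.
    if dest.exists() and dest.stat().st_mtime >= xmp.stat().st_mtime:
        skipped += 1
        continue
    shutil.copy2(xmp, dest)     # copy2 preserves the modification time
    copied += 1

print(f"copied {copied} sidecars, skipped {skipped}")
```

On its next start, darktable should then notice the newer sidecars and offer to update its database from them. This is only a sketch, not a vetted workflow, so a backup of the primary library database first would be prudent.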

Thanks for the pointer. I previously had a go at using local copies - but it seems they’re intended for a different scenario.

darktable’s “local copies” feature seems to be for a specific case where all files live on a central NAS at all times, and only the processing is distributed. What I thought when I first read the darktable manual was that local copies would work well for studios: after the shoot, all originals are on the central NAS, and the various folks responsible for processing the images can make local copies, take their laptops away, and edit and process as required. When they come back, they can sync the local copies back to the primary and authoritative darktable database.

What I’ve used “local copies” for is speeding up some editing and processing, since reads then come off the local NVMe SSD rather than across the network.

If I could VPN into home and upload all photos to the NAS even while traveling, I think local copies would work. But with effectively two different photo libraries while traveling, I don’t think local copies would do what is needed.
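
For what it’s worth, if the VPN route ever becomes practical, the upload step itself is straightforward. A sketch using rsync over SSH, with a hypothetical host name and paths:

```python
import subprocess

# Hypothetical source and destination; rsync runs over SSH through the VPN tunnel.
SRC = "/media/travel-ssd/photos/"          # trailing slash: sync contents, not the folder
DEST = "user@nas.home.lan:/volume1/photos/"

# -a preserves times and permissions, -v is verbose, and --partial resumes
# interrupted transfers, which matters on a flaky hotel or VPN link.
subprocess.run(["rsync", "-av", "--partial", SRC, DEST], check=True)
```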