I am new here and also relatively new to darktable.
I work with darktable in parallel on two computers:
PC/Windows: darktable 4.8.1 to work “at home”, and
MacBook M3 Pro: darktable 4.8.0 as my “on the road” device.
I keep my photos on an external 2 TB SSD and plug it into one computer or the other as needed.
The problem is the SYNCHRONISATION of the LATEST STATUS I worked on. In darktable, I went to preferences > storage > “look for updated XMP files on startup” and enabled it.
After I have worked on my files on the MAC, e.g. during a trip, I want to continue WITH THE SAME CHANGES on my home PC. I plug in the SSD and get four possibilities to synchronise the changes:
Keep the XMP edit
Keep the database edit
Keep the newest edit
Keep the oldest edit
Let’s skip the last one; I want the most recent edits.
So, my question is: I want to keep the darktable edits (contrast, crop, colour, lens correction, etc.), but also my star ratings and colour labels, my metadata changes, and any duplicates or deletions I have made in darktable (there will then be more or fewer XMP files next to my DNG when I make several versions of one photo). In short, I want ALL of the LATEST CHANGES transferred to the other computer/darktable.
I got a hint from a dear YouTuber, but I still cannot synchronise ALL EDITS, no matter which option I choose: keep the XMP edit, keep the database edit, or keep the newest edit.
I do get the last changes, but duplicates, deleted duplicates and photo ratings are not always synchronised. I can then “remove the film roll” and “add to library” the same film roll again. This brings all edits into darktable, but I cannot believe this is the intended way of synchronising the libraries.
You could use tools to synchronise deletions etc.
The official way to work with darktable on the go is to use local copies. I have never used them, and have no idea whether e.g. file deletions are synced or not.
I did it on 3 Windows machines for a little while and it worked. I used a local folder that was synced via OneDrive. I used symbolic links just to be safe, and then ran dt on all the machines with --configdir pointed at that folder through the symbolic link…
It worked well for the most part, but I found that OneDrive was lazy and not always up to date, so I would get some conflicts…
I guess if you used some sort of cloud sync you could trust better, or wrote a script to force the sync before running dt, you might be able to do something like this, as long as you work somewhere with internet access…
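A sketch of what that setup can look like on Windows (the folder paths and the link name are placeholders, not from the post; `mklink` and darktable's `--configdir` flag are real, the rest is an assumption):

```shell
:: Run once per machine (cmd as administrator): give the OneDrive-synced
:: config folder the same stable path everywhere via a directory symlink.
mklink /D C:\dt-config "C:\Users\me\OneDrive\darktable-config"

:: Then start darktable against that shared config folder
:: (library.db and data.db live inside it):
darktable --configdir C:\dt-config
```

The point of the symlink is that every machine sees the config under an identical path, even if the OneDrive folder itself lives somewhere slightly different on each PC.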
Please forgive me, I am going to mention the LR word here. LR had the ability to store a catalog wherever you requested. I presume the catalog is the equivalent of DT’s database. So with LR I could store my catalog on the external hard drive, and jumping from one computer to the next was not a problem because the catalog was on the external hard drive. I wonder if there is a practical way to replicate this behaviour with DT’s design. I for one would find this convenient and practical.
Exactly! This is the point.
I have a single SSD and can move it where I want, I do not depend on an internet connection, and (theoretically) I can be up to date at any moment.
It is just that this synchronising still does not work for me. But I think I am making some mistake, and it is probably simple to solve.
Maybe somebody is working in the same way as I do and knows how to do this.
Not darktable, but I have used Godot (a game engine, very sensitive to file changes) across multiple computers. At first I placed the project folders in my OneDrive, but this always caused problems (OneDrive being lazy, creating duplicates of files with names like filename-LAPTOP.jpg while the original was just filename.jpg). OneDrive also got immensely slow with a lot of files, but I do not think that applies here.
At some point I discovered Syncthing, a tool for syncing files between computers in real time, for cases where having the files synced correctly is really important. It works perfectly for me, but it has a few downsides:
As it is peer-to-peer, the two devices between which the data should be synced have to be online at the same time. You can mitigate this by running Syncthing on a server such as a NAS. (It runs on almost anything.)
Unlike with OneDrive, you cannot keep only a part downloaded. You want to sync? You have to download everything. Everywhere.
As for using this with darktable, I haven’t tried it yet, but in Syncthing you can choose any folder, so even a folder hidden in Windows’ AppData can be synced. So I guess you could sync the folder in which the darktable database resides.
Edit: when testing this out, the absence of data loss is not guaranteed. Please always make backups. And be aware that Syncthing is NOT a backup, just like how RAID in servers is not a backup.
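For what it’s worth, a sketch of the folders one would point Syncthing at. These are the usual default darktable config locations; treat them as assumptions to verify on your own machines:

```shell
# Usual darktable config directories (contain library.db and data.db);
# verify on your own install, these are defaults, not guarantees:
#   Linux:   ~/.config/darktable
#   macOS:   ~/.config/darktable
#   Windows: %LOCALAPPDATA%\darktable
#
# In Syncthing, add that directory as a shared folder on both machines
# (usually via the web GUI at http://127.0.0.1:8384). Pause syncing while
# darktable is running: the SQLite databases change constantly, and syncing
# them mid-session is a good way to produce conflict copies.
```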
The database keeps track of what is in there, so it knows what has been added and removed. Only the XMPs of images currently in the database are looked for at startup. So if you create a new XMP on the other machine, it won’t be picked up unless you add it manually. The same goes for duplicates and removals.
The database also stores the full path to the image files, so if you mounted everything in the same location, putting your database on the external drive would probably work, provided you invoked dt to read the db from your custom location. But that won’t work across two different OSes when one of them is Windows: since the db stores the full path to the image, the location will never be the same if one OS is Windows and the other Linux/macOS.
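A minimal sketch of why full-path storage breaks cross-OS sharing. The schema below is a simplified stand-in for darktable's library.db (the real database does record an absolute folder per film roll, but the exact table layout here is an assumption for illustration, as are the mount points):

```python
import sqlite3

# Simplified stand-in for darktable's library.db: film rolls are stored
# with their absolute folder path.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE film_rolls (id INTEGER PRIMARY KEY, folder TEXT)")

# The same physical folder on the SSD, as each OS would record it
# (hypothetical mount points):
con.execute("INSERT INTO film_rolls (folder) VALUES ('E:\\Photos\\2024-trip')")
con.execute("INSERT INTO film_rolls (folder) VALUES ('/Volumes/SSD/Photos/2024-trip')")

for _id, folder in con.execute("SELECT id, folder FROM film_rolls ORDER BY id"):
    print(folder)
# One physical folder, two incompatible absolute paths: a database carried
# to the other OS cannot resolve the files under the path it stored.
```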
This comes up when there is a difference between the XMP and the database edit. So you’re choosing to keep whatever edit is in the database. If your XMP was updated outside the database (like from another dt install), then you’re choosing to throw away the edit in that XMP file.
Well, as I understand it, the best option for me would be to keep both: the database and the last edits in the XMPs. I believed that “keep database edit” would take the last edit in the database (added and deleted XMPs), but would also pick up the latest version of each XMP and its changes.
There is no way to keep both. If you choose XMP, the XMP is read and inserted into the database. If you choose database, darktable will overwrite the XMP.
I understand.
I am thinking the developer team might consider adding a further option in one of the next upgrades or updates?
So, if I have changed too much in my database and XMPs, what remains for me is to remove the OLD film roll and add the same film roll again in darktable when I move to the next computer?
It’s a little “awkward”, but probably the only solution for now?
Another question: how can I contact somebody from darktable development? Is there any link or email?
Not so much a matter of “too much” as of the kind of changes: adding (or removing?) files isn’t seen by the startup check.
One option would be not to use a database at all, and to start darktable with the --library :memory: option. The downside is that you have to import the files you want to work on every time, and you cannot do any searches in your collection.
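A sketch of that invocation. The `--library :memory:` and `--configdir` flags are documented darktable options; the SSD path in the second variant is a hypothetical addition, not something from this thread:

```shell
# In-memory library: nothing is remembered between sessions, but all
# edits still land in the XMP sidecars next to the images on the SSD,
# so they travel with the drive.
darktable --library :memory:

# Hypothetical variant: also keep the config directory (data.db with
# presets and styles) on the external drive, so those travel too.
darktable --library :memory: --configdir /Volumes/SSD/darktable-config
```

Note that the `/Volumes/...` path only works on macOS; on Windows the same drive would appear under a drive letter, so `--configdir` would need a different path per OS.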
And that still leaves you with a possible discrepancy concerning presets and styles…