For my workflow, it would be ideal to have the entire DT database, image + sidecar files, as well as all config, stored on an encrypted 2 TB external drive, as I could conveniently mount it on my 2 or 3 macOS machines and use it when and as I need it. In addition, it would be super easy to back it up and store it off-site periodically.
Putting the database on anything slower than an internal SSD will likely lead to darktable feeling exceptionally slow. Image files and sidecar files on an external drive are fine. You should look at darktable's 'copy locally' feature.
Another piece of software, which I dare not mention here, allowed me to store the DB on each external hard drive I had; when I attached a drive to one of my computers I could then ask the program to open the database located on that drive. This was a very useful feature for my purposes. I too would like to be able to do this with DT but have not worked out how. I feel it would require an option in the lighttable to open a specified DB. Maybe this can already be done and I just don't know how. I am not willing to put in a feature request for this, because if the developers thought it was a good idea I am sure they would have already implemented it. I mainly run Windows, if that's relevant.
Some time ago I made a portable version for Windows. I installed DT on my local machine and then added a config folder to the install directory. Then I ran a little batch file that just launched DT pointing at that config. Done this way, you could unzip it and run it from anywhere, including a USB drive. I suspect you can do the same on a Mac. The key was to add the config folder and then run DT from a batch file with the --configdir parameter pointing to that local config folder.
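On macOS or Linux the same trick is just a small shell script instead of a batch file; a rough, untested sketch, with the app path and folder names only as examples:

    #!/bin/sh
    # Launch darktable with a config folder that sits next to this script,
    # e.g. on the external/USB drive itself.
    HERE="$(cd "$(dirname "$0")" && pwd)"
    exec /Applications/darktable.app/Contents/MacOS/darktable \
        --configdir "$HERE/darktable-config"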
Yes, there are command-line switches to set the locations of the files and directories that darktable uses:
--configdir <config directory>
Define the directory where darktable stores user-specific configuration. The default location is $HOME/.config/darktable/.
This way, you can put both databases (library.db and data.db), as well as the configuration file, on the external disk.
Any LUTs and watermarks would also need to be moved, and you'd have to make sure the external disk is mounted under the same path on all machines, otherwise the image paths stored in library.db, and the paths to LUTs and watermarks, would become inaccessible. But be warned: the same config may not work well on all machines (e.g. different screen resolutions if you configure the user CSS or font size, or different resource settings).
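On macOS the mount path is usually not a problem, because external volumes appear under /Volumes/<volume name>, so giving the drive the same volume name keeps the path identical across machines. A small launch script can check that the drive is actually there before starting darktable (volume and folder names below are just placeholders):

    #!/bin/sh
    # Refuse to start darktable if the external drive is not mounted,
    # so library.db never ends up looking for images under a wrong path.
    DT_ROOT="/Volumes/PhotoDrive/darktable"
    if [ ! -d "$DT_ROOT/config" ]; then
        echo "External drive not mounted at $DT_ROOT" >&2
        exit 1
    fi
    exec /Applications/darktable.app/Contents/MacOS/darktable --configdir "$DT_ROOT/config"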
Another option is to only move the library.db, but then your module presets, styles, tags and locations (which are stored in data.db) would not be shared:
--library <library file>
darktable keeps image information in an sqlite database for fast access. The default location of that database file is library.db in the directory specified by --configdir, or $HOME/.config/darktable/ if that is not given. Use this option to provide an alternative location […]
There is no such configuration option for data.db (--datadir is not the right option).
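So if you only want the image database on the external drive, keeping the config and data.db local, the call would look something like this (placeholder path):

    /Applications/darktable.app/Contents/MacOS/darktable \
        --library /Volumes/PhotoDrive/darktable/library.db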
Thx all for the info. I tested --configdir and it works just fine. I was wondering, though, whether we could keep the cache local by setting --cachedir, for improved performance.
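--cachedir is a real option, so I'm thinking of something like this (placeholder paths, not tested yet):

    # config (and the two databases) on the external drive,
    # thumbnail/mipmap cache kept on the internal disk
    /Applications/darktable.app/Contents/MacOS/darktable \
        --configdir /Volumes/PhotoDrive/darktable/config \
        --cachedir "$HOME/.cache/darktable"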
Now, one thing I could not find after extensive searching is how to permanently replace the default $HOME/Pictures/Darktable destination in lighttable->import->'copy & import', so I do not have to worry about it at every import.
However, I have been unable to find out how GTK resolves those paths. I remember reading something about this on the forum quite recently (someone had to override some variable), but I cannot find it now.
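My guess is that the Pictures folder comes from GLib's special-folder lookup, which on Linux reads ~/.config/user-dirs.dirs, so the variable people overrode was probably XDG_PICTURES_DIR; I have not been able to confirm whether the same applies on macOS. For reference, such an entry looks like:

    # ~/.config/user-dirs.dirs (Linux); GLib/GTK reads this for the Pictures folder
    XDG_PICTURES_DIR="$HOME/Pictures"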