Working with two libraries / or another approach

Hi Everyone,
In my workflow I usually import the raw files onto my laptop (which has limited space). Then I apply my darktable workflow and make .jpg exports of the pictures I want to have available for quick access. These .jpgs are copied to a specific folder which is watched by Shotwell.
Finally, the raw files with the sidecars are moved to my server and backups are created.
Problem: when I move the pictures away from the original import location, darktable shows the skull, which is all right. What I would like to have is a second instance of the library which I would use to import all the pictures on my server. The server is usually not turned on, so always working from the server is not a suitable option. Besides, working locally is much faster.
Looking forward to reading your proposals.
Daniel

Given that you are moving the raw files and the sidecars, what is the purpose of the second library database on the server?

Note that you also have the facility, within darktable, to ‘move’ files to another location, which could be your server. Assuming you have your server mapped to a consistent location from your laptop, you then only need one library database, which references some files on your laptop and some on your server.
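For the consistent mapping, one option is a fixed /etc/fstab entry so the share always lands on the same path. This is just a sketch; the host name, export path, and mount point are all assumptions:

  # /etc/fstab - map the server share to a fixed local path
  # "user" lets a non-root user mount it; "noauto" skips mounting at boot
  server:/export/photos  /mnt/photoserver  nfs  noauto,user  0  0

With that in place, the single library database can reference files under /mnt/photoserver regardless of when the server happens to be switched on.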

I will try that move command … I was not aware of it. Sounds like exactly what I was looking for. Thanks a lot.

I have tried the move command and it works as intended. The problem is that it does not copy my folders and the substructure I usually create. My .jpg files are usually created in a subfolder called “export”, which is not duplicated when I move the files.
Does anyone have advice on a good workflow in this regard?

All the move command will do is move your raws and sidecar files to another location. It will not do anything with your exported files as they are not part of the darktable library. Nothing stopping you copying them across manually though.

If you do this, your process would be as follows (a command sketch follows the list):

  • Manually create target directory structure on server
  • Manually copy(/move?) the ‘export’ folder and its contents to the created directory
  • Move the raw/xmp files using the darktable ‘move’ command
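
In commands, that might look roughly like this; every path here is hypothetical:

  # 1. recreate the target directory structure on the server
  mkdir -p /mnt/photoserver/2020/2020-06-trip
  # 2. copy the exported JPGs across by hand
  cp -r ~/Pictures/2020/2020-06-trip/export /mnt/photoserver/2020/2020-06-trip/
  # 3. finally move the raws/XMPs with darktable's own 'move' action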

Personally, for me, the raw/sidecar files are sufficient so if I’m moving things to a server, I’ll just move the entire folder structure and not worry about the darktable library (since the sidecar files contain all the info you need anyway).
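
If you take that route, a single rsync call can move the whole tree in one pass; the paths here are made up:

  # mirror the local tree to the server; --remove-source-files deletes each
  # file (but not the directories) once it has been transferred
  rsync -av --remove-source-files ~/Pictures/2020/ /mnt/photoserver/2020/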

What you mention is exactly my workflow so far. But with this approach I can never search for any picture within the darktable library, which would be kind of nice with the new timeline and everything.
Anyway, many thanks for these clarifications.

You can also duplicate your folder structure with placeholders when exporting. I have my raw files in D:\Photos and I export to D:\PhotoEdits. Using this syntax, the export creates a mirror folder structure in the export directory (you could of course tweak this for your server). It’s not a library solution, but it does keep your files ordered the same as the originals:

$(FILE_FOLDER/Photos/PhotoEdits)/$(FILE_NAME)-string

The “string” part can be any additional text you might want to add. Say you export two sets with slightly different settings: change the word “string” to describe that, or just omit it if you don’t want any text beyond the file name. (The forum was stripping the dollar signs when I first posted; they belong in front of the brackets as shown above.)
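
To make the substitution concrete, here is a hypothetical input/output pair for that pattern:

  # pattern:   $(FILE_FOLDER/Photos/PhotoEdits)/$(FILE_NAME)-string
  # input raw: D:\Photos\2020\holiday\IMG_0001.CR2
  # export:    D:\PhotoEdits\2020\holiday\IMG_0001-string.jpg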

As for the slow speed when working on files on a server: you can speed things up by keeping your thumbnails on your local SSD and your RAWs on your server, and by using a fast file-sharing protocol.

If you are using Linux or macOS, I recommend sharing the files using NFS instead of SMB (Windows file sharing). NFS is not encrypted by default, so there is little protocol overhead: file reads will generally be as fast as your server can read the files or your network can send them. I saturate my gigabit connection with NFS.

My raw file library is located on a FreeNAS box. It takes me only 1/4 of a second to read one RAW file when opening it in darkroom mode. I would not notice a difference if the RAW file was on my local disc. All the thumbnails are read from my local SSD, which makes browsing through lighttable view super fast.
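
As a sketch of that kind of setup (the host name, export, and paths are assumptions): the raws live on an NFS mount, while the thumbnail cache stays on the local SSD, which is where darktable puts it by default:

  # /etc/fstab - raw files served from the NAS
  nas:/mnt/tank/photos  /mnt/photos  nfs  defaults  0  0

  # the cache location can also be set explicitly if your SSD is elsewhere
  darktable --cachedir "$HOME/.cache/darktable"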

Hi Daniel,
I follow more or less the same approach. What I do is to have a second library for the processed files, separated from the default one.
The steps I follow are (there is a command sketch after the list):

  • export JPG files and move them to a folder watched by digiKam
  • remove the RAW files from darktable’s default database (remove option, not delete)
  • exit darktable
  • move the processed RAWs and XMPs to their final long term location
  • start darktable from the command line with the option --library <library file>, pointing to the alternative library
  • import the processed files into this library
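
Roughly, the last three steps in shell form; the library and photo paths are made up:

  # move the finished raws and sidecars to their long-term home
  mv ~/Pictures/2020-06-trip /mnt/server/photos/
  # start darktable against the alternative library and import the moved folder
  darktable --library ~/archive-library.db /mnt/server/photos/2020-06-trip/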

This way, I can keep the processed files in a darktable library that I can easily access if I want to re-export or re-edit something, but I keep the default library small and clean for when I’m editing new files.

The other option you may want to check is the ‘local copy’ feature, which allows you to keep all your files on a slow or sometimes-inaccessible drive and make a local copy of the files you want to work on.

Regards,
Guillermo

This sounds very useful. Do you know if the library file is just one file, or a folder with many files in it?
I think I will just create some scripts to do the work of moving and copying, and have two start icons for darktable.
Thank you very much for this great proposal.
Daniel

The library is just a database (library.db, I think)
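
For reference, these are the files I understand to matter under ~/.config/darktable on a stock Linux install (not exhaustive):

  # library.db  - the image library: film rolls and per-image data
  # data.db     - shared across libraries: styles, presets, tag definitions
  # darktablerc - the plain-text configuration file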

@elstoc @DanielLikesDT FYI: The configuration and operation files of darktable.pdf (183.9 KB)

Thanks. Just out of curiosity, what’s the source of this file?

BTW on a related note, I just discovered (in /usr/share/doc/darktable) a file called darktablerc.html that describes most of the settings in the config file.

Yep, that’s a good one. Most entries come directly from the preferences tabs, but there are a few that are not exposed there. The file is translated from a French author; darktable.fr is an awesome site if you can read French. I have translated some material and watched some videos with translated subtitles.


I have tried this approach and it works very well. To summarize:

  • just start darktable from your launcher or whatever you use (I am using Ubuntu) to get to the default library, which is basically empty except for the pictures you are currently working on.
  • use a script or the command line to start darktable with darktable --library <library file> to access a secondary library, which holds the content from the server or wherever the majority of your data is stored (a minimal launcher sketch follows the list).
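
A minimal launcher script for the second case; the .db path is only an example:

  #!/bin/sh
  # start darktable against the secondary (archive) library instead of the default
  exec darktable --library "$HOME/archive/library.db"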

As already pointed out, this approach is very easy to handle and acceptably fast for the local editing work, while it is still possible to check all historical files through the darktable interface.

The only thing I am not sure about is the database and lock files: which of those can be deleted? I will check the proposed file.

How? I mean, the thumbnail part.

Hi, I’m trying to reproduce @guille2306’s workflow to start working with two libraries: one on a remote server (a home-made proto-NAS), the other on my computer for the ongoing work.
I’m getting, though, this error when starting darktable with the --library option:

gustavo@N4050:~$ darktable --library smb://192.168.1.71/fotos/HD-imagens/darktable/library.db
[defaults] found a 64-bit system with 12175304 kb ram and 4 cores (0 atom based)
[defaults] setting high quality defaults
[backup failed] /home/gustavo/.config/darktable/data.db -> /home/gustavo/.config/darktable/data.db-pre-3.1.0
gustavo@N4050:~$ ^C
gustavo@N4050:~$ darktable --library smb://192.168.1.71/fotos/HD-imagens/darktable/library.db-pre-3.1.0
[defaults] found a 64-bit system with 12175304 kb ram and 4 cores (0 atom based)
[defaults] setting high quality defaults
[backup failed] /home/gustavo/.config/darktable/data.db -> /home/gustavo/.config/darktable/data.db-pre-3.1.0

darktable starts empty.
What does “backup failed” mean?
The steps I took:
1 Initial database load
1.1 Detach external HD from NAS and attach it to the computer
1.2 Clear darktable database (rename ~/.config/darktable folder)
1.3 Clear darktable cache (delete ~/.cache/darktable)
1.4 Open darktable
1.5 Import folder from plugged-in HD
1.6 Close darktable
1.7 Copy ~/.config/darktable folder into plugged-in HD
1.8 Open darktable
1.9 Select all images
1.10 Remove selected
1.11 close darktable
1.12 Detach HD from computer and attach it to the NAS
1.13 Open darktable: darktable --library smb://192.168.1.71/fotos/HD-imagens/darktable/library.db

EDIT: I added the --cachedir option so that darktable fetches thumbnails from my computer (they were previously created in step 1.5 above), and the backup warning/error has gone, but darktable still shows no images:

gustavo@N4050:~$ darktable --library smb://192.168.1.71/fotos/HD-imagens/darktable/library.db --cachedir /home/gustavo/.cache/darktable
gustavo@N4050:~$

Does the smb:// protocol handler work? Why not just mount the share first, then refer to the local path?
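
For example, reusing the share from your output (the mount point and the username option are assumptions):

  # mount the SMB share at a fixed path, owned by your user rather than root
  sudo mkdir -p /mnt/fotos
  sudo mount -t cifs //192.168.1.71/fotos /mnt/fotos -o uid=$(id -u),gid=$(id -g),username=gustavo
  # then point darktable at the local path
  darktable --library /mnt/fotos/HD-imagens/darktable/library.db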

If I understand correctly, you’re generating the database on the computer and then moving the drive and the database together? I don’t know if the database keeps the file paths as relative paths. If they are absolute paths, darktable may be looking for the files on your computer, even though you moved the database.
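
One way to check is to look inside the database; this assumes sqlite3 is installed and relies on darktable’s film_rolls table, which, as far as I know, stores one folder path per imported film roll:

  # list the folder paths recorded in the library (run while darktable is closed);
  # absolute paths from the old machine would explain the empty lighttable
  sqlite3 /path/to/library.db "SELECT folder FROM film_rolls;"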

Of course, thanks!
Now I’m struggling with permission stuff. Since mount can only be issued by root, the local path ends up owned by root, and darktable gives an error (I think it has to do with the local path permissions).

EDIT: chown doesn’t work if I try to change the ownership of that folder.

This thing is starting to get complicated …

Correct.

Hummm… When I read in your workflow

… I could only conclude that the alternative library you’re referring to was the one located in the “final long term location”, which, in my case, would be the network share. If not, then I misunderstood that part of your workflow.