Newbie workflow question

Hi

I’m very new to darktable and haven’t used it yet. I’m coming from Olympus Workspace and Adobe CS2. I have an initial question about workflow and how to work in darktable.

My current process is very time-consuming. I mostly shoot wildlife, so I end up with thousands of files from a shoot, often abroad where I only have a Windows Surface PC. What I normally do is:

  1. I shoot JPEG and raw.
  2. I save each day’s photos locally to the laptop and back up to an external hard drive. I don’t tend to edit until I get home, as I don’t have time.
  3. When I get home, I tend to do all my processing on the files on the external hard drive using my laptop. I just got a new laptop with a 1 TB SSD, so I’ll most likely copy the files off the tablet/external drive and work on them on the laptop hard drive.
  4. I have tended to create individual subfolders to group the day’s images into smaller batches to work through.
  5. I give the best a star rating, sort those into a separate folder and then delete the rest off the drive.
  6. I then process the best images as needed.
  7. At the end of the shoot, once I have finished processing the best images, I upload the best of the best to Flickr.
  8. Then I clear them off the laptop/external hard drive onto our RAID array storage, which is housed in a desktop PC.

My question is: what’s the best way of streamlining this workflow using darktable? I’ve been watching Bruce Williams’ video tutorials, and the “copy local” function seems close, but it seems to do the steps almost backwards compared to what I’m used to. If I use my laptop hard drive as the main repository, how do I back up to the RAID array and keep the files accessible if I need to revisit them? Are there any tips for managing large volumes of files created pre-darktable?

Many thanks

Jo

Welcome!
Not sure if this fits your large-volume editing tasks, but I store all my image files on my local server, using Rapid Photo Downloader (a neat program!).
Then I open darktable on my notebook and import the folder I’ve just created with RPD.
Sometimes I do a local copy, sometimes I don’t.
In my first NAS setup, I first ran darktable from a terminal just to create thumbnails on my notebook, thus speeding up the lighttable view.
In the end, I have darktable’s database on my notebook, as well as the thumbnails, with the images and their respective XMP sidecar files on the server.
But I’m not sure how it would behave if I wanted to edit in large volumes at once (e.g. by copying/pasting edits in the lighttable).
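(For reference, darktable also ships a small command-line tool for exactly this thumbnail pre-generation step; a minimal sketch, assuming a default install and guarded so it does nothing if darktable isn’t on the PATH:)

```shell
# Pre-build lighttable thumbnails for images already in the library
# database, so browsing stays fast even when the raws live on a NAS.
# No-op when darktable is not installed; see
# `darktable-generate-cache --help` for the thumbnail-size options.
command -v darktable-generate-cache >/dev/null 2>&1 &&
    darktable-generate-cache
```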
Let’s wait to see what others say about your case.
EDIT: From time to time, I back up the database and config files, which sit in ~/.config/darktable on Linux, or in the corresponding AppData folder on Windows (I forget the exact name).
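That backup can be a one-liner script; here is a rough sketch, where the source path is the Linux default and the destination is entirely up to you (both paths in the usage comment are hypothetical):

```shell
#!/bin/sh
# Minimal sketch of a darktable database/config backup. The source is
# the Linux default (~/.config/darktable); adjust for your system --
# on Windows darktable keeps these files under AppData instead.
backup_darktable_config() {
    src="$1"   # e.g. "$HOME/.config/darktable"
    dest="$2"  # e.g. a dated folder on the RAID array
    [ -d "$src" ] || { echo "no darktable config at $src" >&2; return 1; }
    mkdir -p "$dest" || return 1
    # -p preserves timestamps; the trailing "/." copies the contents
    cp -rp "$src"/. "$dest"/
}

# Hypothetical usage:
#   backup_darktable_config "$HOME/.config/darktable" \
#       "/mnt/raid/backups/darktable-$(date +%Y%m%d)"
```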

I don’t think you can streamline that workflow very much using any program (or I lack the imagination, because I do mostly the same :stuck_out_tongue_closed_eyes:). A couple of points:

  • step 4: you could replace it in darktable by applying color labels or tags, and then filtering by them using the collections module (you don’t need to move files between folders)
  • step 5: same as above: you can filter by star rating, see and work on only those, and delete the rest
  • step 8: I have my own solution, which is to remove the images from the main library, copy them by hand to the server, and then import them into a different database (darktable’s --library option). That way I still have access to them if I need to rework some image, but they don’t usually pollute my daily working library. You can then use this ‘archived’ library in combination with the local copies feature to avoid network slowness.
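As a sketch of that archive-to-a-second-library idea (all paths and the archive database filename are assumptions, not darktable defaults):

```shell
#!/bin/sh
# Copies finished raws *together with* their .xmp sidecars, so
# darktable can rebuild the edit history when the images are imported
# into the archive database. All paths here are hypothetical.
archive_shoot() {
    src="$1"; dest="$2"
    mkdir -p "$dest" || return 1
    # Keep each sidecar next to its raw file.
    cp -p "$src"/* "$dest"/
}

# Hypothetical usage -- then open the archive with its own database:
#   archive_shoot "$HOME/Pictures/2024-kenya" /mnt/raid/photos/2024-kenya
#   darktable --library "$HOME/.config/darktable/archive.db" \
#       /mnt/raid/photos/2024-kenya
```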

I wonder if, for this kind of workflow, it wouldn’t be better to work without the image database (darktable --library :memory:), with XMP sidecars enabled, of course. That will partly depend on how many darktable sessions one batch takes (startup is probably faster when using an image database).
Of course, as soon as you start relying on sidecar files, you have to be careful to keep them with the original image file…
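The database-less session is just a launch flag; a guarded sketch, where the folder argument is a placeholder for whatever batch you want to work on:

```shell
# Start darktable with its library held only in RAM: nothing is
# written to library.db, and edits survive only via the .xmp sidecars
# (so sidecar writing must be enabled, which is the default).
# "/path/to/batch" is a placeholder, and the command is a no-op if
# darktable is not installed.
command -v darktable >/dev/null 2>&1 &&
    darktable --library :memory: /path/to/batch
```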

It also depends on how the PC with the RAID array is used: is it purely a backup, or is it a workstation? If the latter, having a darktable install there with a database can be advantageous, as it allows you to search on tags (location, species, …).

Concerning the “pre-darktable” files: with your workflow, you can’t keep the darktable database up to date, unless your RAID PC is a workstation (in which case you need another backup…). Also, I don’t see anything about tagging in your original description, so I don’t see any difference between pre-darktable and darktable files.

And is there any particular reason to shoot raw+JPEG? It’s another link between files that has to be kept intact, and I don’t see anything in your workflow that indicates a use for the JPEGs.

Hi, thanks for the advice. I shoot raw and JPEG as I mostly just use the JPEGs, unless I want to edit them, in which case I use the raw. I also sometimes use the in-camera crop, so having both is useful.

The RAID array is purely for backup storage; it’s not powerful enough to use as a workstation…