Workflow with slow/fast disk

Hi everyone,

I got a "big" SATA disk to store all my photos from way back. It's a 10 TB external disk.

Currently I'm working on very large files (JPG and RAW), so it's super slow. I can't afford to buy 10 TB of SSD, but I do have a 1 TB internal SSD that is pretty fast.

Is there any way I can copy (e.g. with rsync) a small part of my photos onto my laptop SSD, work on it, and rsync it back to the external disk… so that when I open darktable on the external disk I get everything?
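Something like this is what I have in mind (a rough sketch; the paths are made up):

```bash
# pull one shoot from the big external HDD to the fast internal SSD
rsync -av /media/external/photos/2021-07-shoot/ ~/photos-work/2021-07-shoot/

# ... edit the SSD copy in darktable ...

# push the files (and the XMP sidecars that hold the edits) back to the HDD
rsync -av ~/photos-work/2021-07-shoot/ /media/external/photos/2021-07-shoot/
```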

Thanks.

Z

you may want to read here: darktable 3.6 user manual - local copies


I have all my images (and darktable too) on an external 1 TB HDD connected via USB and it runs totally fine. My images can be well over 100 MB.

Look at the connection. Something ain’t right.

Since you have a 10 TB HDD I presume it's a rather modern drive. Look for a fast USB port - at both ends. If you have Tiger Lake, look into Thunderbolt. Or connect the HDD directly with some type of SATA cable.
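On Linux you can check what the drive actually negotiated (device names will differ on your system):

```bash
# show the USB tree with negotiated speeds (480M = USB 2.0, 5000M = USB 3.0)
lsusb -t

# quick sequential read benchmark of the external drive (replace /dev/sdX)
sudo hdparm -t /dev/sdX
```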


… the disk io times totally make a difference, even internal sata vs nvme ssd. unfortunately dt’s pipeline wastes a cycle or two when generating thumbnails, so you don’t notice it quite as much. but ssd or not can totally make a difference whether you wait 20ms or 200ms for your thumbnail.

i agree during usual work on an image in darkroom mode it should not make a difference at all (input will be cached). i think the local copy feature is important mainly because of a workflow where you would disconnect your external usb drives and work on the laptop somewhere else.

In addition to local copies (which @hanatos pointed out above), if you’re having slowdowns while browsing in lighttable mode, you might want to also look at the darktable-generate-cache command.

It's supposed to be run in a terminal, and it generates thumbnails as a cache on your computer. You'll want to make sure darktable itself is closed when you run this command… and as it takes a long time, you'll want to run it overnight. It will pick up where it left off if you stop it with Ctrl-C.

(Read the manual page for darktable-generate-cache for more details. It doesn’t need options, but you can use a few to specify a start and/or stop point, if you want to just generate cached thumbnails for a range. The info pane in darktable shows you the darktable IDs, so you can target a set you’ll be working on if you’d like.)
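For example (the image-id range below is made up; check the man page, as the available options can vary between versions):

```bash
# with darktable closed, pre-generate thumbnails for the whole library
darktable-generate-cache

# or only for a range of darktable image IDs you plan to work on
darktable-generate-cache --min-imgid 1000 --max-imgid 2000
```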

BTW: I’m using a NAS with some hard disks in a RAID for my photos and find darktable fast enough to be usable despite the hit, especially if I sync some photos locally and pre-generate thumbnails. But even without that, it’s mostly fine.

(I'm also using an Intel GPU with OpenCL, so I guess I might just be used to slowness? :grin: But darktable is no slower than Lightroom was, at least. And there are performance improvements with darktable every release, thankfully.)

Thanks for the answer, sorry for the delay.

Well… my images weigh more than 200 MB each :wink:

No, everything works fine. It's just that I have an ultra-fast internal PCIe SSD.

Thank you very much. I will do that.

Nice!!! Thank you.