I am on holiday, and would like to do some preliminary culling (just exposure, lens correction, sigmoid) on a slow laptop with no GPU.
Is there a way to make Darktable render the next 10-15 images and cache the result while I am eyeballing the current one? Using DT 4.8, on Linux. I have 16 GB RAM.
prefer performance over quality
Enable this option to render thumbnails and previews at a lower quality. This increases the rendering speed by a factor of 4
reduce resolution of preview image
Reduce the resolution of the navigation preview image (choose from "original", "1/2", "1/3" or "1/4" size). This may improve the speed of the rendering.
Set a fast algorithm for "pixel interpolator (warp)" and "pixel interpolator (scaling)".
Set the demosaicing algorithm to PPG on one image and copy that setting to all the others. You can revert it later.
Thanks, it was a cache issue. Specifically, the thumbnail resolution I set in preferences was lower than what my current monitor displays, so it had to be recomputed. Fixed now.
It is still not clear to me how the (disk) cache works. I am now rerunning darktable-generate-cache, and it seems to be generating all thumbnails. Is there a way to limit it to, say, the images taken in the last month, or to a specific folder? The ideal situation would be to have thumbnails generated only for the images I have visited in the last month or so, and have them automatically garbage-collected after that time passes.
My understanding is that the thumbnails cache on the disk has no limit, so I am not sure how to keep it in check. It has been running for an hour now and already wrote a few gigabytes.
You can specify a range of image ids to limit cache generation.
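A minimal sketch of that, assuming the `--min-imgid` / `--max-imgid` / `--max-mip` options listed by `darktable-generate-cache --help` on your build; the ids here are made up, so look up the real ones for your current roll first (e.g. in the lighttable's image information panel):

```shell
#!/bin/sh
# Hypothetical id range for the current roll -- replace with real ids.
MIN_ID=34000
MAX_ID=35000

# --max-mip caps the largest thumbnail size that gets pre-rendered;
# lower mip levels write far less to disk. darktable must not be
# running while this executes.
if command -v darktable-generate-cache >/dev/null 2>&1; then
  darktable-generate-cache --min-imgid "$MIN_ID" --max-imgid "$MAX_ID" --max-mip 5
else
  echo "darktable-generate-cache not found; would process ids $MIN_ID-$MAX_ID"
fi
```

Running this once after importing a trip pre-warms the cache for just that batch instead of all 35k images.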
I find this a bit clumsy, since consecutive image ids don't necessarily represent the collection I'm currently working on.
As @vbs stated earlier: you may want to use the crawler instead
I apologize for my ignorance, but I don't know what the "crawler" is, or how to enable it. Searching the manual for "crawler" yields no results.
In discussions I also see mentions of the "backcrawler"; is that related?
Enable "generate thumbnails in background" in preferences, and the crawler will automatically generate thumbnails as a background activity whenever there has been no user interaction for at least 5 s (you can change that value in darktablerc).
Thanks, but will this generate thumbnails for all images in my collection eventually? That is not something I really want.
For most images I have pretty much finished editing, so I don't want them to have thumbnails; I have about 35k images in darktable, and work on about 1k-2k at a time (reducing that to below 200, or ideally 100, after I get back from a trip). Making high-res thumbnails for the rest is a waste of CPU and disk space.
If you insist on the thumbnail cache being temporary, you probably want to disable the secondary disk cache entirely (this might decrease lighttable performance), or otherwise manage your cache directory at the filesystem level.
Subject to the *atime family of mount options; relatime is the default:
relatime
Update inode access times relative to modify or change time. Access time is only updated if the previous access time was earlier than or equal to the current modify or change time. (Similar to noatime, but it doesn't break mutt(1) or other applications that need to know if a file has been read since the last time it was modified.)
Since Linux 2.6.30, the kernel defaults to the behavior provided by this option (unless noatime was specified), and the strictatime option is required to obtain traditional semantics. In addition, since Linux 2.6.30, the file's last access time is always updated if it is more than 1 day old.
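Under relatime semantics you can use access times to garbage-collect thumbnails you have not viewed in a while. A hedged sketch, assuming the on-disk cache lives under `~/.cache/darktable` as directories of small `.jpg` files; verify the path on your system first, and only run this while darktable is closed (it will simply regenerate any thumbnail it misses):

```shell
#!/bin/sh
# Assumed cache location -- check where your darktable actually
# writes its mipmap cache before pointing this at it.
CACHE_DIR="${DT_CACHE_DIR:-$HOME/.cache/darktable}"
DAYS=30   # delete thumbnails not accessed in the last 30 days

if [ -d "$CACHE_DIR" ]; then
  # -atime +30 matches files last read more than 30 days ago.
  find "$CACHE_DIR" -name '*.jpg' -atime "+$DAYS" -print -delete
else
  echo "no cache directory at $CACHE_DIR"
fi
```

Run from cron or a systemd timer, this gives roughly the "auto-GC after a month" behaviour you describe, at the filesystem level rather than inside darktable.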
In my opinion, you most likely don't need all 35k images permanently imported in DT.
Given the constraints:
slow laptop
working on 1k-2k images at a time (preferably 100-200)
not wanting to waste extra disk space and CPU power
I would remove the extra images so DT does not look at them. The processing is still stored in the .xmp sidecars, and the images can be re-imported when needed.
When we want to see the images in the lighttable, we need the thumbnails; we just have to decide how big.
If the thumbnails are generated ahead of time, we lose disk space.
To generate them ahead of time, we have to use the CPU while the computer is idle (so they can be ready).
If we don't want to lose the disk space or use CPU power while the computer is idle, then we have to use the CPU when viewing.
Not sure how everything can be fulfilled without sacrificing at least one item.
If we assume we are working with unprocessed images and rely on the built-in preview, then in theory no thumbnails need to be generated at all. But once the images are processed, the thumbnail will be based on the processed image, not on the embedded .jpg.
Also, in 4.8 a workflow based on the embedded .jpg is slow because of
Not sure what I am missing, but it is almost like a Catch-22.
If I were in your shoes, I would likely remove most of the images and just import batches, moving images between folders (processed / not processed / working), something like this.
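A sketch of that folder shuffle, keeping each raw together with its `.xmp` sidecar so the edits survive a later re-import. The paths and the `.RAF` extension are placeholders for whatever your camera produces; remove the images from darktable's collection first (or use darktable's own "move" action, which handles sidecars for you):

```shell
#!/bin/sh
# Hypothetical working/processed folders -- adjust to your layout.
SRC="$HOME/Pictures/working"
DST="$HOME/Pictures/processed"
mkdir -p "$DST"

for raw in "$SRC"/*.RAF; do
  [ -e "$raw" ] || continue          # skip if the glob matched nothing
  mv "$raw" "$DST"/
  # darktable names sidecars <name>.<ext>.xmp next to the raw file.
  if [ -e "$raw.xmp" ]; then
    mv "$raw.xmp" "$DST"/
  fi
done
```

Re-importing the "working" folder later picks the edit history back up from the sidecars, so nothing is lost by keeping the darktable library small.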