Good to hear. Writing "feature requests" (and that is essentially what my post is) is always a bit tricky, since some developers get upset by posts from users who "always want their features implemented but never contribute back". So again, thanks for asking!
Wow, next time I'll read the manual first, I promise. I guess you just gained a new user; that is something my file manager can't do.
I was thinking of something simpler, e.g. letting the user double-check paths before transferring files. In the current version I see the origin and the base path the files will be transferred to; displaying an example subdirectory and an example file name, both based on the actual configuration (say, derived from the first file), would already be enough. As in the mockup below:
I guess this is reliable enough. Thanks for the info!
The backup feature is a nice thing, but if the backup goes to a remote location (an NFS share, manually copied, or whatever), bit rot may leave you with different versions of the same file. To check file integrity both remotely in the backups and locally, I use checksum files in every directory. These are generated after downloading the images and always travel with them to the backups etc. It would be great if this could be automated. At the moment I run the following script, "md5new", after downloading the pictures:
touch md5sum.txt
IFS="$(echo -e "\n\r")"
for EACHFILE in $(ls -1 | grep -v md5sum.txt | grep -v ".xmp$")
do
    if ! egrep -q " $EACHFILE$" md5sum.txt; then
        md5sum "$EACHFILE" | tee -a md5sum.txt
    fi
done
This gives me a checksum file covering all images and additional data, but excluding the .xmp files generated by darktable. Running this script automatically after every import in RPD (that means having a hook in RPD where I can register a script to be run after import) or, even better, additionally allowing RPD to generate the md5 sums itself and save them with the images (in the format I explained in my last post), would be a great feature.
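To illustrate what such a hook could do, here is a minimal Python sketch that mirrors my shell script. The interface is entirely my assumption: I am pretending RPD would call the script with the download directory as its first argument; neither that hook nor the `update_md5sums` helper exists in RPD today.

```python
#!/usr/bin/env python3
# Hypothetical post-download hook: assumes RPD would invoke this with the
# download directory as the first argument (that interface is my invention).
import hashlib
import os
import sys


def update_md5sums(directory):
    """Append md5 lines for files not yet listed in md5sum.txt,
    skipping .xmp sidecars and the checksum file itself."""
    listing = os.path.join(directory, "md5sum.txt")
    known = set()
    if os.path.exists(listing):
        with open(listing) as f:
            # each existing line looks like "<hexdigest>  <filename>"
            known = {line.rstrip("\n").split("  ", 1)[1]
                     for line in f if "  " in line}
    with open(listing, "a") as out:
        for name in sorted(os.listdir(directory)):
            if name == "md5sum.txt" or name.endswith(".xmp") or name in known:
                continue
            path = os.path.join(directory, name)
            if not os.path.isfile(path):
                continue
            h = hashlib.md5()
            with open(path, "rb") as f:
                # hash in 1 MiB chunks so large raw files don't fill memory
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            out.write(f"{h.hexdigest()}  {name}\n")


if __name__ == "__main__" and len(sys.argv) > 1:
    update_md5sums(sys.argv[1])
```

The output format matches what `md5sum -c md5sum.txt` expects, so the remote backups can still be verified with plain coreutils.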
Since you are already storing time stamp, size etc. of downloaded images in an sqlite database, additionally storing checksums in the DB (configurable, only on user request) after download would be extremely handy. Maybe in a DB field "checksum" that contains a string able to hold more than one checksum, e.g. "md5:68b329da9893e34099c7d8ad5cb9c940 sha1:adc83b19e793491b1c6ea0fd8b46cd9f32e592fc sha256:01ba4719c80b6fe911b091a7c05124b64eeece964e09c058ef8f9805daca546b". And as user interface something like this:
With the options "per folder", "per file" and "in database". I guess that would cover each and every checksum workflow.
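To make the "in database" idea concrete, here is a small Python sketch of how the multi-algorithm string I described could be generated and read back. The function names and the schema comment are my own illustration, not RPD's actual code or table layout:

```python
import hashlib


def checksum_string(path, algorithms=("md5", "sha1", "sha256")):
    """Build an "algo:hexdigest algo:hexdigest ..." string for one file,
    reading it once and feeding every hash from the same chunks."""
    hashes = {name: hashlib.new(name) for name in algorithms}
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            for h in hashes.values():
                h.update(chunk)
    return " ".join(f"{name}:{h.hexdigest()}" for name, h in hashes.items())


def parse_checksums(value):
    """Split the stored string back into an {algorithm: hexdigest} dict."""
    return dict(item.split(":", 1) for item in value.split())


# Hypothetical schema sketch -- a nullable "checksum" column next to the
# per-file metadata RPD already records (not the program's real layout):
# CREATE TABLE downloads (filename TEXT, mtime REAL, size INTEGER,
#                         checksum TEXT)
```

Reading the file once while updating all digests keeps the cost close to a single-algorithm run, and the space-separated "algo:hex" format stays greppable from the sqlite shell.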
Thanks again for listening (reading) and for your software.
Best regards, Chris