Best backup practices

I have just withdrawn my last (very hasty) post; I will follow up later with a more complete post on backup strategies. The first point of keeping data safe is data organisation: you have to think about how to organise your assets. The second point is versioning: as you change or edit pictures, you need a versioning scheme for your files (there are various methods for doing this).
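One common versioning method (just a sketch, not the only way) is date-stamped snapshot directories made with rsync's --link-dest option: each snapshot looks complete, but unchanged files are hard-linked against the previous snapshot, so they cost almost no extra space. The paths below are throwaway temp directories purely for illustration:

```shell
#!/bin/sh
# Sketch: date-stamped snapshots with rsync --link-dest.
SRC=$(mktemp -d)    # stand-in for your picture directory
DEST=$(mktemp -d)   # stand-in for the backup medium

echo "raw data" > "$SRC/img_0001.raw"

# First snapshot: a full copy.
rsync -a "$SRC/" "$DEST/2024-01-01/"

# Second snapshot: files unchanged since the first snapshot are
# hard-linked to it instead of being copied again.
rsync -a --link-dest="$DEST/2024-01-01" "$SRC/" "$DEST/2024-01-02/"

# Same inode number in both snapshots => hard link, not a new copy.
ls -i "$DEST/2024-01-01/img_0001.raw" "$DEST/2024-01-02/img_0001.raw"
```

Restoring a single lost file is then just a plain copy out of the right snapshot directory.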
Third, think about a backup medium. A small NAS is quite affordable and can be mounted via fstab at boot time, so it is always available under the same path.
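For illustration, an fstab entry for an NFS share on a NAS might look like this (the address, share name and mount point are placeholders, and the option set will depend on your NAS):

```
# /etc/fstab — example NFS mount for a NAS backup share
# _netdev tells the system to wait for the network before mounting
192.168.1.50:/volume1/backup  /mnt/nas  nfs  defaults,_netdev  0  0
```

With that in place, your backup scripts can always target the same path (/mnt/nas in this example).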
Last is doing the backup itself: here the technology is secondary; what matters most is having a simple way to recover single lost files.
Specifically: if you back up first but modify your source later on (say, by deleting low-quality images), there has to be a safe way to sync this change to your backup directory. GitHub - bashforever/safeback: Bash script to backup directories and save differences in target structure in SAVE directories is one approach to doing this. (It is still beta; an explanation will follow later on.)

So just wait a little bit and I will present my thoughts in more detail.

Cheers

Immanuel.

I always use rsync for backing up data.

Rsync with the switch "--delete" will delete files at the remote end that no longer exist at the source. Without specifying "--delete", files at the remote end will not be deleted. That is the crux of your problem.

For example, as long as the drives are formatted with an ext filesystem and not FAT/NTFS:
$ rsync -ax --delete /source_folder/ /remote_folder/

(FAT/NTFS-formatted drives will require a mishmash of extra flags for omitting permissions, etc.)

Best backup practice is likely to use an external FireWire/USB 2/USB 3 hard drive. Even better, put a NAS in your garage in case your house catches fire, though then you're likely stuck with a slower gigabit wired network.

The current cost of commercial remote storage is probably prohibitive for the volume of data that RAW images accumulate. But that's just my opinion.