Do you use NAS?

How do you archive your digital work (RAW, JPG)?

Do you use NAS? How many copies of your work do you keep? Do you use dedicated NAS HDDs (with increased MTBF)? How often do you replace an HDD with a new one?

For now I have only a single ‘backup’ 2.5" HDD and keep my photos on both SD cards and the HDD, but it’s time to get a proper solution.

1 Like

I keep my data on a self-made “NAS”. It’s just a mini PC with an eSATA HDD. Every night I make an automatic snapshot using borgbackup to an external server on the internet (two HDDs, RAID 1). Additionally, every second month or so, I synchronize the data partition to another PC in my home LAN via rsync. I also keep an external USB drive at my office at work for a “worst case scenario”, which I haven’t synchronized for some time (actually I should do so soon).
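A nightly borgbackup snapshot like the one described can be sketched roughly as below. The repository location, source path, and retention numbers are placeholders for illustration, not the poster's actual setup:

```shell
#!/bin/sh
# Sketch of an automatic nightly borgbackup snapshot to a remote server.
# REPO and /data/photos are hypothetical placeholders.
REPO="backupuser@backup.example.com:photos.borg"

# One archive per night, named after the UTC date.
archive_name() {
    printf 'photos-%s' "$(date -u +%Y-%m-%d)"
}

nightly_backup() {
    borg create --stats --compression lz4 \
        "$REPO::$(archive_name)" /data/photos
    # Thin out old snapshots so the repository does not grow forever.
    borg prune --keep-daily 14 --keep-weekly 8 --keep-monthly 12 "$REPO"
}
```

Called from cron (e.g. `0 3 * * * /usr/local/bin/nightly-backup.sh`), this gives the automatic nightly snapshot described above.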

I would hate to lose all the photos I have taken in the last decades.

1 Like

I’ve said it elsewhere, but it’s worth repeating I think. At a minimum it would help to keep in mind the 3-2-1 rule for backups:

  • 3 copies of your data
  • 2 different types of media
  • 1 copy in a different physical location

I personally have a NAS for storage that backs itself up to another drive in the same place (my house), then a copy gets sent off-site to another location (a friend’s house? work? Amazon Glacier? Carbonite/Mozy?).
I should be burning Blu-rays or tape, but I’m lazy.

In short, whatever you budget for hard drives for backups, multiply it by three, and add a little gift for a friend to use their internet. :smiley: (If you get two NAS boxes, and your friend allows it, you can possibly get rsync installed on the device itself, which lets you sync just the changes remotely.)
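If both ends do end up with rsync on the device, the periodic off-site sync could be as small as one crontab entry. This is only a sketch; the hostname and both paths are invented for the example:

```shell
# Hypothetical crontab entry: push only the changes to a friend's NAS
# over SSH, every Sunday at 02:30. Host and paths are placeholders.
30 2 * * 0  rsync -az --delete -e ssh /volume1/photos/ friend-nas.example.com:/volume1/backup-photos/
```

The `-z` flag compresses data in transit, which matters on a home upstream link; `--delete` keeps the remote copy an exact mirror rather than an ever-growing union.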

3 Likes

We (my girlfriend and I) have a Nextcloud instance running on a server (Mini-ITX PC with an Atom CPU) on the local network of our home.

The server has a hardware RAID controller configured in RAID 1 with one hot spare.

Thanks to Nextcloud, we have 3 copies of our files: one on the server, one on my computer and one on my girlfriend’s. None off-site.

No, I’m using regular HDDs.

I don’t do preventive maintenance. Maybe I should :thinking:

One side advantage of this solution is that we can share digiKam albums.

Hope this helps your thinking :wink:

Yes, dedicated drives all around. Each was bought at very separate times, from different brands; I know from work experience that the worst thing you can do for redundant backup is to use same make-and-model drives bought and put into service at the same time. They burn out like your car headlights: when one goes, the other isn’t far behind, and is quite likely to fail well within your mean time to repair.

The dedicated hard drive thing offers another advantage: when you go to do something like upgrade your OS, you can physically remove the drive and keep the installers from having fun with it. I just did that very thing last night to the #2 box: removed the Pictures HDD, and proceeded to upgrade from Xubuntu something like 14.04 to Ubuntu 18.04… after that was done, I just re-installed the drive, made the mount point directory for it, and put the entry back in /etc/fstab. Okay, not Windows-easy, but it worked for me…
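For anyone who hasn't done the "put the entry back in /etc/fstab" step before, it's only a couple of commands. This is a generic sketch; the UUID and mount point below are illustrative, not the poster's actual values:

```shell
# Re-attach a data drive after an OS reinstall (run as root).
# Find the drive's UUID first, then mount it persistently.
blkid                       # lists partitions with their UUIDs
mkdir -p /mnt/pictures      # recreate the mount point
# Append an fstab entry (example UUID; substitute your own):
echo 'UUID=0a1b2c3d-1111-2222-3333-444455556666 /mnt/pictures ext4 defaults 0 2' >> /etc/fstab
mount /mnt/pictures         # mounts using the new fstab entry
```

This is a system-configuration fragment rather than a runnable script; run it step by step and double-check the UUID before editing /etc/fstab.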

My NAS has a three-disk ZFS mirror of 3 TB WD Red drives. The server runs NixOS with an installation of gitolite that supports git-annex. I sync my git-annex repo of raw files to my NAS. The same git-annex repo is also copied to 6 different external disks, three of which are off-site and rotated periodically.
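For readers unfamiliar with the workflow, syncing a git-annex repo to a NAS plus a rotating set of external disks boils down to a couple of commands. A rough sketch with made-up remote names ("nas" and "offsite1" are examples, not the poster's configuration):

```shell
# Hypothetical git-annex sync routine. "nas" is the gitolite remote,
# "offsite1" is the currently attached external USB disk.
push_annex_copies() {
    git annex sync --content nas      # push metadata and file contents to the NAS
    git annex copy --to offsite1 .    # fill the attached off-site disk
}
```

git-annex tracks which remotes hold a copy of each file, so `copy --to` only transfers what the disk is missing.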

1 Like

I read through your git-annex travails a couple of months ago, did some cursory research, and really didn’t grok the application. Could be my dementia…

I need to formalize my workflow into an article with some diagrams and whatnot… More or less, just think: “what if git, in all its distributed glory, could handle binary files as well?” And then you get git-annex!

I suppose that’s better than nothing. But no off-site? That’s dangerous. You can look for cases of photographers who’ve lost their life’s photographic work because of fire or flood or theft, simply because they had no off-site backup. Unfortunately they’re not hard to find, whether serious amateur or professional.

4 Likes

Haha, I have to disagree with you there! I’ve been using git-annex for a very long time but have scaled my usage back down to just a sort of data dump. I had it on my phone and was planning to manage my photos with it. But I kept pointing it at my foot, thinking “this is good”, then pulling the trigger, thinking it would somehow solve a problem of mine. Boom! It’s one hella complex piece of software.

So I’m no longer using it for anything important. Come to think of it, git is also famous for its foot-shooting capability!

It isn’t great at everything… What problem were you trying to solve?

True, git is not that straightforward, but I feel like if you understand git, then git-annex hardly adds any complexity to it… if you understand git :smiley:

I also use a NAS server with dedicated HDDs; it regularly does backups to an external drive and also sends all the photos to a cloud.

It has worked well enough until now :slight_smile:

The NAS server does ‘multiple duty’, also acting as a media server and a home automation server :wink:

I just realized that you likely meant server-quality hard drives (WD Reds, for instance). I have another server, my media server, for which I’m constantly on the lookout for HDD space, and if you get far enough down the rabbit hole you’ll find that things like the WD EasyStore (8 TB) actually use WD Red drives internally, and are often sold at a price far below a bare Red drive (reddit has /r/datahoarder as a great resource for the specifics).

2 Likes

I know, but I already struggled to set up the server, so I have the feeling that anything on the Internet will be out of my reach.

“Offsite” can be as simple as another copy on an external hard drive that you give to a family member (well, one you like, anyway…), or bring to the bank and put in a safe deposit box, or some other “sneakernet” conveyance. I just did this last month: gave it to my adult son (whom I do like) to hold onto…

4 Likes

I’m not a pro, just an amateur, so my solution differs from all of you (no NAS), but may help another amateur.

I have a few cron jobs running rsync at night, duplicating my photos daily, weekly and monthly to different folders and a different HDD. So at any time I have 7 copies on two HDDs at my house.
Another cron job runs rclone to Google Drive in case of a fire. :slight_smile:
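A crontab implementing that kind of rotation could look roughly like this; the paths and the "gdrive" rclone remote name are invented for the example, not the poster's actual setup:

```shell
# Illustrative crontab: rotating local rsync copies plus a cloud copy.
# Paths and the "gdrive" remote are placeholders.
0 1 * * *   rsync -a --delete /home/me/photos/ /mnt/backup/daily/
0 2 * * 0   rsync -a --delete /home/me/photos/ /mnt/backup/weekly/
0 3 1 * *   rsync -a --delete /home/me/photos/ /mnt/backup/monthly/
0 4 * * *   rclone sync /home/me/photos gdrive:photos
```

The three rsync targets age at different rates, so an accidental deletion that slips past the daily copy can still be recovered from the weekly or monthly one.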

1 Like

Reading your answer gave me the idea to keep an HDD at work. That way I will have an off-site backup.

Thank you for the hint :smile:

3 Likes

I have a WD MyCloudMirror and it is definitely nothing I would recommend to anybody else. It runs old software with known security holes…

fwiw

I’m a Linux guy. Retired software engineer. I don’t like database-oriented systems like darktable. I won’t go into why. I just don’t.

I keep a hand-made file system hierarchy with descriptive names:

Birds
--Costa Rica
----Jan_26_2019
------raw1
--------jpegs
------raw2
--------jpegs
etc.

In any given directory I can create keywords to search on with a symlink or an empty file:
touch Birds/Costa-Rica/Jan_26_2019/raw1/Toucan

…then from a terminal window command line:
sudo updatedb
locate -i toucan
(there might be half a dozen directories where I have toucan images; now they’re easy to find)

The jpegs: if I make a jpeg from PIC_1234.NEF (Nikon raw) I would call it something like PIC_1234_Bald-eagle.jpg. That way I can forevermore match PIC_1234_Bald-eagle.jpg to the PIC_1234.NEF it was made from, usually (but not always) found one directory above the jpeg.
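That naming convention also makes it trivial to recover the raw filename in a script. A small sketch; `raw_for` is a hypothetical helper name, not an existing tool:

```shell
# Given a jpeg named PIC_nnnn_<keyword>.jpg (the convention above),
# recover the name of the raw file it was made from.
raw_for() {
    jpg="${1##*/}"                  # strip any directory part
    printf '%s.NEF\n' "${jpg%_*}"   # drop "_<keyword>.jpg", add .NEF
}
```

For example, `raw_for Birds/Costa-Rica/PIC_1234_Bald-eagle.jpg` prints `PIC_1234.NEF`.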

I have two 8-terabyte drives. One mirrors the other. OS on a flash drive.
Once a month or so I copy the backup directory to a USB drive. It takes all night, so I do it as I go to sleep. All three of those disks would be lost if the house burned down. I should fix that. But I likely won’t. I’m old, retired and lazy now.

If my 8-terabyte drives fill up (62% full now) I will go to a NAS system somehow. Building one from parts would be a fun project.

Finally (I promise): I taught a 3-hour photography class a few years ago, as part of a state-wide Audubon Festival. The number one problem the class reported was how to archive images. They came to learn about f-stops and image processing too. But they all seemed to be pulling their hair out trying to find lost images, or trying to learn the menus of a complex commercial image-archiving system. KISS is the software designer’s best friend: Keep It Simple, Stupid.

3 Likes

I’ve got a Synology DS918+ with four pretty garden-variety 1 TB Western Digital drives. Nothing fancy.

All my images are stored on the NAS in a volume that I access from a number of machines on my home network both wired and wireless through a Ubiquiti Amplifi Wi-Fi mesh system.

The NAS is backed up nightly to a standalone USB drive via “Hyper Backup”.

Also, the images are backed up to my Amazon Prime account (unlimited photo storage) using “Cloud Sync”, again every night.

This all works seamlessly and invisibly without my intervention.

All the working folders on my various computers and a “work folder” on the NAS are synchronized in real time using “Resilio Sync”.

I like the Synology box. The support and application availability have been superb. Certainly much better than the ReadyNAS that the Synology replaced.

1 Like