My digital photography is eating my hard drive space. Some of the 16-bit working GIMP files are more than 50MB each. What are you folks using to archive images? Is the cloud as good as having an external drive?
Personally, I do not trust clouds for important things – my Internet connection may be down, etc. Instead, I look at it this way: how much does a 1-terabyte external HDD cost these days? Divide 1 terabyte by 50 MB (the size of your files) and you will see how many images it can house.
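To put numbers on it (a rough sketch in shell, assuming decimal units, 1 TB ≈ 1,000,000 MB, and ~50 MB per file):

```
# Rough capacity estimate: how many ~50 MB files fit on a 1 TB drive?
echo $(( 1000000 / 50 ))   # prints 20000, i.e. roughly 20,000 images per terabyte
```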
Have fun!
Claes in Lund, Sweden
PS: some ill-willed people have started to call it the fog instead of the cloud.
The cloud might be cheap and enticing in the short run, but you can get a 4TB external drive for under US$100. That's going to be hard to beat in the long run.
I agree with your reasoning, plus I found a good page dedicated to the topic …
My source (NewEgg) isn’t quite that good. Where have you seen that size/price combo?
Photos I decide to share online get stored on Flickr, and the others get put onto Google Images. In addition, I have an HDD dock and some drives that I copy new files onto. Those get stored in a fire-proof safe.
I turned my old Hackintosh (a Core 2 Quad machine) into a NAS file server using the NAS4Free software. The file system is a RAIDZ2 pool of six 4TB hard drives, which tolerates two simultaneous drive failures and gives me plenty of room. When I run out of space, I can simply replace the drives with larger ones while the system is running! If you have an old PC kicking around, you might try this approach.
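For anyone curious what that looks like under the hood, ZFS builds such a pool with a single command (a sketch only; the pool name and the da0..da5 device names are placeholders, and NAS4Free normally drives this through its web GUI):

```
# Create a double-parity (RAIDZ2) pool named "tank" from six disks;
# any two of the six drives can fail without losing data.
zpool create tank raidz2 da0 da1 da2 da3 da4 da5

# Growing the pool later: swap each drive for a larger one and let it resilver.
zpool replace tank da0 da6        # da6 = hypothetical new, larger disk
zpool set autoexpand=on tank      # pool expands once every drive is replaced
```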
I back up to the NAS as soon as I am finished importing photos from the memory cards.
On remote shoots, I have an external backup drive attached to my MacBook Pro as a Time Machine backup. It will automatically back up photos as they are imported and processed. When I get home, I copy everything to the desktop and NAS.
I don’t have an off-site solution yet, because I don’t want to pay for cloud storage, and there isn’t a good inexpensive archival medium for multi-TB datasets that I know of.
I see at least three 4TB external drives at Newegg for US$100 or less.
Costco often has some great prices!
Every now and then I store the newest portion of my archive on 25GB M-Disc BDs. Complete sets of holiday shots are burnt on standard 100GB BDXL media before culling.
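On Linux the actual burn is a one-liner with growisofs from dvd+rw-tools (a sketch; the /dev/sr0 burner device and the ~/archive/2023 directory are placeholders):

```
# Burn a directory to a blank BD-R / M-Disc as an ISO9660 data disc
# with Rock Ridge and Joliet extensions, labelled "archive-2023".
growisofs -Z /dev/sr0 -R -J -V "archive-2023" ~/archive/2023
```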
I bought a hard drive dock: I can plug in a bare 3TB drive and, on a whim, pull it out and plug in a different drive. I don't trust online backups for anything critical, due to sync, retrieval and other issues. If you want something done right, do it yourself.
I completely agree with @paperdigits and @Claes. But I would not rely on just one external drive!
The unprocessed scans from my slides are 260 MB per image as TIFs (the processed JPGs are negligible by comparison)! In addition, I have the images from my digital camera, in use since 2004. For everyday work I keep all images on an internal 2TB disk (almost full now), and backups are made at irregular intervals to three external 2TB disks.

Much work goes into documenting the images in their metadata, which I do on small JPG preview images for speed and security reasons. The JPGs with the metadata are copied onto a memory stick every time I have spent a while working on them. When processing of the scans is done, the metadata are copied to the processed JPGs and unprocessed TIFs for archival purposes with an ExifTool script and saved to the three external disks.
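For illustration, the core of such an ExifTool step is a single command that copies every tag from the documented preview JPG into the archival file (a minimal sketch with made-up file names, not the actual script mentioned above):

```
# Copy all metadata from the small preview JPG into the corresponding TIF.
# ExifTool keeps a *_original backup of the TIF unless told otherwise.
exiftool -TagsFromFile preview_0001.jpg -all:all scan_0001.tif
```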
Hermann-Josef
Indeed, not just one! I actually have eight redundant copies of my photos.
I have had it in mind for a long time to write up a storage and backup article on the site. It’s a topic that many folks are often interested in, I think, and it can hopefully provide some value.
I would personally recommend thinking of the 3-2-1 rule for data backup:
- 3 copies of your data
- 2 different types of media
- 1 copy in a different physical location
Sometimes I relax the "2 different types of media" rule (personally, I keep everything on spinning drives).
@paperdigits makes a valid point: current HDD prices are quite cheap. At roughly $100 for 4TB, that works out to about $0.03/GB. Because I'm paranoid, I always try to have a pair of HDDs to rsync between, so buy two if you can…
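In case it helps anyone, the mirroring itself is one rsync call (a sketch; the mount points are made up, and --delete removes anything from the mirror that is gone from the primary, so preview with a dry run first):

```
# Mirror the primary photo drive onto its twin, preserving hard links,
# permissions and timestamps; add -n (--dry-run) to preview the changes.
rsync -aHv --delete /mnt/photos-primary/ /mnt/photos-mirror/
```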
I really should sit down and write an article for folks.
Just a remark about hard disks for archiving.
When I purchased my disks, I asked the guy in the shop about his experience with expected lifetimes. He pointed out that if a disk has a defect, it will usually show up early in its life. If no problems appear after some time of use, one can be fairly confident the disk will last a long time. This means one should not just copy the images to the disk and store it away: use the disk regularly for a while first, and only then treat it as a trustworthy archive and leave it untouched.
Hermann-Josef
I use the `badblocks` command on Linux to read/write-test my drives. The command usually takes 6-12 hours, so when it's done, I'm reasonably sure the drive isn't going to fail right out of the gate.
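Concretely, the write-mode test looks like this (a sketch; /dev/sdX is a placeholder, and -w is destructive, so only run it on a brand-new or empty drive):

```
# Destructive four-pass read/write surface test of a new drive (wipes all data!).
# -w write-mode test, -s show progress, -v verbose.
badblocks -wsv /dev/sdX
```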
Also have multiples, as @patdavid said!
There's a reasonably well-known curve that describes this for hard drive failures: the bathtub curve.
I wonder if the bathtub curve idea was conceived while in the bathtub reflecting over a week’s failures. #bathtubthoughts
It may be worth examining what threats the backup should guard against, and how effectively the backup (and recovery) protects against the threats.
Also weigh up the costs of backup (including time, if any) and the possible cost and likelihood of each threat.
For me, the threats include (starting with the most significant):
- Finger trouble: oops, I shouldn’t have deleted slash-star.
- More finger trouble: I once dropped a laptop five feet to a marble floor.
- Bad electricity: a thunderstorm once took out my computer 30 minutes before a deadline.
- Computer crashing while files are still open. Save those files frequently!
- Hard disk failure. Used to be common, but last time for me was 8 years ago when an unprotected external drive bounced around the back of a Land Rover.
- Thieves stealing my stuff. Touch wood: not yet happened.
- My premises burning down. Ditto.
- A virus encrypts my files, holding them to ransom. Ditto.
Some backup methods will guard against some threats but not others. For example, a hot-standby (auto-backup as I work) won’t help me if I’ve accidentally edited a master image file instead of a copy.
Think about single points of failure: would the failure of any single piece of hardware or software cause data loss? For example, a power glitch or office fire during a backup operation could destroy both the original data and the backup copy.
Always test the recovery method. In my professional life I encountered companies that never did this. They spent much effort and money following a backup regime that didn't guard against the risks they actually faced. Sometimes recovery was impossible, so the backups were entirely useless.
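A cheap way to fire-test a restore is to pull a sample of files back from the backup into a scratch directory and compare it against the originals (a sketch; all the paths here are made up):

```
# Restore a sample from the backup, then verify it matches the originals
# byte for byte before trusting the backup regime.
rsync -a /mnt/backup/photos/2023/ /tmp/restore-test/
diff -r /mnt/photos/2023/ /tmp/restore-test/ && echo "restore verified"
```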
Also consider archiving. This has different objectives, but some commonalities with backups.
On a personal level: I'm no longer a professional photographer, so no one will sue me for losing their wedding photos. My internet connection is lousy, so cloud backups are unusably slow. I do internal backups (from a disk drive to the same drive) and backups to multiple external drives. My last unrecoverable data loss was more than 20 years ago.
So much this. A backup system that hasn't been fire-tested with a restore is not a backup.