The paranoid person's guide to a complete Mac backup (macworld.com)
36 points by bemmu on Oct 8, 2015 | 38 comments



I think most devs here learn to keep their laptops relatively stateless. All important files get offloaded to Dropbox / GitHub / S3, whichever service makes the most sense. Frankly, whatever I lose if my laptop HDD dies must not have been that important if it never made it to one of those services.

Implementing and maintaining whole-disk backup procedures and media? Shoot me now.


I think you're underestimating how long it takes to set up workflows, tools, device drivers, system configurations, path variables, and so forth from scratch. If you rely solely on project repositories, I reckon you could spend multiple man-days getting your machine back to its pre-data-loss state.

I just had a laptop stolen. Even with Time Machine backups, you still spend many hours setting configs after a restore.


> Even with Time Machine backups, you still spend many hours setting configs after a restore.

But how many hours do you spend establishing, testing, and maintaining a backup plan that mitigates the risk of spending a few hours getting back to working status after a restore?

In my experience, Time Machine is wonderfully effective at restoring system state (including things like shell customizations, dotfiles, etc.). The marginal return on investing in a more comprehensive backup setup just doesn't pay dividends. Of course, there's the issue of offsite backups, which typically aren't as comprehensive as Time Machine.

The point I'm making is that technical folks often have a hard time making value judgements about technology. Having a backup of your data is 100% necessary. No argument there. Having a backup of your system state? That's not as high a priority, because you're mitigating a future time investment with a current time investment. You have to weigh the two against each other: time spent today versus time potentially spent in the future.


When I had a Mac I used Carbon Copy Cloner to image the disks, which solved that entire problem. It creates bootable backups which you can then restore to exactly the original state. Time Machine was a veritable pain in my arse on numerous occasions, including one that nearly shafted everything I had. I genuinely don't trust it or HFS+ and never will again.

TBH I've moved to Windows as my primary platform now and I'm just using Beyond Compare to sync my data by hand to BitLocker-encrypted USB sticks (3x 128GB Corsair Survivor): I keep one on my keyring, one in the car and one at home. It takes about 2 hours to build a Windows machine from scratch now, including OS, Office, dev tools and updates, onto an SSD. I imagine that would work reasonably well on a Mac too if you skipped half the junk that appears in the user's profile.


I've found things easier than that with Time Machine, but in any case, I agree with your basic point. (Time Machine has the added bonus of turning the setup of a new system into something that takes a couple of hours rather than a couple of days, at least unless you want to take the opportunity to start clean.)

Should you have additional on-site and off-site/cloud backups? Sure. I also use Backblaze as well as a periodic on-site backup of my data files. (Plus whatever incidental backup is provided by Flickr and iCloud.) But, from my perspective, it doesn't hurt to start with a whole-disk backup, especially given how easy Time Machine makes it.


It's something you pick up over time with experience. I've switched computers so many times over the last 20 years or so that I don't sweat a sudden loss. It happened just last year: I dumped coffee all over my MacBook Pro, had a brand new MacBook Air the next day, and didn't bother moving anything over except for one file I was working on that hadn't yet synced up.

I was back up and running within a couple of hours.

There's this process that runs in my head whenever I work on my laptop (as opposed to working on a coding project): I always keep in mind what the dependencies are and what it would take to replicate the setup if I lost it.


I've started using tools that have sane default workflows. Learn to live with the defaults, I guess :) Then the hurdle is just getting programs installed, which I'd prefer to do from a clean slate instead of from a backup, to get rid of cruft.


My work computer is also my home computer. So not losing photos and videos is pretty critical. Everything else is replaceable.

I just bought a cheap Seagate Backup Plus 4TB last year. I think it was $130. Then I just let Time Machine do its thing. So that's 2 copies of the important stuff.

Then $50 annually for Backblaze. I've got no real use for Dropbox and it costs twice as much for less storage. I could totally see using it in a media company. That's just not a problem I have. I pretty much never need to share a personal file that's too big for iMessage (photos/videos) or Email (spreadsheets). Anyways, that's a third copy.

Then I've got iCloud to share photos/music/videos on our Family Sharing. That's a fourth copy.

So everything is pretty much always in sync, whether it's my wife's iPad, my laptop, either of our phones, etc.

Into the second year it's cheaper than having Dropbox alone. But the best part: it takes, I dunno, 10 minutes of one-time setup effort. You don't have to know anything but how to click "Next" or "OK" in a wizard, and it's zero maintenance. Nothing to babysit.

And it's a little slow, but if I really need an individual file off Backblaze, I can get it. Just takes a couple minutes to use their web based backup browser and selectively download something. I've had to do that a total of one time though.

Beyond that, setting up a new machine for development is just pulling my bash-it theme, installing Homebrew, installing IntelliJ, running brew install sbt git, and I'm pretty much good to go. It was really surprising how quick setting up my MacBook for Scala development was compared to setting up previous machines for Ruby development.
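
For anyone curious, the whole bootstrap is roughly this (from memory, so treat it as a sketch; the exact cask name may differ):

    # install Homebrew (the official one-liner at the time)
    /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
    # shell theme
    git clone https://github.com/Bash-it/bash-it.git ~/.bash_it && ~/.bash_it/install.sh
    # dev tools
    brew install sbt git
    brew cask install intellij-idea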


> Backblaze

The basic problem with Backblaze is that they delete remote files 30 days after they're deleted locally... which means that if something important is intentionally or accidentally wiped and you don't notice for a month, it's gone forever.

That's why I stick with Crashplan, despite the general annoyances of their non-native desktop apps.


Oh? I didn't know this! Maybe Crashplan should spend a bit more money to advertise on podcasts.


Problem with Crashplan is the limited pipe they have. If you have a few TB of data like I do, it will literally take you months to back it up or restore.


For those two types of files I'd use S3, perhaps replicated somewhere for redundancy since they're that important.

But it looks like you've got a working solution already.

I develop in Ruby, and for some reason I have to set up a lot of Ruby environments, way more than I have to set up work machines. rbenv, Sublime, iTerm2, Homebrew, Postgres.app, Bundler, done; ready to bundle and then rake db:setup. I have a little checklist that I just run down, but honestly I don't really need it.
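
The command-line half of that checklist is more or less this (the Ruby version is just an example; Sublime, iTerm2 and Postgres.app are plain downloads):

    brew install rbenv ruby-build   # ruby-build powers `rbenv install`
    rbenv install 2.2.3 && rbenv global 2.2.3
    gem install bundler
    # then, per project:
    bundle install
    rake db:setup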


I erased my disk before installing El Capitan. I wanted to force myself into good habits. I'm currently learning how to manage all my dotfiles, vim plugins etc. with git. I'm not quite there yet, however.

I read the article before erasing my disk, but found it way overkill for my needs. It depends on the user.


Any suggested reading to become a better person and manage dotfiles sanely?


What I found most useful for my use case were the git submodules documentation[1] and this blog post[2]. Read them and you'll have a basic setup in no time. Some people want to automate everything and have modular configurations depending on the environment. That's a bit more involved, but not much, and there are plenty of tools to help you with that.

I have just pushed my dotvim folder to GitHub if you want to take a look[3].
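
The gist of the submodule approach, assuming a pathogen-style bundle/ layout (the plugin here is just an example):

    cd ~/.vim
    git init
    git submodule add https://github.com/tpope/vim-fugitive.git bundle/vim-fugitive
    git commit -m "Add vim-fugitive as a submodule"
    # on a new machine, one clone brings everything back:
    git clone --recursive <your-dotvim-repo> ~/.vim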

[1]: https://git-scm.com/book/en/v2/Git-Tools-Submodules

[2]: http://blog.smalleycreative.com/tutorials/using-git-and-gith...

[3]: https://github.com/belmarca/dotvim


I like to make a repo with subfolders for each thing (emacs configs, bash configs, etc.) and then symlink them in. The bonus would be a script that automates the symlinks (e.g. ~/.bashrc -> dev/gk-config/bash/.bashrc, ~/bin -> dev/gk-config/bin), which I have yet to write; a sketch is below.
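
Untested, and using the made-up paths from the example above:

    #!/bin/bash
    REPO="$HOME/dev/gk-config"
    # -sfn: replace any existing link instead of descending into it
    ln -sfn "$REPO/bash/.bashrc" "$HOME/.bashrc"
    ln -sfn "$REPO/bin" "$HOME/bin"
    ln -sfn "$REPO/emacs/.emacs.d" "$HOME/.emacs.d"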

Then, commit changes regularly. I have no idea how to handle multiple *different* machines well, though.


GNU Stow does the symlinking for you.

https://www.gnu.org/software/stow/
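
Typical usage, assuming the repo is checked out at ~/dotfiles with one subdirectory per program:

    cd ~/dotfiles
    stow bash vim   # symlinks ~/dotfiles/bash/.bashrc to ~/.bashrc, etc.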

What I do with different machines (just OSX / Ubuntu atm) is use a bash function to check which type of system I'm on and conditionally execute .profile / .bashrc code on that basis. I also look for a .bashrc-local file and source it if present. That lets me add truly machine-local code.
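
Roughly this (the per-OS file names are invented for illustration):

    case "$(uname -s)" in
      Darwin) source "$HOME/.bashrc-osx" ;;     # OSX-specific bits
      Linux)  source "$HOME/.bashrc-ubuntu" ;;  # Ubuntu-specific bits
    esac
    # truly machine-local code, never committed to the repo
    [ -f "$HOME/.bashrc-local" ] && source "$HOME/.bashrc-local"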


There are lots of articles and samples out there. One simpler article that I read recently was http://www.anishathalye.com/2014/08/03/managing-your-dotfile...


I suggest that you pack them into a git repo and use GNU Stow to deploy them.


Started doing this myself, it's a fantastic solution to a common problem.


TIL: backups to the cloud are still a solution in the Snowden era.


You can and should encrypt them.


Yeah, that helps if your machine (or something in between) is not compromised and the other side is not too interested. If that's not the case, your offline data is still offline and unreachable, while data in the cloud is easily obtainable by the attacker, who can then do whatever he wants with it.


AFAIK at least Crashplan uses strong asymmetric encryption, so you can store your decryption key offline. If a party is really interested in your data they might as well obtain it with a wrench. Don't get me wrong, I do think you're right about this. It's a tradeoff between convenience and security. Or, viewed from a different angle: is it worse that your data might be read by a third party, or that you lose your data because you forgot your manual backup? And which scenario is more likely?


I don't dare to speculate about the likelihood of security breaches anymore. But I have a pretty good backup routine.


Many of the listed "paranoid" backups suffer from a single point of failure: your computer. If your computer gets compromised, it can silently erase or cryptolock files on your computer, or even on your backup drives during the rare moments when they are connected. IMO in practice this would only be feasible as a targeted attack after a thorough investigation of your backup plan (assuming you don't make it public on your blog).

I don't know Crashplan, but AFAIK most backup services suffer from the same problem. If you regularly enter or store account credentials for your backup service, they can be stolen from a compromised computer and used to delete your past backups. This can be mitigated if your backup service provides two accounts for the same backup: an "append only" account and an "administrator" account. The idea is that a compromise of the "append only" account can't delete your past backups.
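
If you roll your own backups to something like S3, you can approximate the "append only" account with an IAM user whose policy allows uploads and nothing else (bucket name made up; you'd also want bucket versioning on, so overwriting an existing key can't clobber history):

    {
      "Version": "2012-10-17",
      "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:PutObject"],
        "Resource": "arn:aws:s3:::my-backup-bucket/*"
      }]
    }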


A sophisticated ransomware app could do this: stash away a snapshot of the user's cloud backup (e.g. CrashPlan) and then change the local backup configuration to back up no folders at all, which in the case of CrashPlan also removes the historic archives of the user's data, as history is only kept for folders that are currently backed up.


> It can be mitigated if your backup service provides two accounts for the same backup: an "append only" account and an "administrator" account.

That's how Tarnsap[1] works (I think the author is the user cperciva)

[1] https://www.tarsnap.com/


Tarnsap

Tarsnap. But yes.


Most of my important things (and all the things that need a history) live in repositories on GitHub and/or Bitbucket. All the system configuration is done via Boxen. I have 2 rotating clone drives (weekly) for quick disaster recovery. For all the long-lived things that don't need explicit versioning I use Arq to store on S3. Arq is terrific, BTW.


A few days ago a livecaster accidentally burned down a large part of his house (https://youtu.be/82UsZ44AxIA?t=290). Besides making me buy a fire extinguisher, that got me thinking I should have remote backups. So far I've only been using Time Machine to back up to an external disk in another room.

Ideally I'd like a periodic whole-disk image of my Mac that could be downloaded and mounted on a new computer as-is. I'm currently investigating whether I could use Carbon Copy Cloner to do a weekly bootable disk clone and then have that uploaded to Dropbox.

CCC creates a huge directory called "Macintosh HD.sparsebundle". Not sure if that's really mountable or if syncing it to Dropbox will be practical.
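
I guess the quick test would be the following (a sparse bundle is really a directory of small "band" files, which should also mean a sync tool only re-uploads the bands that changed):

    hdiutil attach "Macintosh HD.sparsebundle"
    # poke around in /Volumes, then:
    hdiutil detach "/Volumes/Macintosh HD"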


http://pondini.org/TM/Works3.html - seems like it. I'm not sure it'd survive Dropbox - maybe a Mac user can chime in here?

Edit: It seems perfect: http://blog.fosketts.net/2011/07/05/mac-dropbox-encrypted-vo...

The closest for Linux would be ecryptfs, but I'm not sure if it splits images up.


The part of that whole affair that's simultaneously hilarious and incredibly sad is that the guy literally couldn't have tried harder to make that fire grow. He gave it kindling doused in lighter fuel, then he gave it more kindling and wood to burn, then he literally fanned the flames.

Not to say that having a fire extinguisher is a bad idea, but in that guy's case doing almost anything other than what he did could have saved his house.


I love apps like Carbon Copy Cloner and SuperDuper, since they let me create a bootable backup of my system drive without booting into a separate backup program. Are there similar solutions for Windows/Linux? (I only know of Clonezilla, which you have to boot into.)


I would really like a good archiving service. Backup is one thing, but I can't keep an ever-growing stash of data; I want to offload it in a redundant way. Is there such a thing? Backup is not archiving!


I'm currently traveling and back up to an external hard drive, with photos going to Google Drive. What's the best way to keep encrypted backups of important files in the cloud?

I was thinking about using duplicity and S3.
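
If I understand the duplicity docs right, the basic incantation would be something like this (bucket and key names made up):

    export AWS_ACCESS_KEY_ID=...        # your S3 credentials
    export AWS_SECRET_ACCESS_KEY=...
    # back up, encrypting to your GPG key
    duplicity --encrypt-key MY_GPG_KEY_ID ~/important s3+http://my-backup-bucket/important
    # restore (destination last)
    duplicity s3+http://my-backup-bucket/important ~/restored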


If you are paranoid don't forget to check for bitrot!

https://www.npmjs.com/package/chkbit
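
If I remember its interface right (double-check the readme), usage is something like:

    npm install -g chkbit
    chkbit -u ~/backups   # -u creates/updates the hash index; rerun later to verify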


I used to back up stuff. Now I don't.

Eh. I'm okay with losing stuff.





