

Ask HN: What backup strategy do you use for your websites and databases? - pankratiev


======
agj
For backups, I use rdiff-backup across 60G+ of user data, nightly. rdiff-
backup uses librsync to transfer files, but also handles incremental backups
and seems to be fairly efficient at storing increments. Nightly, at most about
10% of the user data changes, and backups complete in less than two hours.
Load is low enough that I could run it several times a day if I needed to.
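A nightly setup along these lines might look like the following cron fragment (paths, hostname, and the 8-week retention are hypothetical, not from the comment):

```shell
# Hypothetical nightly cron entry:
# 0 3 * * *  /usr/local/bin/backup-userdata.sh

# backup-userdata.sh: push an incremental backup of the user data
# to the backup host over SSH, then prune increments older than 8 weeks.
rdiff-backup /srv/userdata backuphost::/backups/userdata
rdiff-backup --remove-older-than 8W --force backuphost::/backups/userdata
```

Because rdiff-backup keeps the newest snapshot as a plain mirror plus reverse deltas, restoring the latest state is just a file copy.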

Databases should be properly dumped to a file before the backup runs.
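For MySQL, that usually means something like this shortly before the file-level backup (database name and credentials file are hypothetical):

```shell
# Dump the database to a consistent, compressed snapshot file so the
# nightly file backup picks it up. --single-transaction gives a
# consistent InnoDB snapshot without locking tables for the duration.
mysqldump --defaults-extra-file=/etc/backup/my.cnf \
  --single-transaction --routines mydb \
  | gzip > /var/backups/mydb.sql.gz
```

Backing up the live data directory instead would capture files mid-write and can produce an unrestorable copy.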

I've also tried BackupPC, which was a great project, but probably not the best
fit for this case. I was running it in a virtualized container and ran into a
lot of memory issues backing up large servers. The problem was likely the
container's memory limits -- not necessarily BackupPC itself -- but I dropped
it because backups commonly took around 6-8 hours, when they didn't silently
hang on me.

------
ScottWhigham
I make local backups and then have a routine that downloads those backups.
I've relied on web hosts' backups one too many times...
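A pull-style routine like that can be a single rsync from the host to a machine you control (hostname and paths are hypothetical):

```shell
# Pull the host's backup directory down to a local copy, run from
# cron on the local machine, so the backups survive even if the
# web host loses everything. -a preserves permissions and times,
# -z compresses over the wire, --delete mirrors removals.
rsync -az --delete webhost:/var/backups/ ~/site-backups/webhost/
```

Pulling from the local side (rather than pushing from the server) also means a compromised server can't overwrite your off-host copies.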

------
adyus
Hmm, given that the question was asked on HN, there could be a good business
idea there... File and DB backups offered as SaaS.

------
latch
I actually check that I can restore from my backups once a week (it's the
first thing I do when I wake up Saturday morning).
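A cheap first-level check before the full restore drill is verifying that the newest dump exists, is non-empty, and decompresses cleanly; this catches the common silently-truncated-dump failure. A sketch (directory layout and filenames are hypothetical):

```shell
# check_dump DIR: verify the newest gzipped dump in DIR exists,
# is non-empty, and decompresses without error.
check_dump() {
  latest=$(ls -t "$1"/*.sql.gz 2>/dev/null | head -n 1)
  [ -n "$latest" ] || { echo "no dump found in $1" >&2; return 1; }
  [ -s "$latest" ] || { echo "empty dump: $latest" >&2; return 1; }
  gunzip -t "$latest" || { echo "corrupt dump: $latest" >&2; return 1; }
  echo "OK: $latest"
}
```

A real weekly test should still go further and load the dump into a scratch database and run a few queries against it.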

------
sander
There actually is a service that backs up your website over FTP regularly,
just trying to think of the name...

------
europa
I use duplicity. <http://duplicity.nongnu.org/>
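A typical duplicity setup is encrypted incremental backups to remote storage, with a periodic fresh full backup (bucket name, source path, and passphrase handling here are hypothetical):

```shell
# Encrypted incremental backup to S3; start a new full backup
# once the last one is over a month old, and keep only the two
# most recent full chains.
export PASSPHRASE="$(cat /etc/backup/passphrase)"
duplicity --full-if-older-than 1M /srv/www s3://s3.amazonaws.com/my-backups/www
duplicity remove-all-but-n-full 2 --force s3://s3.amazonaws.com/my-backups/www
```

Since duplicity encrypts with GnuPG before upload, it suits exactly the case of not trusting the remote storage provider.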

------
timdev
rsnapshot, with some scripts that prepare database dumps before each run to
ensure the MySQL data is included. Works very well.

------
Zakuzaa
I rsync my backups to bqbackup.

------
LaggedOut
Backups, who needs them ;)

