
Server-wise, I use a very simple script to dump an archive to S3: http://paulstamatiou.com/2007/07/29/how-to-bulletproof-serve...
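
For anyone who wants the gist without reading the post, a minimal sketch of that kind of archive-and-upload script in Python with boto3 (the bucket name and source paths are placeholders, not taken from the linked article):

    import tarfile
    import datetime
    import boto3  # assumes AWS credentials are configured (env vars or ~/.aws)

    BUCKET = "my-backup-bucket"      # hypothetical bucket name
    SOURCES = ["/etc", "/var/www"]   # directories worth keeping

    def backup_to_s3():
        # Name the archive by date so older backups aren't overwritten.
        stamp = datetime.date.today().isoformat()
        archive = f"/tmp/server-backup-{stamp}.tar.gz"

        with tarfile.open(archive, "w:gz") as tar:
            for path in SOURCES:
                tar.add(path)

        # One PUT per run; S3 keeps every dated archive.
        boto3.client("s3").upload_file(archive, BUCKET, f"backups/{stamp}.tar.gz")

    if __name__ == "__main__":
        backup_to_s3()

Run it from cron and you get a dated tarball in the bucket every night; pruning old archives is left as an exercise.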

But for personal computer files, I manually put important files in an S3 bucket or two. Not the most efficient, but it works. I'm considering writing some rsync-to-S3 type of thing (rough sketch below). Regardless, I love having my stuff online (and safe). My MBP doesn't have much on it other than essential apps and a few files I'm working on at the moment; most other things are online.
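
Here is a rough sketch of what the core of that rsync-to-S3 idea could look like in Python with boto3. It skips files whose local MD5 already matches the object's ETag (which only holds for non-multipart uploads), so unchanged files aren't re-sent. Bucket and directory names are made up:

    import hashlib
    import os
    import boto3

    BUCKET = "my-sync-bucket"                 # hypothetical bucket name
    ROOT = os.path.expanduser("~/Documents")  # directory to mirror

    s3 = boto3.client("s3")

    def md5_of(path):
        # Hash in 1 MB chunks so large files don't blow up memory.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def sync():
        for dirpath, _, filenames in os.walk(ROOT):
            for name in filenames:
                path = os.path.join(dirpath, name)
                key = os.path.relpath(path, ROOT)
                try:
                    # For non-multipart uploads the ETag is the MD5 hex digest.
                    etag = s3.head_object(Bucket=BUCKET, Key=key)["ETag"].strip('"')
                    if etag == md5_of(path):
                        continue  # unchanged, skip the upload
                except s3.exceptions.ClientError:
                    pass  # object doesn't exist yet, upload it
                s3.upload_file(path, BUCKET, key)

    if __name__ == "__main__":
        sync()

Unlike real rsync this re-uploads whole files rather than deltas, but for a personal-files bucket that's usually good enough.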

The only exception is media, which I sync to an external drive manually every few months.



You may be interested in Duplicity, which can do encrypted backups to S3: http://duplicity.nongnu.org/ It uses the rsync algorithm for data transfer as well, so incremental backups only send what changed.
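
A hedged example of what a Duplicity run against S3 can look like (the bucket name is invented, and the exact S3 URL scheme varies by version, so check the man page for yours):

    # Duplicity reads AWS credentials and the GnuPG passphrase from the environment.
    export AWS_ACCESS_KEY_ID=...
    export AWS_SECRET_ACCESS_KEY=...
    export PASSPHRASE=...

    # Encrypted incremental backup of a home directory to a (hypothetical) bucket.
    duplicity ~/Documents s3+http://my-backup-bucket/documents

    # Restore the latest backup to a local directory.
    duplicity restore s3+http://my-backup-bucket/documents ~/restored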

For those creating archives, you may be interested in xar: http://code.google.com/p/xar/ It preserves metadata such as SELinux information, ACLs, and EAs; on Mac OS X it also preserves the resource fork and Finder metadata. Two options are particularly relevant to backups:

--link-same turns files with identical data into hardlinks, reducing space consumption both within the archive and on disk when extracted.

--coalesce-heap stores the data only once in the archive when files with identical data are encountered, reducing the size of the archive.
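
For instance, creating and extracting an archive with those two options might look like this (paths are placeholders):

    # Create an archive; identical files become hardlinks and their data is stored once.
    xar --link-same --coalesce-heap -c -f backup.xar ~/Documents

    # List the contents, then extract, restoring metadata and resource forks.
    xar -t -f backup.xar
    xar -x -f backup.xar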



