And you dumped the entire 50 terabytes every hour? The point above was that they should have been doing a DB dump; that's not always the best way (or even possible) to deal with large data sets.

No, but why would you do that? It makes no sense when there are better backup strategies available (archived redo logs, hot standby, filesystem snapshots, etc.).
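
To make the snapshot option concrete, here is a minimal sketch, assuming the database files live on a ZFS dataset (the dataset name tank/db and the helper snapshot_backup are illustrative, not from the thread). A snapshot is near-instant and its cost does not scale with the 50 TB working set, which is why it beats an hourly full dump at this size:

    import subprocess
    from datetime import datetime, timezone

    def snapshot_backup(dataset: str = "tank/db") -> str:
        """Take a near-instant, crash-consistent ZFS snapshot of the
        database volume. Unlike a full dump, the cost does not scale
        with data size; only blocks changed afterward consume space."""
        stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
        snap = f"{dataset}@backup-{stamp}"
        # `zfs snapshot` is atomic at the filesystem level. For a
        # transactionally consistent image, checkpoint or quiesce the
        # database first (e.g. the DB's backup-mode or flush commands).
        subprocess.run(["zfs", "snapshot", snap], check=True)
        return snap

    if __name__ == "__main__":
        print("created", snapshot_backup())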
