

How do you backup your production servers? - agjmills

We're a small web development agency with around 30 dedicated servers, each hosting a varying number of websites.

All of our source code is in Git, which means each developer, the servers the code is deployed to, and the Git remote all have the full history of the code.

Our databases are backed up really poorly: a hacky script I wrote a couple of years ago dumps them to a file, which then gets committed into Git and pushed to the remote. However, it's very difficult to extract them, and a restore can take up to a day depending on the size of the database.

Files that a user uploads to a server are not backed up.
Server configuration files (nginx, Apache etc.) are not backed up.
SSL keys/certificates/other secrets are not backed up.

I want to back up on the following schedule: nightly for the last 7 days, then, before that, weekly going back up to 6 months.

The total amount of disk space we use across all servers is around 2TB.

All of the backups need to go to the same place, ideally without a single point of failure. I was thinking something like S3-style object storage?
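The retention schedule described above (nightly for the last 7 days, weekly before that up to 6 months) can be sketched as a simple pruning rule. This is only an illustration under assumptions not in the thread: "6 months" is taken as 183 days, and Sunday is an arbitrary choice for the weekly keeper.

```python
from datetime import date, timedelta

def keep_backup(backup_date: date, today: date) -> bool:
    """Decide whether a dated backup should be retained.

    Policy (from the question): keep every nightly backup from the
    last 7 days, then one weekly backup (Sundays here, an arbitrary
    choice) going back up to ~6 months (taken as 183 days).
    """
    age = (today - backup_date).days
    if age < 0:
        return False          # dated in the future: ignore
    if age <= 7:
        return True           # nightly window
    if age <= 183 and backup_date.isoweekday() == 7:
        return True           # weekly window, Sundays only
    return False

def prune(backup_dates, today):
    """Return the dates that should be deleted under the policy above."""
    return [d for d in backup_dates if not keep_backup(d, today)]
```

A cron job could run `prune` over the backup bucket's listing each night and delete the returned dates.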
======
l0f
database: master-slave

uploads: rsync

there's a Ruby gem called Backup
([https://meskyanichi.github.io/backup/v4/](https://meskyanichi.github.io/backup/v4/))
;)

~~~
agjmills
With a master-slave DB configuration, surely you need 2n db machines in order
to replicate the database?

~~~
l0f
it's one server as the master (writes) and one as a slave (read-only). you still
have to take backups as well, but with replication on top you're safer than with
backups alone.

later, you can set things up so that when one server crashes (master or slave),
a new server is automatically provisioned and the data copied over.
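For reference, the classic MySQL master-slave setup being described boils down to a few config lines plus one statement on the slave. This is a minimal sketch, assuming MySQL binary-log replication; the hostname, user, and password are placeholders, and the elided password must be filled in:

```
# master my.cnf
[mysqld]
server-id = 1
log_bin   = mysql-bin

# slave my.cnf
[mysqld]
server-id = 2
read_only = 1
```

```sql
-- on the slave, point it at the master (run once):
CHANGE MASTER TO
  MASTER_HOST = 'master.example.com',
  MASTER_USER = 'repl',
  MASTER_PASSWORD = '...';
START SLAVE;
```

Note this still isn't a backup: a bad `DROP TABLE` replicates to the slave just as faithfully as good writes do, which is why periodic dumps remain necessary.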

