Hourly Production Server Database And File Backups

Overview

It's incredibly dangerous to run a production server without setting up automatic backups. I often go with hourly database and file backups on my production servers. If anything goes wrong, we've got a recent backup to restore from.

This is pretty simple to set up with the Backup gem, which has all the features we could need, including email alerts when things go wrong.

Install the Backup gem

Log in to your server with the same user account that your app runs as. For me, this is the deploy user.

Assuming you've already set up your server, you should have Ruby installed. You can install the latest version of the Backup gem like so:

gem install backup
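
If you want to double-check that the install worked, gem list will show the version you got:

gem list backup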

Setup your Backup script

First we want to generate a new backup script:

backup generate:model --trigger production_backup

This will generate a folder called Backup in your user's home directory. It will have a couple of files inside (see the layout sketch after this list):

  • config.rb - This is your main Backup configuration file. You can read through this if you like, but you probably won't need to change it
  • log - This is where the backup logs are stored
  • models - This folder contains your scripts, specifically production_backup.rb that we just generated
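
For reference, the generated directory looks roughly like this (backup.log only appears after Backup has run at least once):

~/Backup
├── config.rb
├── log
│   └── backup.log
└── models
    └── production_backup.rb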

Now we can edit ~/Backup/models/production_backup.rb to back up our database and files, and to notify us if anything fails.

Be sure to change the parts that are in all CAPS to match your server setup.

Open this file with Vim or Nano. I'll be using Vim:

vim ~/Backup/models/production_backup.rb

Replace the contents with the following config:

# encoding: utf-8

##
# Backup Generated: production_backup
# Once configured, you can run the backup with the following command:
#
# $ backup perform -t production_backup [-c <path_to_configuration_file>]
#
Model.new(:production_backup, 'Production Backup') do
  split_into_chunks_of 250
  compress_with Gzip

  ##
  # MySQL [Database]
  #
  database MySQL do |db|
    # To dump all databases, set `db.name = :all` (or leave blank)
    db.name               = "DATABASE_NAME"
    db.username           = "DATABASE_USERNAME"
    db.password           = "DATABASE_PASSWORD"
    db.host               = "localhost"
    db.port               = 3306
    db.additional_options = ["--quick", "--single-transaction"]
    db.prepare_backup     = true # see https://github.com/backup/backup/pull/606 for more information
  end

  ## 
  # Archive our app
  #
  archive :app_archive do |archive|
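    # use_sudo runs the archive commands through sudo so files the deploy
    # user can't read are still included; under Cron this assumes the
    # deploy user can run sudo without a password prompt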
    archive.use_sudo
    archive.add '/home/deploy/MYAPP/'
  end

  ##
  # Store on Amazon S3
  #
  store_with S3 do |s3|
    s3.access_key_id = "ACCESS_KEY_ID"
    s3.secret_access_key = "SECRET_ACCESS_KEY"
    s3.region = "us-west-2"
    s3.bucket = "BUCKET_NAME"
    s3.path = "/production/database"
  end

  ##
  # Mail [Notifier]
  #
  notify_by Mail do |mail|
    mail.on_success           = false
    #mail.on_warning           = true
    mail.on_failure           = true

    mail.from                 = "no-reply@MYDOMAIN.COM"
    mail.to                   = "YOUR_EMAIL_ADDRESS"
    mail.address              = "smtp.mandrillapp.com"
    mail.port                 = 587
    mail.domain               = "YOUR_DOMAIN"
    mail.user_name            = "YOUR_SMTP_USERNAME"
    mail.password             = "YOUR_SMTP_PASSWORD"
    mail.authentication       = "login"
    mail.encryption           = :starttls
  end

end

This config dumps your MySQL database and archives your Rails app directory (including any file uploads that are stored locally). Everything is saved remotely on Amazon S3, and Backup will email us if anything goes wrong.
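
One thing to keep in mind: hourly backups pile up fast. The Backup gem can cycle old backups for you via the storage keep option, so if you only want S3 to hold the most recent backups, you could add a keep setting inside the store_with S3 block (24 here is just an example retention, not something from the config above):

  store_with S3 do |s3|
    # ...same settings as above...
    s3.keep = 24 # keep the most recent 24 backups and remove older ones
  end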

You can also set mail.on_success to true if you want to get emails for successful backups.

There's a lot of documentation for the Backup gem if you want to make any modifications. It also supports other databases like PostgreSQL, MongoDB, and Redis, as well as many other storage locations and notification methods.

Test your Backup script

This is easy. Just run the following in your terminal. It will automatically detect the script in the models folder and run it for you.

backup perform -t production_backup

You should get some output telling you what Backup is doing and, if you configured everything correctly, it will succeed and you'll have a new file stored in your S3 bucket.
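
If a run fails, the log directory from earlier is the first place to look. Assuming the default layout, this will show the most recent activity:

tail ~/Backup/log/backup.log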

Schedule Your Backups

Now we need to make this script run every hour. We're going to use Cron to do this.

So first, let's grab the executable path:

which backup
# /usr/local/bin/backup

Take note of the output of this command because you're going to use it next.

Now let's run crontab -e to set up our Cron job. Add a line at the bottom that looks like the following:

0 * * * * /bin/bash -l -c '/usr/local/bin/backup perform -t production_backup'

Be sure to replace /usr/local/bin/backup with the output of the which backup command. This will make sure the Cron job can find the executable for Backup so that it can run your script.

This line basically tells Cron to perform the backup at the top of every hour. The cron format can be kind of hard to read, so if you'd like to learn more about it, take a look here.
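
As a quick example of adjusting the schedule: if hourly turns out to be more than you need, a line like this would run the same backup once a day at 3:00 AM instead:

0 3 * * * /bin/bash -l -c '/usr/local/bin/backup perform -t production_backup'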

If editing the crontab by hand doesn't strike your fancy, you can also use the Whenever gem to manage your backup Cron job instead. It also lets you keep your cron jobs in your Git repo so you can manage them easily. Definitely worth checking out.
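
As a rough sketch, an hourly Whenever schedule for this backup might look like the following in config/schedule.rb (using the backup path we found with which backup); running whenever --update-crontab then writes the job into your crontab:

every :hour do
  command "/bin/bash -l -c '/usr/local/bin/backup perform -t production_backup'"
end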

Conclusion

With the Backup gem, it's super simple to back up your application and production data. Since your code is stored in a Git repository (I hope!), you don't need to worry about it too much. But when it comes to production data, you want to make sure it is backed up regularly, on a schedule, and saved to a remote machine in case anything happens to the server. The Backup gem handles most of this process for you with relative ease and a great amount of documentation.

Always be sure to test your backups and make sure you can safely restore from them!
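
As a rough sketch of what a restore looks like, assuming you've downloaded a backup package from S3 (and joined any split chunks back together with cat first), the tar contains the compressed database dump, which you can pipe straight into MySQL. The exact paths inside the package depend on your trigger and database names:

tar -xf production_backup.tar
gunzip -c production_backup/databases/MySQL.sql.gz | mysql -u DATABASE_USERNAME -p DATABASE_NAME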
