BackupBuddy and Amazon S3


BackupBuddy, the popular WordPress plugin for backing up your WP site, has an option to back up to Amazon S3 storage.  S3 storage is a great solution because it’s cheap, easy to use, and can be secure if you set it up right.  Unfortunately I couldn’t find any decent instructions on how to set up S3 for BackupBuddy.  The few references I found only showed how to set up wide-open S3 access, with a disclaimer that it was not very secure.

We do a fair amount of work with S3 storage here, so I worked out the details.  These are my recommendations for setting up your S3 access credentials on the Amazon AWS side.  Note that this is based on WP version 4.2.2 and BackupBuddy 6.0.1.2; these recommendations may not apply to other versions.

Your AWS account and IAM

First, of course, you need an AWS account.  You can sign up here.  In case you didn’t notice in all of the agreements and terms of service, someone gaining access to your root AWS account could not only mess with all your AWS resources, but could also spend a lot of your money.  (That’s why Amazon recommends securing the login and actually deleting the access keys for your root account!)  What you need is an “IAM” user, which is sort of a proxy for your root account, except that you can limit what it can do.  Read about IAM here.

Bottom line: in BackupBuddy’s “Remote Destinations” setup, you should never use your AWS root account access keys.  Create an IAM user with limited privileges and use that user’s access keys to do your backups.
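
If you’d rather script this step than click through the console, here’s a minimal sketch using Python and the boto3 library.  It assumes you run it with credentials that are allowed to manage IAM (not the new user’s own keys), and the user name “backupbuddy-mysite” is just an example; one user per site is a good habit.

# Sketch only: create a dedicated IAM user for BackupBuddy and generate its
# access keys. Requires boto3 and admin-level credentials for the IAM calls.
import boto3

iam = boto3.client("iam")

# "backupbuddy-mysite" is an example name; use one IAM user per site.
iam.create_user(UserName="backupbuddy-mysite")

# Generate the access key pair you'll paste into BackupBuddy's
# Remote Destinations settings. The secret is only shown once, so save it.
keys = iam.create_access_key(UserName="backupbuddy-mysite")
print("Access key ID:    ", keys["AccessKey"]["AccessKeyId"])
print("Secret access key:", keys["AccessKey"]["SecretAccessKey"])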

Setting up minimum privileges for your IAM user

Amazon S3 objects (for example, your backups) are saved into “buckets.”  Access to each bucket can be controlled individually, and that’s what I recommend: each site you’re backing up should have its own S3 bucket, and its IAM user should have privileges ONLY for that bucket.  I also recommend creating your S3 buckets manually in the AWS console so you don’t have to give your IAM user the ability to create buckets.  Then the minimum S3 privileges your IAM user needs are:

  • s3:ListBucket (for your bucket)
  • s3:PutObject (for items in your bucket)
  • s3:GetBucketLocation (for your bucket)

I don’t think you really need s3:GetBucketLocation to run BackupBuddy backups, but if you leave it out, you’ll get an error when you click the “test” button on BackupBuddy’s Remote Destinations page.  The test button is your friend; it can save you tons of time troubleshooting.  (Note: the test button works by trying to create a small test file in your S3 bucket.  It won’t be able to delete that file because of the tight rules I’m recommending; don’t worry, that’s not a problem.)
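
If you want to double-check the credentials outside of WordPress, here’s a rough boto3 sketch that exercises the same three privileges, which is roughly what the test button does.  The bucket name, object key, and access keys are placeholders; run it with the IAM user’s keys, not your root keys.

# Sketch only: a quick sanity check of the three S3 privileges listed above,
# run with the IAM user's access keys (placeholders below).
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_IAM_ACCESS_KEY_ID",          # the IAM user's keys,
    aws_secret_access_key="YOUR_IAM_SECRET_ACCESS_KEY",  # not your root keys
)

bucket = "your-backup-bucket"  # placeholder: use your real bucket name

s3.put_object(Bucket=bucket, Key="permission-test.txt", Body=b"test")  # s3:PutObject
s3.list_objects_v2(Bucket=bucket)                                      # s3:ListBucket
s3.get_bucket_location(Bucket=bucket)                                  # s3:GetBucketLocation

print("All three calls succeeded; the permissions look right.")
# Like BackupBuddy's own test file, permission-test.txt can't be deleted with
# these credentials -- remove it from the AWS console if it bothers you.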

Setting up a limited-access policy can be tricky.  In the IAM Management Console, create your IAM user, then click “Create User Policy” and enter this JSON policy code (using your own bucket name, of course):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::<bucket_name_here>"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::<bucket_name_here>/*"
            ]
        }
    ]
}
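
If you’d rather attach the policy from a script, here’s a boto3 sketch that does the same thing with an inline user policy; it’s equivalent to pasting the JSON into “Create User Policy” in the console.  The user name, policy name, and bucket name are all placeholders, and you’d run this with admin credentials.

# Sketch only: attach the policy above to the IAM user as an inline policy.
# Names are placeholders; run with credentials allowed to manage IAM.
import json
import boto3

iam = boto3.client("iam")

bucket = "your-backup-bucket"  # placeholder: your real bucket name

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": [f"arn:aws:s3:::{bucket}"],
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        },
    ],
}

iam.put_user_policy(
    UserName="backupbuddy-mysite",        # the IAM user created earlier
    PolicyName="backupbuddy-s3-minimal",  # any descriptive name works
    PolicyDocument=json.dumps(policy),
)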

Notice that the s3:ListBucket and s3:GetBucketLocation actions are allowed for the bucket, but the s3:PutObject action must be allowed for items inside the bucket, indicated by the trailing “/*”.  For security reasons, this policy prevents reading or deleting items once they are created in the bucket.

Deleting old backups automatically

Since the policy prevents deleting existing backups, BackupBuddy won’t be able to limit the number of backups by deleting old ones.  It’s best to avoid any settings that will make it try.

For deleting (or archiving) old backups, the Amazon S3 “Lifecycle Management” rules work well and are easy to set up.  For example, you can set up a rule to delete backups older than 90 days, and AWS will take care of it for you.  Setting up a rule to archive some or all of your backups to “Glacier” storage can provide additional security.
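
Lifecycle rules can also be set up programmatically.  Here’s a boto3 sketch of one possible rule: move backups to Glacier after 30 days and delete them after 90.  The bucket name and day counts are just examples, and you’d run this with your own admin credentials, not the BackupBuddy user’s, since that user can’t manage the bucket.

# Sketch only: one example lifecycle rule (30 days to Glacier, delete at 90).
# Bucket name and day counts are placeholders; adjust to taste.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="your-backup-bucket",  # placeholder
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-backups",
                "Filter": {"Prefix": ""},  # apply to every object in the bucket
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 90},
            }
        ]
    },
)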
