How to Automate VPS Backups to an External Server or Amazon S3 Bucket

Backing up your Virtual Private Server (VPS) is essential to protect your data from hardware failures, software issues, or accidental deletions. Automating this process ensures consistency and eliminates the risk of forgetting to perform manual backups. In this detailed guide, you’ll learn how to automate VPS backups to two destinations: an external server (such as another VPS or a NAS) and an Amazon S3 bucket (a scalable cloud storage solution). We’ll cover the setup, scripting, scheduling, and additional considerations like encryption and retention.

Introduction

A Virtual Private Server (VPS) is a virtualized machine provided by hosting companies, offering the flexibility of a dedicated server at a lower cost. Whether you’re hosting websites, applications, or databases, regular backups are critical to safeguard your data. Automating backups means scheduling them to run at intervals without manual intervention, ensuring your data is always protected.
This guide provides step-by-step instructions to:
  1. Back up your VPS to an external server using SSH for secure file transfers.
  2. Back up your VPS to an Amazon S3 bucket using the AWS Command Line Interface (CLI).
  3. Enhance your backup process with encryption, retention policies, and verification.
We’ll focus on full backups for simplicity, though advanced users may explore incremental or differential backups for efficiency.

Prerequisites

Before you begin, ensure you have:
  • A VPS running a Linux distribution (e.g., Ubuntu, CentOS).
  • An external server with SSH access or an AWS account with an S3 bucket created.
  • Basic knowledge of Linux commands and shell scripting.
  • Root or sufficient permissions to install software and configure cron jobs on your VPS.
  • Optional: A database (e.g., MySQL) if you need to back up database content.

Backing Up to an External Server

In this section, we’ll set up automated backups to an external server using SSH for secure file transfers.

Step 1: Setting Up SSH Access

To transfer files securely, configure SSH key-based authentication between your VPS and the external server.
  1. Generate an SSH key pair on your VPS (skip if already done):
    bash
    ssh-keygen -t rsa -b 4096
    Press Enter to accept the default file location (/root/.ssh/id_rsa). Since this key will be used by an unattended cron job, leave the passphrase empty; a passphrase-protected key would require interactive input on every run.
  2. Copy the public key to the external server:
    bash
    ssh-copy-id user@external_server
    Replace user@external_server with your username and the external server’s IP address or hostname.
  3. Test the SSH connection:
    bash
    ssh user@external_server
    You should log in without a password prompt. If successful, proceed.
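Because the backup will eventually run from cron with nobody at the keyboard, it is also worth confirming that the key works with password prompts disabled entirely. One way to check (the hostname is the same placeholder used above):
bash
ssh -o BatchMode=yes user@external_server 'echo "key-based login OK"'
With BatchMode enabled, ssh fails immediately instead of falling back to a password prompt, which is exactly how it will behave under cron.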

Step 2: Creating the Backup Script

We’ll create a script to archive your data and transfer it to the external server.
  1. Create a temporary backup directory:
    bash
    mkdir -p /backup
  2. Create the backup script: Open a file (e.g., /usr/local/bin/backup_to_external.sh) in a text editor like nano or vim:
    bash
    nano /usr/local/bin/backup_to_external.sh
    Add the following content:
    bash
    #!/bin/bash

    # Backup script for VPS to external server

    # Variables
    BACKUP_DIR="/backup"
    DATE=$(date +%Y%m%d)
    ARCHIVE_NAME="vps_backup_$DATE.tar.gz"
    EXTERNAL_SERVER="user@external_server:/path/to/backup/dir"

    # Dump databases (if applicable)
    mysqldump -u db_user -p'db_password' database_name > $BACKUP_DIR/db_dump.sql

    # Create archive
    tar -czf $BACKUP_DIR/$ARCHIVE_NAME /var/www /etc $BACKUP_DIR/db_dump.sql

    # Transfer to external server
    scp $BACKUP_DIR/$ARCHIVE_NAME $EXTERNAL_SERVER

    # Clean up
    rm $BACKUP_DIR/$ARCHIVE_NAME $BACKUP_DIR/db_dump.sql
    Explanation:
    • BACKUP_DIR: Temporary directory for backup files.
    • DATE: Timestamp for unique backup filenames (e.g., vps_backup_20231015.tar.gz).
    • ARCHIVE_NAME: Name of the compressed backup file.
    • EXTERNAL_SERVER: Destination in the format user@host:/path.
    • mysqldump: Exports a MySQL database (skip or modify for other databases like PostgreSQL with pg_dump).
    • tar -czf: Archives and compresses the specified directories (/var/www, /etc) and database dump.
    • scp: Securely copies the archive to the external server.
    • rm: Removes temporary files after transfer.
  3. Customize the script:
    • Update EXTERNAL_SERVER with your server’s details.
    • Replace db_user, db_password, and database_name with your database credentials (skip if no database).
    • Adjust the directories to back up (e.g., /var/www for websites, /etc for configurations).
  4. Make the script executable:
    bash
    chmod +x /usr/local/bin/backup_to_external.sh
  5. Test the script:
    bash
    /usr/local/bin/backup_to_external.sh
    Verify that the backup file appears on the external server.
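If you want the script itself to prove that the copy on the external server matches the local archive before it deletes anything, a checksum comparison is one option. This is a sketch you could add just before the clean-up line; the remote path is the same placeholder used in EXTERNAL_SERVER:
bash
# Compare local and remote checksums before removing the local copy
LOCAL_SUM=$(sha256sum $BACKUP_DIR/$ARCHIVE_NAME | awk '{print $1}')
REMOTE_SUM=$(ssh user@external_server "sha256sum /path/to/backup/dir/$ARCHIVE_NAME" | awk '{print $1}')
if [ "$LOCAL_SUM" != "$REMOTE_SUM" ]; then
    echo "Checksum mismatch for $ARCHIVE_NAME" >&2
    exit 1
fi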

Step 3: Scheduling the Script

Use cron to automate the script execution.
  1. Open the crontab editor:
    bash
    crontab -e
  2. Add a schedule: For daily backups at 2 AM, add:
    bash
    0 2 * * * /usr/local/bin/backup_to_external.sh
    Save and exit. Cron syntax: minute hour day_of_month month day_of_week command.
Your backups will now run automatically every day at 2 AM.
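Cron discards the script's output unless you capture it, so it can help to append each run's output to a log file for troubleshooting. For example (the log path is just a suggestion):
bash
0 2 * * * /usr/local/bin/backup_to_external.sh >> /var/log/backup_external.log 2>&1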

Backing Up to an S3 Bucket

In this section, we’ll automate backups to an Amazon S3 bucket using the AWS CLI.

Step 1: Setting Up AWS CLI

  1. Install AWS CLI on your VPS:
    • For Ubuntu:
      bash
      sudo apt-get update
      sudo apt-get install awscli
    • For CentOS:
      bash
      sudo yum install awscli
  2. Configure AWS CLI: Run:
    bash
    aws configure
    Enter:
    • AWS Access Key ID and Secret Access Key (from your AWS IAM user).
    • Default region name (e.g., us-east-1).
    • Default output format (e.g., json).
  3. Verify the setup: List your S3 buckets to confirm:
    bash
    aws s3 ls
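As a security note, the access keys stored on the VPS do not need full AWS permissions. Below is a rough, least-privilege sketch of an IAM policy scoped to the backup bucket, applied from an administrator account; the user name, policy name, and bucket name are placeholders, and you can paste the same JSON into the AWS Console instead:
bash
# Hypothetical names: backup-user, vps-backup-s3, your-bucket-name
cat > /tmp/backup-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": ["s3:ListBucket"], "Resource": "arn:aws:s3:::your-bucket-name" },
    { "Effect": "Allow", "Action": ["s3:PutObject", "s3:DeleteObject"], "Resource": "arn:aws:s3:::your-bucket-name/*" }
  ]
}
EOF
aws iam put-user-policy --user-name backup-user --policy-name vps-backup-s3 --policy-document file:///tmp/backup-policy.json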

Step 2: Creating the Backup Script

  1. Ensure the backup directory exists:
    bash
    mkdir -p /backup
  2. Create the backup script: Open /usr/local/bin/backup_to_s3.sh:
    bash
    nano /usr/local/bin/backup_to_s3.sh
    Add:
    bash
    #!/bin/bash 
    
    # Backup script for VPS to S3 bucket 
    
    # Variables
    BACKUP_DIR="/backup"
    DATE=$(date +%Y%m%d)
    ARCHIVE_NAME="vps_backup_$DATE.tar.gz"
    S3_BUCKET="s3://your-bucket-name"

    # Dump databases (if applicable)
    mysqldump -u db_user -p'db_password' database_name > $BACKUP_DIR/db_dump.sql

    # Create archive
    tar -czf $BACKUP_DIR/$ARCHIVE_NAME /var/www /etc $BACKUP_DIR/db_dump.sql

    # Upload to S3
    aws s3 cp $BACKUP_DIR/$ARCHIVE_NAME $S3_BUCKET

    # Clean up
    rm $BACKUP_DIR/$ARCHIVE_NAME $BACKUP_DIR/db_dump.sql
    Explanation:
    • S3_BUCKET: Your S3 bucket URL (e.g., s3://my-backup-bucket).
    • aws s3 cp: Uploads the archive to S3.
    • Other parts mirror the external server script.
  3. Customize the script:
    • Replace your-bucket-name with your S3 bucket name.
    • Update database credentials and directories as needed.
  4. Make it executable:
    bash
    chmod +x /usr/local/bin/backup_to_s3.sh
  5. Test the script:
    bash
    /usr/local/bin/backup_to_s3.sh
    Check your S3 bucket to confirm the file uploaded.
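The plain aws s3 cp call in the script is enough, but the same command also accepts flags for server-side encryption at rest and cheaper storage tiers if you want them. A hedged example, using the same placeholders as the script:
bash
# Upload with S3-managed encryption at rest and the Standard-IA storage class
aws s3 cp $BACKUP_DIR/$ARCHIVE_NAME $S3_BUCKET --sse AES256 --storage-class STANDARD_IA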

Step 3: Scheduling the Script

  1. Edit the crontab:
    bash
    crontab -e
  2. Schedule daily backups: Add:
    bash
    0 2 * * * /usr/local/bin/backup_to_s3.sh
Your VPS will now back up to S3 every day at 2 AM.
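If a run ever takes longer than a day (a large site or a slow uplink), two backups could overlap. Assuming flock from util-linux is available on your VPS, one common guard is to wrap the cron command so a new run is skipped while the previous one is still going:
bash
0 2 * * * flock -n /tmp/backup_to_s3.lock /usr/local/bin/backup_to_s3.sh >> /var/log/backup_s3.log 2>&1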

Additional Considerations

Encrypting Backups

For sensitive data, encrypt backups before transfer.
  1. Generate a GPG key:
    bash
    gpg --gen-key
    Follow prompts to create a key tied to your email.
  2. Modify the script: Replace the tar line with:
    bash
    tar -czf - /var/www /etc $BACKUP_DIR/db_dump.sql | gpg --encrypt --recipient your_email@example.com > $BACKUP_DIR/$ARCHIVE_NAME.gpg
    Update the transfer/upload command to use $ARCHIVE_NAME.gpg.
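To restore from an encrypted archive, reverse the pipeline on a machine that holds the matching private key. Roughly (the target directory is a placeholder):
bash
# Decrypt and unpack an encrypted backup; gpg prompts for the key's passphrase if it has one
gpg --decrypt vps_backup_20231015.tar.gz.gpg | tar -xzf - -C /restore/target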

Managing Backup Retention

To prevent storage overflow, delete old backups.
  • External Server: Add to the script:
    bash
    ssh user@external_server "find /path/to/backup/dir -type f -mtime +7 -delete"
    This removes files older than 7 days.
  • S3 Bucket: Use an S3 lifecycle rule via the AWS Console to delete objects older than 7 days, or script it with:
    bash
    aws s3 ls $S3_BUCKET | awk '{print $4}' | while read file; do
        backup_date=$(echo $file | cut -d_ -f3 | cut -d. -f1)
        if [[ $(date -d "$backup_date" +%s) -lt $(date -d "7 days ago" +%s) ]]; then
            aws s3 rm $S3_BUCKET/$file
        fi
    done
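If you prefer to set the lifecycle rule from the command line rather than the Console, something along these lines should work (the bucket name and rule ID are placeholders; adjust the day count as needed):
bash
# Expire objects in the backup bucket after 7 days
cat > /tmp/lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-old-vps-backups",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 7 }
    }
  ]
}
EOF
aws s3api put-bucket-lifecycle-configuration --bucket your-bucket-name --lifecycle-configuration file:///tmp/lifecycle.json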

Verifying Backups

Test your backups periodically:
  • Download a backup file.
  • Decrypt it (if encrypted) with gpg --decrypt.
  • Extract it with tar -xzf and restore to a test environment.
  • Ensure all data (files, databases) is intact.
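For the S3 case, a small script can automate the first integrity check for unencrypted archives. The sketch below pulls the newest backup and makes sure tar can read it end to end (the bucket name and temporary path are placeholders):
bash
# Download the newest backup and confirm the archive is readable
BUCKET="s3://your-bucket-name"
LATEST=$(aws s3 ls $BUCKET/ | sort | tail -n 1 | awk '{print $4}')
aws s3 cp $BUCKET/$LATEST /tmp/$LATEST
tar -tzf /tmp/$LATEST > /dev/null && echo "$LATEST looks intact"
A full restore to a test environment remains the only real proof, but this catches truncated or corrupted uploads early.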

Conclusion

Automating VPS backups to an external server or an S3 bucket ensures your data is consistently protected. With SSH for external servers and AWS CLI for S3, you’ve set up a reliable system that runs daily at 2 AM. Enhance security with encryption, manage storage with retention policies, and verify backups by testing restores.
This guide focuses on full backups for simplicity. For larger datasets, explore tools like duplicity or rsnapshot for incremental backups. Regularly test your restore process to confirm your backups are usable—peace of mind is worth the effort!
