How to periodically back up your website and MySQL database to a local server as well as an S3 bucket using a Bash script (video)

You might have run into a situation where you need a simple script to back up your website and database server. In this article, I will explain how you can periodically back up your MySQL database and website to a local server as well as an S3 bucket using a Bash script. The reason I am writing the backup script in Bash is that Bash is flexible and well suited to this kind of glue work, so we don't need any other programming language.

Video Demonstration:

This is a video demonstration of this article. Please subscribe, like and comment 🙂


Step 1: Let's write the Bash script:

Let's write the backup script. Actually, you don't have to write one, because I have written one for you 🙂 . Create a file backup.sh in your home directory and save the code given below (choose the version that fits your needs). In short, the script creates a tar archive of your website directory together with a database dump. Set the path of the website content and the database details in the webapp_path and database_* variables respectively. Backups are created in the directory named by the backup_dir variable, with a retention period of retention_days days.

Back up your website and MySQL database to the local server only:

This script backs up your website and database to the local server only, not to an S3 bucket.

#!/bin/bash

###################################
backup_dir="/home/ubuntu/backup/"
webapp_path="/var/www/html"
database_name="testdb"
database_user="dbuser"
database_pwd="password"
database_host="db_host_name.net"
retention_days=10
##################################

date=$(date +%d-%m-%y)
path="$backup_dir$date"
echo "$date"

if mkdir -p "$path" > /dev/null 2>&1; then
    echo "-Successfully created directory $path"

    if mysqldump -u "$database_user" -p"$database_pwd" -h "$database_host" "$database_name" > "$path/db_backup.sql"; then
        echo "-Successfully created database dump"

        # Archive the website directory together with the database dump.
        if tar -czf "$path/backup_with_db.tar.gz" "$webapp_path" "$path/db_backup.sql" > /dev/null 2>&1; then
            echo "-Successfully completed file + db backup process"
            rm -f "$path/db_backup.sql"

            # Remove the backup that has aged out of the retention window.
            old_date=$(date --date="$retention_days days ago" +%d-%m-%y)
            old_path="$backup_dir$old_date"

            if [ -d "$old_path" ]; then
                if rm -rf "$old_path" > /dev/null 2>&1; then
                    echo "-Successfully removed old backup from $old_date"
                else
                    echo "-Failed to remove old backup $old_path" && exit 1
                fi
            fi
        else
            echo "-Failed file + db backup process" && exit 1
        fi
    else
        echo "-Failed creating database dump, backup process failed" && exit 1
    fi
else
    echo "-Failed creating directory $path, backup process failed" && exit 1
fi
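The retention cleanup above relies on GNU date's relative-date arithmetic. A quick way to sanity-check the directory name the script will delete is to pin the computation to a fixed reference date, so the result is reproducible (2019-01-30 below is just an example date, not something the script uses):

```shell
# GNU date: compute the backup directory name that falls outside the
# retention window, relative to a fixed reference date.
retention_days=10
old_date=$(date --date="2019-01-30 ${retention_days} days ago" +%d-%m-%y)
echo "$old_date"   # prints 20-01-19
```

When the script runs for real, the same expression is evaluated against today's date, so the directory deleted is always exactly retention_days days old.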

 

Back up your website and MySQL database to the local server as well as an S3 bucket:

This script backs up your website and database to the local server as well as an S3 bucket. You need to configure the AWS CLI (awscli) on your server before running this script. Learn how to configure the AWS CLI from here >>

Set the name of the S3 bucket you want to back up the data to in the variable s3_bucket_name.
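If you have not configured the AWS CLI yet, running `aws configure` writes your credentials to `~/.aws/credentials` and your default region to `~/.aws/config`. As a rough sketch, the resulting files look like this (the values below are placeholders, not real credentials):

```ini
# ~/.aws/credentials -- placeholder values, substitute your own keys
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
```

The IAM user behind these keys needs permission to write to and list the backup bucket for the `aws s3 sync` step to succeed.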


#!/bin/bash

###################################
backup_dir="/home/ubuntu/backup/"
webapp_path="/var/www/html"
database_name="testdb"
database_user="dbuser"
database_pwd="password"
database_host="db_host_name.net"
s3_bucket_name="your-s3-bucket-name"
retention_days=10
##################################

date=$(date +%d-%m-%y)
path="$backup_dir$date"
echo "$date"

if mkdir -p "$path" > /dev/null 2>&1; then
    echo "-Successfully created directory $path"

    if mysqldump -u "$database_user" -p"$database_pwd" -h "$database_host" "$database_name" > "$path/db_backup.sql"; then
        echo "-Successfully created database dump"

        # Archive the website directory together with the database dump.
        if tar -czf "$path/backup_with_db.tar.gz" "$webapp_path" "$path/db_backup.sql" > /dev/null 2>&1; then
            echo "-Successfully completed file + db backup process"
            rm -f "$path/db_backup.sql"

            # Mirror the local backup directory to S3; --delete keeps the
            # bucket in sync with the local retention policy.
            if aws s3 sync "$backup_dir" "s3://$s3_bucket_name/backup/" --delete; then
                echo "AWS syncing completed"
            else
                echo "AWS syncing failed, but the backup is still present on the local server" && exit 1
            fi

            # Remove the backup that has aged out of the retention window.
            old_date=$(date --date="$retention_days days ago" +%d-%m-%y)
            old_path="$backup_dir$old_date"

            if [ -d "$old_path" ]; then
                if rm -rf "$old_path" > /dev/null 2>&1; then
                    echo "-Successfully removed old backup from $old_date"
                else
                    echo "-Failed to remove old backup $old_path" && exit 1
                fi
            fi
        else
            echo "-Failed file + db backup process" && exit 1
        fi
    else
        echo "-Failed creating database dump, backup process failed" && exit 1
    fi
else
    echo "-Failed creating directory $path, backup process failed" && exit 1
fi

Step 2: Set a cron job to run the Bash script:

So we have the script ready. Let's run it daily by setting up a cron job. Run the command crontab -e and append the line shown below.

0 1 * * * bash /path_to_the_script/backup.sh > /path_to_the_log/backup.log 2>&1

Here we have scheduled the cron job to run the script daily at 1 AM. The script logs both stdout and stderr to /path_to_the_log/backup.log so that you can check the file in case of any failure.
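The `2>&1` at the end of the cron line is what routes stderr into the same log file as stdout. A minimal demonstration of that redirection:

```shell
# "> file 2>&1" sends stdout to the file, then points stderr at the
# same destination, so both streams land in one log.
log=$(mktemp)
{ echo "normal output"; echo "error output" >&2; } > "$log" 2>&1
cat "$log"
rm -f "$log"
```

This is why the mysqldump password warning (which goes to stderr) shows up in the sample log below alongside the script's own echo messages.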

Ideally the log file will look like this:

30-01-19
-Successfully created directory /home/ubuntu/backup/30-01-19
mysqldump: [Warning] Using a password on the command line interface can be insecure.
-Successfully created database dump
-Successfully completed file + db backup process
upload: backup/29-01-19/backup_with_db.tar.gz to s3://easyaslinux/backup/29-01-19/backup_with_db.tar.gz
upload: backup/30-01-19/backup_with_db.tar.gz to s3://easyaslinux/backup/30-01-19/backup_with_db.tar.gz
AWS syncing completed

 

If you need a mail notification when the backup script fails, modify your cron job as below.

0 1 * * * bash /path_to_the_script/backup.sh > /path_to_the_log/backup.log 2>&1 || mail -s "Backup script has failed" something@gmail.com < /path_to_the_log/backup.log

 

If you get the following error in the log file even after the AWS CLI is configured, it means the aws binary is not on crontab's $PATH.

aws: command not found

The simple fix is to first find the path of the aws binary:

# which aws
/usr/local/bin/aws

Then add the line PATH=$PATH:/usr/local/bin/ at the beginning of the backup script (just after the shebang).
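For example, the top of the script would then start like this, and you can verify the directory actually made it onto $PATH (a sketch; substitute /usr/local/bin with whatever path `which aws` reported on your system):

```shell
#!/bin/bash
# Extend cron's minimal PATH so the aws binary can be found.
PATH="$PATH:/usr/local/bin"

# Sanity check: confirm the directory is now on PATH.
case ":$PATH:" in
    *:/usr/local/bin:*) echo "/usr/local/bin is on PATH" ;;
    *) echo "/usr/local/bin is missing from PATH" ;;
esac
```

Cron runs jobs with a stripped-down environment (often just /usr/bin:/bin), which is why a command that works fine in your interactive shell can fail with "command not found" under cron.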

You can list the backups created in the S3 bucket by running the command below.

aws s3 ls s3://s3_bucket_name/backup

Thanks for taking the time to read my blog. Subscribe to this blog so that you don't miss out on anything useful (check out the right sidebar for the subscription form). Please also share your thoughts on this article in the comments.

 
