Shell script to backup websites / mysql database

So it has been a while since my last update. I moved my web server back and forth between Linode and local hosting over the last two months. At the end of the day, I ended up moving my Unraid server to a colo in Pittsburgh for a really good monthly rate, and it hosts my website among other things.

To move my website back and forth with ease, I needed to put together a shell script to back up the MySQL database and website files to gzipped tar files. Behold… my script!

#!/bin/bash
export PATH=/bin:/usr/bin:/usr/local/bin

# Update the below variables for your situation
DB_BACKUP_PATH='BACKUPDIR_FOR_MYSQL'
MYSQL_HOST='localhost'
MYSQL_PORT='3306'
MYSQL_USER='MYSQL_USER'
MYSQL_PASSWORD='MYSQL_PASSWORD'
DATABASE_NAME='MYSQL_DATABASE_NAME'
BACKUP_RETAIN_DAYS=20   ## Number of days to keep local backup copy
BACKUP_DIR="BACKUPDIR_FOR_WEBSERVER_FILES"
TODAY=$(date +"%d%b%Y")

# Mysql Backup
mkdir -p ${DB_BACKUP_PATH}/${TODAY}
echo "Backup started for database - ${DATABASE_NAME}"

mysqldump -h ${MYSQL_HOST} \
-P ${MYSQL_PORT} \
-u ${MYSQL_USER} \
-p${MYSQL_PASSWORD} \
${DATABASE_NAME} | gzip > ${DB_BACKUP_PATH}/${TODAY}/${DATABASE_NAME}-${TODAY}.sql.gz

if [ $? -eq 0 ]; then
        echo "Database backup successfully completed"
else
        echo "Error found during backup"
        exit 1
fi


##### Mysql - Remove backups older than {BACKUP_RETAIN_DAYS} days  #####

DBDELDATE=$(date +"%d%b%Y" --date="${BACKUP_RETAIN_DAYS} days ago")

if [ ! -z "${DB_BACKUP_PATH}" ]; then
        cd "${DB_BACKUP_PATH}"
        if [ ! -z "${DBDELDATE}" ] && [ -d "${DBDELDATE}" ]; then
                rm -rf "${DBDELDATE}"
        fi
fi


# Website File Backup
# Make sure the backup directory exists.
sudo mkdir -p "$BACKUP_DIR"

# To give each website backup a separate name, use the format below. Make sure you update BACKUPNAME.
sudo tar -zcvpf "$BACKUP_DIR/BACKUPNAME-$TODAY.tar.gz" /var/www/

# Delete website backups older than ${BACKUP_RETAIN_DAYS} days
sudo find "$BACKUP_DIR" -mindepth 1 -mtime +${BACKUP_RETAIN_DAYS} -exec rm -r {} \;

So this is a Frankenstein of two different scripts I found online, so yeah, I could have cleaned up the variables, but it's working fine for me at this time. If it works, it works, I guess.

So what this script does is dump the SQL database to a .sql file and gzip it into the directory set in the "DB_BACKUP_PATH" variable. It then tars and gzips the /var/www directory into the backup directory. I then added this shell script to crontab to automate the backup process.
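For reference, a crontab entry along these lines would do the trick. The 2 AM schedule, script path, and log file here are just example values, not necessarily what I use:

```shell
# Edit the root crontab with: crontab -e
# (root avoids sudo password prompts for the mkdir/tar/find steps)
#
# m  h  dom mon dow  command
0 2 * * * /bin/bash /root/website-backup.sh >> /var/log/website-backup.log 2>&1
```

Redirecting stdout and stderr to a log file makes it easy to check later whether a nightly run failed.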

The backup directory is saved to my Unraid share and also rclone-synced up to Google Drive, so I have an off-site copy if my server decides to go belly up. Remember, 2 is 1 and 1 is none!
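If you want to set up something similar, the rclone side looks roughly like this. The remote name "gdrive" and both paths are placeholders for your own setup:

```shell
# One-time setup: create a Google Drive remote interactively
# rclone config

# One-way sync of the local backup share to Google Drive
# (sync makes the remote match the local side, deletions included)
rclone sync /mnt/user/backups gdrive:website-backups --log-file /var/log/rclone-backup.log
```

Note that `rclone sync` propagates deletions, so the retention cleanup in the backup script will also prune the Google Drive copy; use `rclone copy` instead if you want the remote to keep everything.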

Hopefully this script helps point you in the right direction on how to back up your WordPress site or website files.