Simple offsite FTP/SFTP backups script

Data backup and restore policies are critical to any business operation. Some hosting providers offer a complimentary emergency backup service for their clients, or offer backup storage at additional cost. This is great to have in an emergency, as those backups can usually be restored in the least amount of time. However, they are often (though not always) kept in the same physical location as the servers, providing none of the needed physical redundancy. Sometimes complimentary backups require additional configuration, so don't assume that because the web host's offer says they're included on your invoice you have all the backups you need. Hosting companies' emergency backups are usually taken infrequently; once a week seems to be the norm, which is a problem when the latest backup available for restore is six days old. These backups often operate on large data sets or entire virtual disk images, making it hard to get a consistent point-in-time backup without causing service disruption, and raising questions about their consistency and reliability, assuming a restore is even possible in the first place. If your server is terminated over a billing or abuse issue, chances are the backups are too. If a fire or natural disaster strikes the data center, if the hosting account or company is compromised and all your server data and backups are gone, or if everything is hit with ransomware, will you be able to recover and bring things back online? Do you think relying on the hosting provider's backups is sufficient?

We strongly encourage putting proper backup and recovery policies in place and testing them. One of the simplest solutions for reliable, automated backups with redundancy, which anyone can implement in minutes, is to periodically back up and sync to an FTP/SFTP account in a separate location. A low-budget option could be another server you already have elsewhere with some spare space, or FTP backup storage purchased from one of the many providers out there. The advantage of this solution is its simplicity and availability: you can access your data from anywhere at any time, and you don't need client or agent software and a license, as many incremental backup solutions do, in order to reach the data when you have to restore onto a new server. The disadvantage of FTP backups is that they consume more network bandwidth than incremental solutions, which is why the SFTP option is recommended: rsync's synchronization makes this method considerably more bandwidth-efficient. Many server hosting control panels, cPanel for example, have built-in, intuitive options for enabling automated FTP/SFTP backups; all you need to do is enable them, set the credentials and server for remote backup storage, choose the schedule and what to back up, and you're done in seconds.

If you're not running a control panel, it's just as quick to set up with a little help from the simple, configurable FTP backup shell script included below. It stores the backup locally on the server (for the fastest restore, where that is still possible) and syncs it to the offsite storage location; it can easily be tuned to do only one of the two. Other budget storage options, such as S3 or Dropbox, could also be used instead of FTP/SFTP. The script relies on OS packages to perform the database backups, compression, and FTP or SFTP sync, so we need to ensure those are installed for it to work correctly. This can be done as follows, keeping in mind that lftp is only needed for the FTP storage type, and sshpass and rsync for the SFTP type. The script will run on any major Linux/BSD distribution.

CentOS/RedHat

yum install -y sshpass rsync lftp mysql

Debian/Ubuntu

apt-get install -y sshpass rsync lftp mysql-client
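
Regardless of distribution, a quick sanity check before scheduling anything is to confirm the tools are actually on the PATH (lftp matters only for the FTP type, sshpass and rsync for SFTP):

# each name that is found prints as a path; a missing one prints nothing and the exit code is non-zero
command -v tar gzip mysql mysqldump rsync sshpass lftp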

The backup script needs to be placed on the server with correct permissions first and configured for operation. It contains sensitive information such as user credentials, and it needs root-level access to run so it can read all files, so /root/bin/backup.sh might be a good choice.
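
For example, assuming the /root/bin location suggested above:

mkdir -p /root/bin
vi /root/bin/backup.sh   # paste the script below, then save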

backup.sh - Offsite FTP/SFTP backup script

#!/bin/bash
# https://skylineservers.com
  
# CONFIGURATION START
RETENTION=7 # how many days back to store the backup for
WHAT_TO_BACKUP="/home /etc /root" # directories on the server to be included in backup archive
MYSQLBACKUP=1 # 0/1 disable/enable mysql backup for all databases
DBUSER="username" # mysql username used for backup
DBPASSWORD="password" # mysql user password used for backup
DBSERVER="127.0.0.1" # mysql server ip/hostname
TYPE=SFTP # offsite backup type; possible options are FTP and the recommended SFTP
SERVER="server.com" # offsite storage server hostname/ip
USERNAME="username" # offsite storage server username
PASSWORD="password" # offsite storage server user password
SFTPD_PORT="22" # option for specifying a custom SSH port on the offsite server
BACKUPDIR=/backup # local server directory to keep local backups in for fast restore
# CONFIGURATION STOP
   
DATESTAMP=$(date +%d%m%y)
 
# create directory structure; abort if it cannot be created
/bin/mkdir -p "${BACKUPDIR}/${DATESTAMP}" || exit 1
  
if [ "$MYSQLBACKUP" = 1 ]; then
 # dump each database separately, skipping information_schema/performance_schema
 for i in $(mysql -u "$DBUSER" -p"$DBPASSWORD" -h "$DBSERVER" -Bse 'show databases;' | grep -v _schema); do
  mysqldump -u "$DBUSER" -p"$DBPASSWORD" -h "$DBSERVER" "$i" | gzip -c > "${BACKUPDIR}/${DATESTAMP}/${i}-${DATESTAMP}.sql.gz"
 done
fi
  
if [ "$WHAT_TO_BACKUP" ]; then
    tar -zcf ${BACKUPDIR}/${DATESTAMP}/backup-${DATESTAMP}.tgz $WHAT_TO_BACKUP;
fi
  
if [ "$TYPE" = "FTP" ]; then
    lftp -u "${USERNAME},${PASSWORD}" "$SERVER" -e "set ftp:ssl-protect-data true; mkdir -p ${BACKUPDIR}/${DATESTAMP}; mirror --parallel=3 -n -R ${BACKUPDIR}/ ${BACKUPDIR}/; exit" 2>>/var/log/backupscript.log
elif [ "$TYPE" = "SFTP" ]; then
    rsync --rsh="sshpass -p $PASSWORD ssh -p $SFTPD_PORT -o StrictHostKeyChecking=no -l $USERNAME" -r --delete "${BACKUPDIR}/" "${SERVER}:~${BACKUPDIR}/"
fi
  
# prune local backups older than RETENTION days (top-level datestamp directories only)
find "$BACKUPDIR" -mindepth 1 -maxdepth 1 -type d -mtime +"$RETENTION" -exec rm -rf {} \;
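
A side note on the database credentials: DBUSER does not have to be root. A dedicated account with just enough privileges for mysqldump is safer; a minimal sketch, where backupuser and its password are placeholders and the host matches the DBSERVER value above:

# run once as a MySQL admin; these grants cover plain mysqldump runs including views and triggers
mysql -u root -p -e "CREATE USER 'backupuser'@'127.0.0.1' IDENTIFIED BY 'password';
GRANT SELECT, LOCK TABLES, SHOW VIEW, TRIGGER ON *.* TO 'backupuser'@'127.0.0.1';"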

Configuration is straightforward and limited to changing the variables at the top of the script, as explained in the comments. Since the script contains sensitive credentials, it's critical to set restrictive permissions on it:

chmod 700 /root/bin/backup.sh
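
If you'd rather not keep the SFTP password in the script at all, key-based SSH authentication can replace sshpass; a minimal sketch, assuming the offsite storage account permits public-key logins (the key path here is our own choice):

# generate a passphrase-less key and install it on the offsite server
ssh-keygen -t rsa -b 4096 -f /root/.ssh/backup_key -N ""
ssh-copy-id -i /root/.ssh/backup_key.pub -p 22 username@server.com

# then simplify the rsync line in backup.sh to:
# rsync --rsh="ssh -i /root/.ssh/backup_key -p $SFTPD_PORT -l $USERNAME" -r --delete "${BACKUPDIR}/" "${SERVER}:~${BACKUPDIR}/"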

This would be a good time to run the script by hand and check that everything is backed up and uploaded to the remote location correctly.
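
For example, something along these lines, with paths assuming the default configuration above:

/root/bin/backup.sh

# confirm the local copy exists and the archive is readable
ls -lh /backup/$(date +%d%m%y)/
tar -tzf /backup/$(date +%d%m%y)/backup-$(date +%d%m%y).tgz | head

# spot-check the remote copy (SFTP example; prompts for the password unless keys are set up)
ssh -p 22 -l username server.com "ls -lh ~/backup/$(date +%d%m%y)/"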

To ensure automatic execution, the server needs a cron job added as well; below is an example of a daily backup at 3:30 AM:

crontab -e
30 3 * * * /root/bin/backup.sh
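
If you want cron to capture the script's output for troubleshooting, redirecting it to the same log file the FTP branch uses is a simple variation:

30 3 * * * /root/bin/backup.sh >> /var/log/backupscript.log 2>&1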

There are better, more advanced systems that include features like incremental backups, data deduplication, encryption, bare-metal restore, and much more. Most backup solutions are reasonably priced depending on the end user's needs. The main factors to consider when choosing an offsite backup provider are the amount of data to be stored, how long it will be kept, and what recovery time is acceptable. We advise against relying completely on a hosting provider for backups; those are nice to have, but a level of redundancy sufficient to recover from data loss calls for multiple backups in separate physical locations. Keeping a local copy of the data on a separate computer might also be acceptable for some. Nevertheless, always ensure critical data is sufficiently backed up so that you have access to it at all times.
