Data backup and restore policies are critical to any business operation. Some hosting providers offer a complimentary emergency backup service for their clients, or offer backup storage at additional cost. This is great to have in an emergency, as such backups can usually be restored in the shortest time. However, they are often (though not always) kept in the same physical location as the servers themselves, providing no physical redundancy. Complimentary backups sometimes require additional configuration, so don't assume that because the web host's offer or your invoice says backups are included, you have all the backups you need.

Hosting companies' emergency backups are usually taken infrequently; once a week seems to be the norm, which is a problem when the latest backup available for restore is six days old. These emergency backups often operate on large data sets or entire virtual disk images, which makes it hard to capture a consistent point-in-time backup without causing service disruption on the server, and calls their consistency and reliability into question, assuming a restore is even possible in the first place. If your server is terminated over a billing or abuse issue, chances are the backups are terminated as well. If a fire or natural disaster strikes the data center, if the hosting account or company is compromised and all your server data and backups are gone, or if everything is hit with ransomware, will you be able to recover and bring things back online? Do you think relying on the hosting provider's backups is sufficient?
We strongly encourage setting up and testing proper backup and recovery policies. One of the simplest solutions for reliable, automated backups with redundancy, which anyone can implement in minutes, is to periodically back up and sync data to an FTP/SFTP account in a separate location. A low-budget option is to use another server you already have elsewhere with some spare space, or to purchase FTP backup storage from one of the many providers out there. The advantage of this approach is its simplicity and availability: you can access your data from anywhere at any time, and you don't need client/agent software or a license, as many incremental backup solutions do, in order to access the data when you need to restore onto a new server. The disadvantage of FTP backups is that they consume more network bandwidth than incremental backup solutions, so the SFTP option is recommended: it maximizes bandwidth efficiency through rsync's more efficient synchronization process.

Many hosting control panels, cPanel for example, already have intuitive built-in options for automated FTP/SFTP backups. All you need to do is enable them, set the credentials and server for remote backup storage, pick a schedule and what to back up, and you're done in seconds. If you're not running a control panel, it's just as quick to set up with a little help from the simple, configurable FTP backup shell script we include below. It stores the backup locally on the server (for the fastest restore, when that is still possible) and syncs it to the offsite storage location; it can easily be tuned to do only one of the two. Other budget storage options, such as S3 or Dropbox, could also be used instead of FTP/SFTP.

The script relies on OS packages to perform database backups, compression, and FTP or SFTP sync, so we need to ensure those are installed for it to work correctly, keeping in mind that lftp is only needed for the FTP storage type, and sshpass and rsync for the SFTP storage type. The script will run on any major Linux/BSD distribution.
CentOS/RedHat
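For example, on CentOS/RedHat the packages can be installed with yum (sshpass ships in the EPEL repository):

```
yum install epel-release
yum install lftp sshpass rsync
```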
Debian/Ubuntu
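And on Debian/Ubuntu with apt-get:

```
apt-get update
apt-get install lftp sshpass rsync
```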
The backup script needs to be placed on the server with the correct permissions first and configured for operation. It contains sensitive information, such as user credentials, and needs root-level access to run so it can read every file it backs up, thus /root/bin/backup.sh might be a good choice.
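Assuming you have saved the script as backup.sh in your current directory, placing it could look like this:

```
mkdir -p /root/bin
cp backup.sh /root/bin/backup.sh
```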
backup.sh - Offsite FTP/SFTP backup script
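Below is a minimal sketch of what such a script can look like. The hosts, credentials, and paths are placeholders you must change; it assumes MySQL for the database dumps, tar/gzip for compression, and lftp or sshpass+rsync for the sync, as described above:

```
#!/bin/sh
# backup.sh - simple local + offsite FTP/SFTP backup (illustrative sketch)
# NOTE: all hosts, credentials, and paths below are placeholders;
# adjust everything in the configuration section to your environment.

##### Configuration ##############################################

STORAGE_TYPE="SFTP"                # "FTP" (needs lftp) or "SFTP" (needs sshpass + rsync)

REMOTE_HOST="backup.example.com"   # offsite storage server
REMOTE_USER="backupuser"           # remote account username
REMOTE_PASS="changeme"             # remote account password
REMOTE_DIR="/backups"              # target directory on the remote server

LOCAL_DIR="/var/backups/offsite"   # local backup/staging directory
KEEP_DAYS="7"                      # prune local backups older than this

BACKUP_PATHS="/etc /home /var/www" # space-separated directories to back up

MYSQL_BACKUP="yes"                 # "no" to skip the database dump
MYSQL_USER="root"
MYSQL_PASS="changeme"

##################################################################

DATE=$(date +%Y-%m-%d)
mkdir -p "$LOCAL_DIR" || exit 1

# 1. Dump all MySQL databases into a compressed file.
if [ "$MYSQL_BACKUP" = "yes" ]; then
    mysqldump -u "$MYSQL_USER" -p"$MYSQL_PASS" --all-databases \
        | gzip > "$LOCAL_DIR/mysql-$DATE.sql.gz"
fi

# 2. Archive and compress the configured directories.
#    $BACKUP_PATHS is deliberately unquoted so it splits into arguments.
tar -czf "$LOCAL_DIR/files-$DATE.tar.gz" $BACKUP_PATHS

# 3. Prune local backups older than KEEP_DAYS days.
find "$LOCAL_DIR" -type f -mtime +"$KEEP_DAYS" -delete

# 4. Sync the local backup directory offsite.
#    Remove this step for local-only backups, or steps 1-3 to only
#    sync existing data offsite.
case "$STORAGE_TYPE" in
    FTP)
        # Mirror LOCAL_DIR to REMOTE_DIR over plain FTP with lftp.
        lftp -u "$REMOTE_USER,$REMOTE_PASS" \
            -e "mirror -R --delete $LOCAL_DIR $REMOTE_DIR; quit" \
            "$REMOTE_HOST"
        ;;
    SFTP)
        # rsync over SSH transfers only what changed; sshpass supplies
        # the password. Run "ssh backupuser@host" once beforehand so
        # the host key is already in known_hosts.
        sshpass -p "$REMOTE_PASS" rsync -az --delete \
            "$LOCAL_DIR/" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR/"
        ;;
esac
```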
Configuration is straightforward and is limited to changing the configuration variables in the script, as explained in its comments. The script contains sensitive information, so it is critical to ensure correct permissions on it.
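For example, making the script readable and executable by root only:

```
chown root /root/bin/backup.sh
chmod 700 /root/bin/backup.sh
```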
This would be a good time to run the script by hand and check that everything is being backed up and uploaded to the remote location correctly.
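For example, using the placeholder paths and hosts from the sketch above:

```
# run the backup once manually
/root/bin/backup.sh

# check the local copy
ls -lh /var/backups/offsite

# spot-check the remote copy, e.g. over SFTP
sftp backupuser@backup.example.com
```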
To ensure automatic execution, the server needs a cron job added as well; below is an example of a daily backup at 3:30 AM:
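Assuming the script lives at /root/bin/backup.sh, add the following line with crontab -e as root (the log path is only a suggestion):

```
30 3 * * * /root/bin/backup.sh > /var/log/backup.log 2>&1
```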
There are better, more advanced systems that include features like incremental backups, data deduplication, encryption, bare-metal restore, and much more. Most backup solutions are reasonably priced depending on the end user's needs. The main factors to consider when choosing an offsite backup provider are the amount of data to be stored, how long it will be kept, and what recovery time is acceptable. We advise against relying completely on your hosting provider for backups; those are nice to have, but a level of redundancy sufficient to recover from data loss calls for multiple backups in separate physical locations. Keeping a local copy of the data on a separate computer might also be acceptable for some. Nevertheless, always ensure sufficient backups of critical data so that you have access to it at all times.