There are many backup tools around and many ways to use them. For example, it is possible to use gzip and ftp to make a local copy of your web site. This approach has a couple of drawbacks: the data is transferred over the internet unencrypted, and we are most likely transferring data that we already copied the day before.
To solve the unencrypted transfer problem we can use scp instead of ftp. However, the transfer will now take even longer, as scp adds the overhead of creating an encrypted tunnel for the connection. To stop transferring duplicate data we can use rsync. If we combine rsync with ssh, compression, bash and cron, we end up with the ultimate backup tool.
Let's create a simple but powerful backup solution using rsync, ssh, compression and the cron scheduler:
First we need to set up passwordless ssh login. This avoids having to enter a password during the backup, so the whole process can run completely unattended. Please follow this tutorial to log in to your server over ssh without a password.
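The key setup can be sketched as follows. The ed25519 key type and the temporary key path are assumptions for illustration (in practice the default ~/.ssh/id_ed25519 is used), and the ssh-copy-id step is shown as a comment because it contacts the server:

```shell
# Generate a key pair with an empty passphrase (path is an example):
keydir=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -f "$keydir/id_ed25519" -q

# Install the public key on the server (asks for the password one last time):
#   ssh-copy-id -i "$keydir/id_ed25519.pub" firstname.lastname@example.org
# Afterwards, "ssh firstname.lastname@example.org" should not prompt for a password.
```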
If you have not done so yet, install the rsync tool as the root user:
For Ubuntu or Debian, enter:
# apt-get install rsync
And for Fedora, RHEL and CentOS:
# yum install rsync
If your website uses a database such as MySQL, we first need to make a database backup. Therefore, our backup bash script starts with the following lines:
#!/bin/bash

# create database backup
/usr/bin/ssh firstname.lastname@example.org \
    "mysqldump --password='pass' mydatabase > ~/public_html/mywebsite/mydatabase.sql"
At this point the script remotely executes the mysqldump command over ssh and stores the database backup in the website's root directory.

Remote directory backup

Next, we add an rsync line to make an exact copy of our remote ~/public_html/mywebsite/ directory:
#!/bin/bash

# create database backup
/usr/bin/ssh firstname.lastname@example.org \
    "mysqldump --password='pass' mydatabase > ~/public_html/mywebsite/mydatabase.sql"

# mirror the remote website directory
/usr/bin/rsync -zav -e ssh --delete \
    firstname.lastname@example.org:~/public_html/mywebsite /backup/local-copy
At this point the script creates a local copy of the remote ~/public_html/mywebsite directory and stores it in /backup/local-copy. The --delete option removes from the local directory any files that no longer exist in the remote source directory, keeping both directories in complete sync. rsync's -z option compresses the data during transfer.
We are ready to test our new backup script:
$ chmod 700 backupscript.sh
$ ./backupscript.sh
If everything went well, we can schedule the backup script to run every day at 02:00 using cron. Open the crontab editor with
$ crontab -e
and add the following line to start the script every day at 2 AM:
00 02 * * * /path/to/backupscript.sh
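If you also want a record of each run, cron output can be redirected to a log file. The log path below is just an example:

```shell
# m  h  dom mon dow  command
00 02 * * * /path/to/backupscript.sh >> /var/log/backupscript.log 2>&1
```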