There are many backup tools around and many ways to use them. For example, it is possible to use gzip and ftp to make a local copy of your web site. This approach has a couple of drawbacks: the data are transferred over the internet unencrypted, and we are most likely re-transferring data that we already copied the day before.

To solve the unencrypted transfer problem we can use scp instead of ftp. However, the transfer will take even longer, as scp adds the overhead of creating an encrypted tunnel for the connection. To stop transferring duplicate data we can use rsync. If we combine rsync with ssh, compression, bash and cron, we end up with the ultimate backup tool.

Let's create a simple but powerful backup solution using rsync, ssh, compression and the cron scheduler:

1. Passwordless ssh

At this point we need to set up a passwordless ssh login. This avoids the need to enter a password during the backup, so the whole backup process can run completely unattended. Please follow this tutorial to log in to your server over ssh without a password.
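The key setup usually looks like the following sketch. For safety, this example generates the key pair into a temporary directory so it cannot touch your real ~/.ssh; in practice you would use the default location and push the public key to the server with ssh-copy-id. The user@remote-host login is a placeholder for your own server.

```shell
#!/bin/sh
# Sketch: generate an ssh key pair with an empty passphrase for unattended logins.
# The temporary directory keeps this demo away from your real ~/.ssh.
tmpdir=$(mktemp -d)

# -t ed25519: modern key type; -N "": empty passphrase; -q: quiet
ssh-keygen -t ed25519 -N "" -f "$tmpdir/backup_key" -q

ls -l "$tmpdir/backup_key" "$tmpdir/backup_key.pub"

# In practice, install the public key on the server
# (user@remote-host is a placeholder for your own login):
# ssh-copy-id -i "$tmpdir/backup_key.pub" user@remote-host
```

An empty passphrase is what makes the cron-driven backup possible, at the cost that anyone who obtains the private key file can log in, so keep its permissions tight.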

2. rsync installation

If you have not done so yet, install the rsync tool as the root user.
On Ubuntu or Debian:

# apt-get install rsync

On Fedora, RHEL and CentOS:

# yum install rsync

3. Making a database backup

In case your website uses a database such as MySQL, we first need to make a database backup. Therefore, our backup bash script starts with the following lines:

# create database backup (replace user@remote-host with your server's login)
/usr/bin/ssh user@remote-host \
"mysqldump --password='pass' mydatabase > ~/public_html/mywebsite/mydatabase.sql"
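Note that passing --password on the command line exposes the password to other users on the server via the process list. A common alternative, sketched below, is to store the credentials in a ~/.my.cnf option file on the remote host so mysqldump needs no password argument. The demo writes to a temporary directory; on the real server the file would be $HOME/.my.cnf, and the myuser/pass values are placeholders.

```shell
#!/bin/sh
# Sketch (assumption: run on the remote server): keep MySQL credentials
# in an option file instead of on the mysqldump command line.
# Demo writes to a temp dir; in practice the path is "$HOME/.my.cnf".
cfg="$(mktemp -d)/.my.cnf"

cat > "$cfg" <<'EOF'
[client]
user     = myuser
password = pass
EOF

# Only the owner may read the credentials file
chmod 600 "$cfg"

ls -l "$cfg"
```

With the file in place, the script line shrinks to mysqldump mydatabase > ... and the password never appears in ps output.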

At this point the script will remotely execute the mysqldump command over ssh and store the database backup in the website's root directory. Next, we will add an rsync line to make an exact copy of our remote ~/public_html/mywebsite/ directory:

# create database backup (replace user@remote-host with your server's login)
/usr/bin/ssh user@remote-host \
"mysqldump --password='pass' mydatabase > ~/public_html/mywebsite/mydatabase.sql"
# mirror the remote website directory locally
/usr/bin/rsync -zav -e ssh --delete \
 user@remote-host:~/public_html/mywebsite /backup/local-copy

At this point the script creates a local copy of the remote ~/public_html/mywebsite directory and stores it in /backup/local-copy. The --delete option removes from the local directory any files that no longer exist in the remote source directory, keeping both directories in complete sync. rsync's -z option enables compression during the transfer, -a (archive mode) recurses and preserves permissions and timestamps, and -v gives verbose output.

We are ready to test our new backup script:

$ chmod 700
$ ./

4. Doing backup with crontab

If everything went well we can schedule the backup script to run every day at 02:00 using cron. Open the crontab editor with

$ crontab -e

and add the following line to start the script every day at 2 AM:

00 02 * * * /path/to/
