Automatic backup on Linux webserver (part 1)

For the past two years or so, I’ve been using a Linux server from Virmach to host my websites. A VPS like this is a bit more difficult to maintain than the web server from a regular hosting company, but it’s much more configurable. You can also use it for a multitude of other purposes: setting up a VPN, running Pi-hole, remote development, learning Linux, you name it.

However, maintaining your own server can be quite a hassle, for example when backing up your webserver data. In cPanel, you can simply use the Backup menu and you’re done in no time. But it’s not that easy (yet) with a VPS.

This time, in part 1, we will learn how to back up the webserver files and databases using the command line and schedule the backups via a cronjob. I’m using Debian 9, Apache 2, and a MySQL database. In part 2, we’ll use another computer to store the backups elsewhere.

Steps to take:

  1. Backup files
  2. Backup database
  3. Automatic schedule via cronjob

Backup files

Backing up your webserver files can be done easily with a single tar command. We’ll also compress the result with gzip.

Basically, the command looks like this:
sudo tar -zvcf outputFile.tar.gz inputFolder

Here -c creates a new archive, -z compresses it with gzip, -v prints each file as it’s processed, and -f specifies the output filename.

And since I want the date automatically prepended to the filename, I can use this format to replace the outputFile.tar.gz above: $(date '+%Y-%m-%d_backup-files.tar.gz')
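For example, if the command ran on 17 May 2020 (a made-up date), the substitution would expand like this:

$ date '+%Y-%m-%d_backup-files.tar.gz'
2020-05-17_backup-files.tar.gz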

So in my case, I can run this to back up the whole /var/www/html folder, with today’s date prepended to the resulting file:
sudo tar -zvcf $(date '+%Y-%m-%d_backup-files.tar.gz') /var/www/html

It works fine, but there is an inconvenient side effect to this approach. When you open the archive, it contains the whole folder structure by default (var > www > html) instead of only the html folder. To circumvent this, we can use the -C option to change the directory first, so the folder part becomes -C /var/www html . This option tells tar to change into /var/www first and then archive the html folder from there.
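Putting it all together, the full backup command becomes:

sudo tar -zvcf $(date '+%Y-%m-%d_backup-files.tar.gz') -C /var/www html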

Let’s put this command into a file called backup-files.sh, and don’t forget to make it executable by running $ chmod +x backup-files.sh in your terminal. Later, you’ll only need to run ./backup-files.sh whenever you want to back up all the files in the www directory.
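For reference, here’s a minimal sketch of what backup-files.sh could contain (the /home/you/backups destination folder is a hypothetical example, so adjust it to your own setup, and run the script with sufficient privileges, e.g. via sudo):

#!/bin/bash
# backup-files.sh -- archive the webserver files with today's date in the name.
# Change into /var/www first (-C) so the archive contains only the html folder.
tar -zvcf "/home/you/backups/$(date '+%Y-%m-%d_backup-files.tar.gz')" -C /var/www html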

Backup database

To back up your MySQL databases using the command line, you just need this command:

mysqldump -u yourUsername -pYourPassword --all-databases | gzip > outputFile.sql.gz

The command above is pretty straightforward. You just need to replace yourUsername with your database username, YourPassword with your database password, and outputFile.sql.gz with the actual name. Again, let’s replace outputFile.sql.gz with the date when the backup is executed: $(date '+%Y-%m-%d_backup-db.sql.gz'). In the command above we also pipe the mysqldump result through gzip to compress the output.
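With the date substitution in place, the full command (same placeholder credentials as above) becomes:

mysqldump -u yourUsername -pYourPassword --all-databases | gzip > $(date '+%Y-%m-%d_backup-db.sql.gz')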

Again, let’s put this command into a file, this time called backup-db.sh, and don’t forget to make it executable by running $ chmod +x backup-db.sh in your terminal.
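A minimal sketch of what backup-db.sh could contain (same placeholder credentials as above; the /home/you/backups folder is again a hypothetical destination):

#!/bin/bash
# backup-db.sh -- dump all MySQL databases and gzip the result with today's date.
# Replace yourUsername/YourPassword with your real credentials.
mysqldump -u yourUsername -pYourPassword --all-databases | gzip > "/home/you/backups/$(date '+%Y-%m-%d_backup-db.sql.gz')"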

Setting up a cronjob

Great! So far you have two scripts, each of which backs up your files or your database with a single command. Pretty neat, right?

But what if you’re lazy like me? Having to log in to the machine and execute a file is quite inconvenient. It’d be much more convenient if the backups ran automatically at a regular interval. Setting up a cron job is perfect for just this.

First of all, we need to create or edit a crontab (cron table). Simply run crontab -e in your terminal.

Then you will be able to edit the crontab file. Please consult the crontab documentation or use crontab guru to determine the interval. In my case, I’d like to run the database backup more often than the file backup, so I’m setting the cronjob to back up my files every Sunday at 1 AM:

0 1 * * 0 /path/to/backup-files.sh

For backing up the database, I will set it to run every 3 days (at midnight on every third day of the month):

0 0 */3 * * /path/to/backup-db.sh
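For reference, here’s how the two entries might look together in the crontab (the five fields are minute, hour, day of month, month, and day of week; /path/to is a placeholder for wherever you saved the scripts):

# m h dom mon dow  command
0 1 * * 0 /path/to/backup-files.sh
0 0 */3 * * /path/to/backup-db.sh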

Done!

Congratulations! Now you have an automated backup system that makes sure your server is backed up at all times. If you’re satisfied with this approach, just remember to download the backups manually. Since I’m a lazy person, I won’t remember to download them manually, so in the next part we will set up another computer to download them at a regular interval too (and make it more secure).
