Automated site and database backups using PHP and cron

Using cron jobs to perform maintenance tasks becomes much more inviting once you realize you don’t have to go out and learn Perl to make use of them. Any server with PHP installed can run arbitrary PHP scripts you throw its way, and with the help of powerful functions like exec() and passthru(), just about any administrative maintenance task can be carried out easily, including whole-site and database backups.

I’ve been a nut about security and backups ever since a WordPress blog I worked on got hacked a few years ago, and we decided to shut it down rather than go through the process of standing it back up without any reasonably recent backups to use. Since then, backing up after major changes has always been a must, but I wanted to automate backups of both site files and databases on a nightly basis, just in case. This was a rather lofty goal on a relatively limited shared hosting plan, but lucky for me HostGator offers access to cron jobs with very few limitations.

The Backup

The backups are performed by running command-line operations from the PHP function exec(). Simply pass this function a string, and it will execute that string as a command-line operation on the local machine. The PHP surrounding that call is mainly used for organization, and for creating a datestamp to distinguish each backup from the others.
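As a quick illustration (a minimal sketch of my own, not code from the original article), exec() takes the command string and can optionally collect the command’s output:

```php
<?php
// exec() runs its first argument as a shell command on the local machine.
// The optional second argument is filled with the command's output lines.
exec('echo hello', $output);
// $output now holds ['hello'].
```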

The database is backed up using the mysqldump command, which dumps an entire database and all of its tables to a single .sql file. The output of mysqldump is then piped into a gzip command to compress the file.

The backup of the site files is performed by grouping the site’s root directory into a compressed tarball using the tar command, with the -z flag set to apply gzip compression to the resulting file. You could use the -j flag instead for bzip2 compression, which takes longer but produces a smaller file. I noticed that the operation was already taking up to 5 minutes to complete using gzip, so I decided to stick with that instead of attempting bzip2.

The Setup

There’s only one important piece of setup in this code, and that’s generating the datestamp to be used in both file names. This is a single line of code in which we store the current date as a string.
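The original snippet isn’t reproduced here, so this is my reconstruction of that single line, assuming PHP’s date() function with a Y-m-d format:

```php
<?php
// Datestamp shared by both backup file names, e.g. "2014-06-09".
$date = date('Y-m-d');
```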

The Database

In order to use mysqldump to create a database backup, we have to pass it the credentials of a user who has access to the database, so the first thing we need to do is declare our database variables.

Once that’s prepared, we just need to select the location and file name we want for the backup file, and run the mysqldump command.
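Since the original code isn’t shown here, the following is a hedged sketch of those two steps; the credential values and file paths are placeholders you would substitute with your own:

```php
<?php
// Database credentials (placeholders -- substitute your own).
$dbUser = 'backup_user';
$dbPass = 'secret_password';
$dbName = 'my_database';

// Destination for the compressed dump, stamped with today's date.
$date = date('Y-m-d');
$backupFile = "/home/user/backups/db-backup-{$date}.sql.gz";

// Dump the whole database and pipe the output straight into gzip.
// Note: no space between -p and the single-quoted password.
$command = "mysqldump --user={$dbUser} -p'{$dbPass}' {$dbName}"
         . " | gzip > {$backupFile}";
exec($command);
```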

Note that there is no space between the -p flag and the single-quoted password that follows it. The single quotes are only strictly necessary if the password contains special characters, but there’s no harm in leaving them in regardless.

The Files

Backing up the site files is even easier, as you don’t need to worry about credentials for anything. In three steps we simply select the directory that we want to make a backup of, decide where we want to save that backup, and execute the tar command to create it.
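Again the original listing isn’t included here, so this is a sketch under assumed paths (both directory names are hypothetical):

```php
<?php
// Directory to archive, and where to put the tarball (hypothetical paths).
$date = date('Y-m-d');
$siteDir = '/home/user/public_html';
$backupFile = "/home/user/backups/site-backup-{$date}.tar.gz";

// -c create an archive, -z compress with gzip, -f write to the named file.
// Swap -z for -j to get bzip2 compression instead.
$command = "tar -czf {$backupFile} {$siteDir}";
exec($command);
```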

Now just save these in a PHP file, and create a cron job that runs that file using the php command at regular intervals.
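For example, a crontab entry that runs the script nightly at 2 a.m. might look like this (the script path is hypothetical; on a shared host like HostGator you’d typically add this through cPanel’s cron interface rather than editing the crontab directly):

```
0 2 * * * php /home/user/scripts/backup.php
```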

Automating Cleanup

Once I got all of this working, I realized that I might get into some trouble with my hosting company if I let these backups accumulate on their servers and start taking up an unreasonable amount of space. To combat this issue, I decided to use a similar system to delete old backup files from the server.

Of course, I didn’t want to lose those old backups completely, so I decided to send them to my home computer which was already set up as an FTP server with dynamic DNS service from no-ip.org. I didn’t want to run into the same lack of hard drive space issues at home, so I decided that only one backup per week should be sent to my home computer.

To do this I decided to run a check every day for backup items more than 7 days old. If a file met that criterion, I then checked whether it had been created on a Monday (chosen arbitrarily, use whatever day you want), and if it was, I initiated an FTP transfer to move it to my home computer. Once that check/transfer was complete, the file would be removed from my hosting server to save space.

The Setup

This operation is significantly more complex than the backups, and requires some preparation before it can really get started. We first need the current date as a Unix timestamp, so we can compare it against the backup files to determine their age. Then we get the collection of backup file names as an array by calling scandir() on the directory that contains them. Finally, we prepare the connection details and credentials for the FTP server we’ll be sending files to.
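A minimal sketch of that setup, with the directory path, hostname, and credentials all being placeholders of my own:

```php
<?php
// Current time as a Unix timestamp, for judging each backup's age.
$now = time();

// Directory holding the backups (hypothetical path). Guarding with
// is_dir() keeps the script from warning if the directory is missing.
$backupDir = '/home/user/backups';
$files = is_dir($backupDir) ? scandir($backupDir) : [];

// FTP details for the home machine (placeholders -- use your own).
$ftpHost = 'myhome.no-ip.org';
$ftpUser = 'ftp_user';
$ftpPass = 'ftp_password';
```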

The Execution

Now we’ll loop through each file name in the $files array and perform the required checks. The first check is to make sure the item we’re dealing with is actually a file and not a directory. This is extremely important in a Unix-like environment, because every directory, even one containing only files, also holds two hidden entries named . and .., which represent the current directory and its parent directory respectively.

Once inside the loop, we use the filemtime() function to determine the last modified time of each file and store it as a Unix timestamp. We can then use that timestamp to determine the day of the week on which the file was saved, and also its age relative to the timestamp we created at the beginning of the setup. If the file is older than one week (604,800 seconds), we move into the next stage of the operation.

Now all that’s left to do is determine whether or not the file was created on a day that means it should be saved (in this case a Monday), and if so we use PHP’s FTP API to send the file to our FTP server. Once that check has been completed, and the file has been sent if necessary, we use the Unix rm command to delete the file from the local server.
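The loop described above could look something like this; it assumes the setup variables from earlier, and the host and credentials are again placeholders:

```php
<?php
// Assumes the same setup as before; paths and credentials are placeholders.
$now = time();
$backupDir = '/home/user/backups';
$files = is_dir($backupDir) ? scandir($backupDir) : [];

foreach ($files as $file) {
    $path = $backupDir . '/' . $file;

    // Skip "." and ".." and anything else that isn't a regular file.
    if (!is_file($path)) {
        continue;
    }

    // Only act on backups older than one week (604,800 seconds).
    $modified = filemtime($path);
    if ($now - $modified <= 604800) {
        continue;
    }

    // Keep one backup per week: if it was made on a Monday, ship it home first.
    if (date('N', $modified) === '1') {
        $conn = ftp_connect('myhome.no-ip.org'); // hypothetical host
        if ($conn && ftp_login($conn, 'ftp_user', 'ftp_password')) {
            ftp_put($conn, basename($path), $path, FTP_BINARY);
            ftp_close($conn);
        }
    }

    // Either way, remove the stale copy from the hosting server.
    exec('rm ' . escapeshellarg($path));
}
```

Using is_file() also covers the . and .. entries mentioned above, since neither is a regular file.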

Now save this as a PHP file and run it with a cron job, just as the backup script was. These two scripts together form a fully automated, self-cleaning backup system synced across two different machines.

