I recently had a Vanilla user report that their server was compromised and all of their discussions and comments were lost when the hacker got into their MySQL databases. Luckily Vanilla was not the source of the exploit that gave the hacker access to the server (*sigh of relief*), but it got me thinking about how people should back up their databases regularly. It is an unpleasant job if you have to do it manually, so a few years ago I wrote a couple of scripts that do the job for me, and I figured I should share.
Here’s how the process works:
- Mysqldump the databases you want backed up
- Compress the backup files so they are smaller for transporting off the server
- Download the files to a remote location
- Remove the files from the server so they don’t take up space
Now, you may not need to follow the last two steps. If you already have a service on your hosting provider that backs up particular folders, you can just keep mysqldumping your database files through a cron job and then tell your host to back up the folder that they are sitting in. When I wrote my scripts, I didn’t have a host that does this. Now that I do, I still use my script to download copies of my backups because I’m just that paranoid.
Steps 1, 2, and 4 will be performed through a cron job on the server (I’m on a Linux server). Step 3 will be performed by a .bat file on my local computer (run through Windows’ Scheduled Tasks). If you don’t run a Windows machine at home, that’s fine. You can do step 3 quite easily on a Mac by using cron and a shell script similar to the one used for steps 1, 2, and 4.
Here is my shell script that sits on the server and is executed by cron every 24 hours:
#!/bin/sh
# Define a variable to hold the current date in a string (this will be appended to file names)
DATE=`date -u +%F`

# First get rid of any existing backups
rm /absolute/path/to/the/dump/folder/*.sql.gz

# Now back up the database(s)
mysqldump -ce --user=your_db_user --password=your_password db_name > "/absolute/path/to/the/dump/folder/db_name-${DATE}.sql"

# GZip the database(s)
gzip /absolute/path/to/the/dump/folder/*.sql
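Of course, a backup is only useful if you can restore it. I haven’t had to yet (knock on wood), but restoring one of these dumps should look something like this (the date in the filename is just an example):

# Unzip the dump and pipe it straight back into MySQL
gunzip -c /absolute/path/to/the/dump/folder/db_name-2009-03-31.sql.gz | mysql --user=your_db_user --password=your_password db_name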
I personally like to have things all contained in one place. So, I created a file called mysql_backup (no file extension; ain’t Linux fun?) containing the script above and placed it in the folder where I wanted my database backups dumped. Then I made the file executable. To be perfectly clear, my file is executable, owned by root, and is in the root group. Here is how the file shows up when I run “ls -l” in my dump folder:
-rwxr-xr-x 1 root root 1389 Mar 31 10:35 mysql_backup
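If you need to reproduce that ownership and permission set, two commands along these lines should do it (run as root from inside the dump folder):

# Give the file to root, then make it readable/executable by everyone and writable only by root
chown root:root mysql_backup
chmod 755 mysql_backup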
Next I needed to make cron aware of my shell script and make sure that it is run once a day. I did this by creating another shell script (again, owned by root and executable) and placing it in the appropriate cron folder. In my case, I put it here: /etc/cron.daily/mysql_dump
Here is what my cron shell script contained:
#!/bin/sh
/absolute/path/to/the/dump/folder/mysql_backup
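A side note: if your distribution doesn’t have an /etc/cron.daily folder, a regular crontab entry accomplishes the same thing. Running “crontab -e” as root and adding a line like this should be equivalent (this one fires at 3am):

# Run the database backup script every day at 3am
0 3 * * * /absolute/path/to/the/dump/folder/mysql_backup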
OK, so now I’ve got a cron script that backs up my database(s) every 24 hours and erases any old ones from the day before. But how do I get those files down to my local machine? After all, this is pretty useless unless the files are copied somewhere *other* than the server.
I use Windows primarily in my day-to-day tasks because most of my client-based work is done with Microsoft languages. So, I needed a quick and easy way to connect to my server (which does not allow FTP connections) and download the files. Since I can only connect to my server over SSH, I found a neat little tool called psftp.exe, which you can read about and download on the PuTTY site. It is a tiny application made by the team that brought you PuTTY (which you are most likely familiar with if you’ve ever had to connect to a Linux server from a Windows machine), and it lets you run simple file-transfer commands on your server from a Windows command line. There is plenty of documentation out there for it, but it’s all very un-css’d and difficult to follow. Setting this up took a lot of trial and error for me, so hopefully you won’t have to deal with any of that if you just copy what I did.
PSFTP works really well if you create a file with all of the commands you want to run. So, I created a file called download.scp and placed it in the folder on my Windows machine where I wanted my backups to download to. I keep all backup-related files (including the backups themselves) in: C:/Backup/Lussumo Server/
My C:/Backup/Lussumo Server/download.scp file contained this:
mget /absolute/path/to/the/dump/folder/*.sql.gz
quit
mget is a command that tells psftp to grab and download multiple files. I specify the absolute path where I want the files to come from and use a wildcard in the filename to make sure that I get all of the backed-up db files. Then I issue a quit command so that psftp.exe knows to disconnect from the server and shut itself down when it’s finished.
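One optional addition: psftp also understands an lcd command for changing the local download directory. My batch file runs from the backup folder anyway, so I don’t need it, but if you ever launch things from somewhere else, a download.scp along these lines should keep the files landing in the right place:

lcd "C:\Backup\Lussumo Server"
mget /absolute/path/to/the/dump/folder/*.sql.gz
quit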
So, now that psftp knows what to do once it’s connected to the server, I need a batch file that runs psftp, connects it to the server, and calls my download.scp file. I created another new file called C:/Backup/Lussumo Server/backup.bat
In case you didn’t know, files in Windows that end with .bat are called “batch files”, and they are basically a sequence of command-line statements that run when the file is executed. So, if you had a complex command-line statement that you didn’t want to write out again and again, you could just put it into a batch file and then double-click the file whenever you wanted to perform that action. In this case, my batch file contained:
psftp -pw my_password my_username@my_domain.com -b "download.scp" -P 22
This command executes the psftp.exe application. It supplies my password, my username, and the domain of the server I want to connect to, tells psftp to run the commands contained in download.scp once it’s connected, and performs all of these actions through port 22 (the standard SSH port).
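One gotcha worth mentioning: the first time any of the PuTTY tools connects to a server, it asks whether to cache the server’s host key, and that prompt will hang an unattended run. Connect once manually to get the key cached before you schedule anything. You can also add psftp’s -batch flag, which disables interactive prompts entirely so a scheduled run fails fast instead of sitting there waiting for input; combined with a log file, the batch file could look something like this:

rem Disable interactive prompts and append psftp's output to a log for troubleshooting
psftp -batch -pw my_password my_username@my_domain.com -b "download.scp" -P 22 >> backup.log 2>&1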
Now I can test my work by connecting to my server and running my /absolute/path/to/the/dump/folder/mysql_backup shell script to make sure that the gzipped db files get created, and then double-clicking my batch file in Windows to make sure that it connects and downloads the backup files to my local computer.
The final step (in Windows) is to go to Start > Control Panel > Scheduled Tasks and double-click “Add a Scheduled Task”. Follow the prompts of the wizard: when it asks what program you want to run, click Browse and find your backup.bat file, then define how often you want the batch file to run. For my example, I have it run every day at 3am.
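Alternatively, if your version of Windows ships with the schtasks utility, you can create the same task from the command line. Something along these lines should work (the exact switches vary a little between Windows versions, so treat this as a sketch):

rem Create a daily scheduled task that runs the backup batch file at 3am
schtasks /create /tn "Database Backup" /tr "\"C:\Backup\Lussumo Server\backup.bat\"" /sc daily /st 03:00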
That’s it. It’s a VERY low-tech solution to backing up your server’s databases. The only thing you have to watch out for now is that your local backup folder doesn’t get too big. I tend to forget this thing is even running, and slowly but surely my computer fills up with backup files.
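If you’re as forgetful as I am, you can automate that housekeeping too. On versions of Windows that include the forfiles utility (built into Vista and later; older versions got it from the Resource Kit), a line like this at the end of backup.bat should prune anything older than 30 days:

rem Delete local backup files that are more than 30 days old
forfiles /p "C:\Backup\Lussumo Server" /m *.sql.gz /d -30 /c "cmd /c del @path"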