The Proverbial Second Chance – Part Deux: Backing Up Your Data On A *NIX System
In our previous technical blog The Proverbial Second Chance (LINK HERE), we covered backing up a Windows Server using the built-in Windows Server Backup utility and mentioned that the following month we would look at backup solutions for *NIX systems using built-in commands. I always try to keep my promises, and it is time to perform a backup of my blog anyway, so let's get started! I have broken this blog down into four parts. Part I will cover archiving your web application (but not the database), Part II will cover transferring the backup to your local computer, Part III will cover backing up the database using phpMyAdmin, and Part IV will cover backing up the rest of the filesystem (excluding the web application and database).
Parts I through III will be useful for those who use a web hosting service, or who are migrating a self-hosted web/database application to a new server. Depending on your service provider, you may be able to host whatever web application you please yet not have access to install executables (such as your own backup software). While on the topic of hosting providers and backups: even if backups are part of your service, pay close attention to if and how often they are actually being performed, and what is actually being backed up (is it only a file-level backup, or is the database included as well?). How much data does the provider agree to back up (are you over your limit)? How do they ensure your data is backed up? Most importantly, if you have an e-commerce site, is your customers' personal and transaction data secure? Is it being encrypted, and how is it stored? Don't wait until something has happened to your site. Ask early, and confirm often! Don't feel guilty about it either; make sure they are actually providing the services they are promising.
So now that we have gotten the pre-ramble out of the way, let's get into the nuts and bolts of things. The first and easiest option is to simply compress the files we want to back up and download them. For this we need shell-level access to the service, plus sFTP.
To outline what we will be doing:
- Archiving and compressing our web application directory (WordPress and phpBB) using tar with the gzip option.
- Copying the archive to a thumb drive on my local computer.
- Dumping a copy of the MariaDB databases using phpMyAdmin to my local computer and manually copying the file to the external thumb drive.
- Deleting the archive on the remote web server.
As you can see, this is all manual at this point; in a future blog we will take a look at writing a Bash script that runs monthly through a cron job to automate this process.
Part I – Archiving your application directories.
The first thing we need to do is archive the directories so that we may download them locally. We will use the tar command with the gzip option: this archives the entire directory structure, then compresses the archive using gzip.
First let's go to the root of our web application directory. On my server this is /var/www/html.
So I will ssh into the web server, and at the shell prompt type the following:
$ cd /var/www/html
Under the web root, my web applications are in a subdirectory called hendrb, so I want to create a tar archive of hendrb with all files and subdirectories, compress it into a .gz archive, and place the file in an sftp directory that already exists in my home directory. Even though technically you do NOT need to stop the web server, it is a good idea; this prevents files from being skipped or captured in an inconsistent state because they are open and being written to. See the documentation for your specific OS for stopping the web server, or for preventing connections to your website.
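As one example only, since service names vary between distributions, on a systemd-based system running Apache the stop command might look like this (substitute httpd for apache2 on Red Hat style systems, or whatever service your site runs under):
$ sudo systemctl stop apache2
Once the archive has been created, the same command with start in place of stop brings the site back up.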
To tar the directories, we will enter the following command:
$ tar -zcvf ~/sftp/crweb_backup0519.tar.gz hendrb
Let’s break down the command really quickly.
tar, which originally stood for "Tape ARchive", takes many files and puts them together into a single file (so they may be written to tape, or saved to disk). The options we selected are -zcvf:
- z (runs the file through gzip after it is created, for compression)
- c (creates the archive)
- v (verbose: displays files as they are archived)
- f (file=ARCHIVE: use archive file or device ARCHIVE)
~/sftp/crweb_backup0519.tar.gz is the full path of the archive we are creating; without it, the archive 'crweb_backup0519.tar.gz' would be created in the directory the tar command is being executed from. hendrb is the name of the directory we want to copy (the location of our web applications). Writing the archive to the same directory as your web application may be fine, depending on how your hosting is set up.
So now, if we do a long listing of the sftp directory in our home directory, we should see the compressed archive we just created.
$ ll -h ~/sftp
total 3.4G
-rw-r--r-- 1 root root 3.4G May 20 03:40 crweb_backup0519.tar.gz
There it is. A nice 3.4G file. Now wait just one minute! You said this was going to be compressed, so why the heck is the file so big?! Well, most of the files in this archive are JPEGs, which are already compressed images, so you can't really compress them any further. In fact, there are times when a compressed file may actually become larger when run through another compression application. Such is life, and there really isn't anything you can do about it.
Part II – Transferring the backup to your local computer using sFTP
So now that we have the backup of our web application files, we need to transfer it to our local computer. To do this we will use an sFTP client. For OS X I am using Fetch as a GUI sFTP client. For Windows you can download and use FileZilla.
Web hosting systems are set up differently, so I have no idea what your default directory will be. Connecting to my server, I am placed in the root of my home directory. You can see the sftp directory that I have created, and if I double-click the directory I can see the backup that I created.
I can now just drag this file to my desktop, or directly to the USB drive or network share of my choice.
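If you prefer the command line over a GUI client, the sftp utility that ships with OpenSSH will do the same job. The hostname and username below are placeholders for your own:
$ sftp myuser@mywebserver.example.com
sftp> get sftp/crweb_backup0519.tar.gz
sftp> bye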
Once the copy is complete, I can go back to my web server and delete the backup.
I will change to the sftp directory in my home directory and do a short listing to make sure I am in the correct directory and the file is there.
$ cd ~/sftp
$ ls
crweb_backup0519.tar.gz
And we can see that it is.
I will then use sudo to delete the file. (Since the archive was created using sudo, it belongs to root, which means I need to execute the rm command with sudo even though it sits in a directory I have full privileges to.)
I simply enter the following command.
$ sudo rm crweb_backup0519.tar.gz
We are now done backing up the file structure of our websites; however, there is still one piece missing: the actual databases that contain the bread and butter of both of our web applications. Without them, the physical files are useless! In the next section we will go over how to back up the databases to our local computer easily using phpMyAdmin.
Part III – Backing Up Your Database Using phpMyAdmin.
The final step in this process is to back up your database. Most hosted sites provide phpMyAdmin for database administration; if you self-host, this is a utility you will definitely want to install. If your website is hosted, refer to your host's instructions for accessing phpMyAdmin. If you self-host, go to the URL and port where you have it configured. At the login prompt, log in with the root or admin account you have configured for the database.
Once you successfully log in, you will see this screen.
Click on the Export tab.
Select the 'Custom – display all possible options' button.
You can choose to export all your databases into a single file, or back them up one at a time. Since I backed up my entire web application directory in the previous step I will leave both databases selected.
Under Output:
Save output to a file
Under Format:
Make sure SQL is selected
Under Add statements:
Make sure 'Add DROP TABLE / VIEW / PROCEDURE / FUNCTION / EVENT / TRIGGER statement' is selected.
Under Data creation options
Leave as is and click the ‘Go’ button.
This will download an export of your database(s) to your local computer; copy this file to the same device your file backup was saved to. In this case it is named localhost.sql. You can rename it to whatever you want at this point.
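As an aside, if you self-host and would rather stay at the shell, the mysqldump utility that ships with MariaDB and MySQL produces an equivalent export. A minimal sketch, assuming a database named wordpress and saving the dump into the same sftp directory used earlier:
$ mysqldump -u root -p wordpress > ~/sftp/wordpress0519.sql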
If your site is hosted, you have now backed up both the file structure of your website and the SQL database, and you are done at this point. If you self-host and do not back up your operating system, you may want to read Part IV, backing up your Linux OS from the command line, unless you have no problem rebuilding your Linux system from scratch in the event of a catastrophic failure.
Part IV – Backing up your Linux Operating System manually from the command line, excluding your web application directory and database.
In the previous three parts, we looked at backing up your web application directory and database using the built-in tar (tape archive) command and phpMyAdmin, both utilities you should have access to on a hosted or self-hosted website. If you self-host, there is a third piece to this trinity that is still not backed up: unless you use other methods, your operating system is still unprotected. For this demonstration we will use the built-in rsync (remote sync) command while excluding the web application directory and the SQL databases, since we have already backed them up using other methods.
First we will create a directory in the root of the filesystem called bckup, and exclude it from the backup as well. This is IMPORTANT! Failure to exclude the destination directory from the command will cause an infinite loop in the backup. Once the backup is complete, we will once again transfer the file to a thumb drive on our workstation for permanent storage.
Step 1 – Create the /bckup directory
At your shell prompt type:
$ sudo mkdir /bckup
Now let's change the permissions so only the owner (root) has read, write, and execute permissions. Later, after the backup is complete, we will change the owner to the account you will be using to transfer the backup via sFTP, but for now:
$ sudo chmod 700 /bckup
Let's verify the directory and its permissions are correct.
$ ll -d /bckup
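On my system the listing looks roughly like this (the timestamp will differ on yours):
drwx------ 2 root root 4096 May 20 04:00 /bckup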
It looks good to me! So now let's back up our OS!
For the exclusions, we already know where the web application resides (/var/www); however, we do not know where MariaDB is keeping the database files. To locate these we must look in the /etc/my.cnf file for the datadir variable. We can do this using the grep command.
At your shell prompt:
$ grep datadir /etc/my.cnf
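On a stock MariaDB installation the result is typically a single line similar to this (your path may differ):
datadir=/var/lib/mysql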
You can see from the output that our databases are stored in /var/lib/mysql. Make a note of this!
Here is a list of the directories we want to exclude:
- /bckup (PLEASE DO NOT FORGET THIS ONE!)
- /dev/* (this excludes anything in the /dev directory: raw device files)
- /proc (these are your system processes)
- /sys/
- /tmp/
- /mnt
- /media
- /lost+found
- /var/lib/mysql
- /var/www/
The command will look like this.
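Your exact command will depend on your own paths, but here is a minimal sketch, assuming we are syncing from the root of the filesystem into /bckup with the exclusions listed above (the -aAX options preserve permissions, ACLs, and extended attributes):
$ sudo rsync -aAXv / /bckup --exclude={"/bckup","/dev/*","/proc","/sys/","/tmp/","/mnt","/media","/lost+found","/var/lib/mysql","/var/www/"}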
I would perform a 'dry run' first by adding the -n (--dry-run) option to the command, to make sure we have enough free space left before anything is actually copied.
I ended up having to type the entire command out; for some reason it was not working when I built the command in a text editor and copied and pasted it into the shell.
I would highly recommend that you keep an eye on the backup the first time it executes, to make sure your exclusions are working properly. You may also find that you can exclude more directories on your next run.
After the backup completes, you may want to tar and compress the files for easier transfer and storage, depending on where you are keeping your backup files.
You can do this by using:
$ sudo tar -zpcvf ~/sftp/linuxbackup0519.tar.gz /bckup
The -p option is important in the command above, as it preserves file permissions. When you uncompress the archive, be sure to use the --same-owner flag.
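For example, a restore might look something like this, run from wherever you want the bckup directory recreated (the archive name matches the one created above):
$ sudo tar -zxvf linuxbackup0519.tar.gz --same-owner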
As mentioned above, since sudo executes the command as root, we now need to change the owner of the archive to your user account in order to retrieve the file.
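Assuming your login account is called myuser (substitute your own username), that would be:
$ sudo chown myuser ~/sftp/linuxbackup0519.tar.gz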
Now simply use your sftp client to download your OS backup to your backup device.
You now have three backups of your Linux server: a file backup of your web applications, a dump of your databases, and a backup of your Linux OS. Let's quickly summarize what we did before wrapping up this blog.
In Part I we created an archive of our web application directories. These contain the file structure and PHP files needed to run our web applications, as well as the physical content that makes them up (media files for the blog, etc.). This, along with the database dump, can be used to restore the web applications in the case of directory corruption, accidental deletion, a failed application upgrade, or migration to another site if you are hosting externally.
In Part II we covered how to retrieve our backups using an sFTP client such as Fetch for Mac OS X, or FileZilla for Windows.
In Part III we covered backing up the MySQL or MariaDB database using phpMyAdmin.
In Part IV we covered backing up the Linux OS that our applications and databases run on, for those of us who self-host our sites.
I highly recommend you test the above steps on a test server, especially if you are using them on a web server in a corporate environment. Neither I nor anyone connected to Brent's World will be responsible for any loss of data resulting from performing the steps outlined in this blog. There are much better ways of backing up a *NIX server than those outlined here; these instructions are meant as a quick and dirty backup method if you do not already have another solution in place.
Thank you for visiting Brent’s World! Please come back next month for more exciting technical content! If you wish to be notified when new content is posted, please consider registering by clicking here!