HOW I MAKE BACKUPS ON UNIX
If you are a system administrator, one of the most crucial aspects of your job is backups. It is not enough to just create them: you also have to store them properly and test them regularly, i.e. verify that the recovery process completes without issues. Furthermore, you have to consider the scalability of your disaster recovery plan, i.e. how you will back up your whole infrastructure as it grows in size.
There are many ways to accomplish this. A backup can be anything from an external hard drive onto which you manually copy files one by one to a fully automated orchestration service that provides disaster recovery through cloud computing (DRaaS).
RAID is not a backup!
There are also people who consider snapshots, volume management technologies (such as LVM or ZFS) or data mirroring (such as RAID) to be reliable backup solutions. Let's be clear about this: they are not. RAID cannot protect you against human error (e.g., accidental file deletion) or against viruses; in fact, it will efficiently propagate those file changes across the disks of the array. RAID is good for hardware failures, i.e. when a disk breaks down. But if you want to be able to recover a file that has been accidentally deleted, you need a backup.
How to make a backup
A proper backup plan should consider, at least, the following three aspects:
- The 3-2-1 strategy;
- Off-SAN copies;
- Periodic recovery testing.
How I make backups
On UNIX and Windows there are many different programs, both free and paid, to make a backup. In this article, however, I want to show you my solution for backing up servers and workstations. Before giving you the name of the software I wrote, let me digress a little about the requirements I had for my backup process.
My machines are all small-scale virtual private servers (VPS); I don't have a large infrastructure with terabytes of data. Therefore, I was looking for a backup solution that could back up single directories instead of entire operating systems. I wanted something very small and UNIX-compatible that would allow me to perform backups from a cronjob. Furthermore, I needed something that would encrypt the final backup using a secure encryption algorithm (such as AES).
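For instance, those requirements boil down to being able to schedule a single command like the one below. The crontab entry is purely illustrative: the installation path, sources file, destination and password are my assumptions, not part of the project.

```shell
# Illustrative crontab entry: run the backup every night at 02:30.
# Paths and password below are assumptions for the sake of the example.
30 2 * * * /usr/local/bin/backup.sh --backup /etc/backup/sources.bk /var/backups "qwerty1234"
```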
The first prototype of this backup software was a simple shell script where I would copy files using a combination of tar(1) and gpg(1). While this script actually worked, it was hardly scalable: every time I wanted to add a new directory to the backup job, I had to manually modify the source code of the script. It was very inconvenient to use. So I decided to write a real backup script where I could dynamically specify the paths to back up using a configuration file. That's how backup.sh was born.
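A minimal sketch of what such a prototype might have looked like, assuming tar(1) for copying and compressing and gpg(1) for encryption. The demo data, file names, passphrase and gpg flags are my assumptions, not the original script:

```shell
#!/bin/sh
# Prototype sketch: the sources are hard-coded, so adding a directory
# means editing the script itself -- exactly the inconvenience above.
set -eu

# Demo data so this sketch is runnable anywhere (an assumption, not
# part of the original prototype).
mkdir -p demo/nginx demo/ssh
echo "server {}" > demo/nginx/nginx.conf
echo "Port 22"   > demo/ssh/sshd_config

SOURCES="demo/nginx demo/ssh"          # hard-coded backup paths
OUT="backup-$(date +%Y%m%d).tar.gz"

tar -czf "$OUT" $SOURCES               # copy and compress
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "qwerty1234" \
    --symmetric --cipher-algo AES256 \
    -o "$OUT.enc" "$OUT"               # encrypt with AES-256
rm "$OUT"                              # keep only the encrypted copy
```

Every new directory means touching the `SOURCES` line by hand, which is why a configuration file quickly becomes the better design.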
backup.sh is a POSIX-compliant, modular and lightweight backup utility to save and encrypt your data. It is intended to be used on relatively small UNIX systems, such as VPS, personal servers or workstations, to back up single directories.
backup.sh uses a combination of tar(1) and gpg(1) to copy, compress and encrypt your data.
This backup utility works under pretty much any UNIX operating system. I personally use it on both GNU/Linux and FreeBSD servers, but I also have a couple of people who use it on macOS without any issue.
To define the backup sources, backup.sh uses an associative array defined inside a text file of your choice. The syntax to specify a new backup entry is the following:
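As an illustration of the label-to-path shape an entry takes (the array name `backup_paths` and the exact quoting are my assumptions; check the project's README for the authoritative syntax):

```shell
# Illustrative shape of a sources entry: a label mapped to a path.
# The array name "backup_paths" is an assumption.
backup_paths=(
    ["<LABEL>"]="<PATH>"
)
```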
where <LABEL> is the name of the backup and <PATH> is its path. Hence, if you want to back up your nginx and ssh config files, you should create the following entries inside the configuration file:
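Continuing the illustration (the array name and exact quoting are assumptions; the nginx and ssh paths are the standard locations):

```shell
# Two entries: back up the nginx and ssh configuration directories.
backup_paths=(
    ["nginx"]="/etc/nginx"
    ["ssh"]="/etc/ssh"
)
```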
backup.sh will create two folders inside the backup archive with the following names:
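With the entries above, the two folders would presumably be named after the labels:

```shell
nginx/
ssh/
```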
After that, you can start the backup using the following command:
$> sudo ./backup.sh --backup <SOURCES_FILE> <DEST> <ENCRYPTION_PASSWORD>
where <SOURCES_FILE> is the sources file, <DEST> is the absolute path of the backup output (without trailing slashes) and <ENCRYPTION_PASSWORD> is the password used to encrypt the compressed archive. For instance:
$> sudo ./backup.sh --backup sources.bk /home/marco "qwerty1234"
backup.sh will then begin to copy the files defined in the sources file:
Copying nginx (1/2)
Copying ssh (2/2)
Compressing backup...
Encrypting backup...
File name: /home/marco/backup-<HOSTNAME>-<YYYYMMDD>.tar.gz.enc
File size: 7336400696 (6.9G)
File hash: 0e75ca393117f389d9e8edfea7106d98
Elapsed time: 259 seconds.
This will create a new encrypted backup in /home/marco.
backup.sh can also be used to extract the content of an encrypted backup. To do this, use the --extract option with the following syntax:
$> ./backup.sh --extract <ENCRYPTED_ARCHIVE> <ARCHIVE_PASSWORD>
For instance:
$> ./backup.sh --extract backup-<hostname>-<YYYYMMDD>.tar.gz.enc "qwerty1234"
This command will create a new folder called backup.sh.tmp in your current directory. Inside that, you will find the following folders:
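For the example backup created earlier, those would presumably be the two labeled folders:

```shell
backup.sh.tmp/
    nginx/
    ssh/
```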
As you can see, backup.sh is not difficult to use. It is very lightweight, and it can be helpful for all those backup strategies where you just want to back up individual files and directories. If this brief introduction has piqued your curiosity and you would like to know more about this project, feel free to visit the repository.