Backups by crossover between network centres - setting up automatic scp transfers
Our web servers don't need transaction-logging backup systems - email and secure pages are looked after (and spam trapped) by a separate machine from our two dedicated web servers, which handle a great many requests for data reads but comparatively few data writes. Of course, those data writes / changes / blog articles / forum contributions / calendar changes are happening all the time, and there's a need to be able to restore the machines if something goes dramatically wrong. The question that needs to be asked in setting up the backup strategy is "how do we get this back / how much do we lose if the nastiest thing goes wrong at the worst moment?"
Firstly, both machines run regular and quite frequent backups using crontab
jobs - with the backups being stored on the same machine, so that we can pull back any data that we need to. Most of these are stamped with the time in the week at which they're taken, so we can step back up to 7 days. These backups have proven very useful when members of forums have done something very silly (like deciding to leave in a huff and deleting all their posts) ... and we've restored the posts - and thus the integrity of the threads - very quickly indeed. Since the backups are at an SQL and file level, such a restoration doesn't even mean any loss of data added since the last backup was taken.
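As a sketch of how such a stamped backup job might look - the paths and names here are illustrative rather than our real setup - a shell function using the day of the week as the stamp gives a self-pruning seven day cycle:

```shell
#!/bin/sh
# A day-stamped backup step - a sketch, not our exact script; the paths
# and names below are illustrative.  The %a stamp cycles Mon..Sun, so
# each run overwrites the copy made a week earlier, keeping a rolling
# seven days with no housekeeping needed.

backup_tree() {
    src=$1    # directory tree to back up
    dest=$2   # where the stamped copies accumulate
    stamp=$(date +%a)
    mkdir -p "$dest"
    tar -czf "$dest/$(basename "$src")_$stamp.tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
    echo "$(basename "$src")_$stamp.tar.gz"
}

# A crontab entry to take such a copy daily at 02:15 might then read:
# 15 2 * * * /home/backup/bin/stamped_backup /var/www/htdocs /home/backup/local
```

An SQL-level dump (mysqldump to a similarly stamped file) would sit alongside the file-level copy, so individual rows as well as whole trees can be pulled back.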
But - and this happened a couple of weeks ago - what happens if we have a hard disc failure? "Sorry - there's no point in trying to mount the old disc - there's nothing left on it that can be read...". That's where offsite backups come in and, until now, we've taken such network copies about once a week, when we remembered. OK - that's not been very clever, and it takes time - so I've now put into place a data swap scheme between our two machines - one in Germany and the other in England. The question arose of how to transfer the backup files cleanly, automatically and securely, and I've set up crontab
jobs, twice a week each way, using scp
in batch mode. And in order to do that, I had to set up public and private keys between the backup
accounts on the two machines. The setup only needs to be done in one direction, as the authorised client scp
can both push and pull files. Here are the details ... with keys intentionally changed so that no-one reading this blog can get in!
On the WELL HOUSE MANOR machine - which will be the client that runs the scp command - in Germany
-bash-3.14b$ ssh-keygen -t dsa
Generating public/private dsa key pair.
Enter file in which to save the key (/home/backup/.ssh/id_dsa): my_client_key
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in my_client_key.
Your public key has been saved in my_client_key.pub.
The key fingerprint is:
-bash-3.14b$ scp my_client_key.pub firstname.lastname@example.org:
my_client_key.pub 100% |**************************************| 623 00:00
On the WELL HOUSE CONSULTANTS machine - which will be the server - in England
-bash-4.1$ cat my_client_key.pub >> ~/.ssh/authorized_keys
-bash-4.1$ chmod 600 !$
chmod 600 ~/.ssh/authorized_keys
-bash-4.1$ rm my_client_key.pub
Back in Germany, the copy commands in the crontab file for user "backup" read:
45 7 * * 3 scp -B -r -i /home/backup/my_client_key email@example.com:Wed /home/backup/remote
45 8 * * 3 scp -B -r -i /home/backup/my_client_key /home/backup/local firstname.lastname@example.org:remote
These transfers are repeated TWICE a week, with different sources and targets each time, in case a system fails during a transfer.
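A job like this is only as good as its last successful run, and a quietly failing scp would leave stale copies without anyone noticing. A minimal freshness check - a sketch, with the directory name and the four-day threshold as illustrative choices, not our exact setup - could run from cron shortly after each transfer slot:

```shell
#!/bin/sh
# Warn if nothing recent has arrived in the directory the scp job pulls
# into.  Directory and threshold are illustrative; the transfers run
# twice a week, so a healthy setup always has a file under 4 days old.

check_fresh() {
    # $1: directory the remote copies land in
    if find "$1" -type f -mtime -4 2>/dev/null | grep -q .; then
        echo "OK: recent backup present in $1"
    else
        echo "WARNING: no backup newer than 4 days in $1"
    fi
}

# A crontab line to run the check after the Wednesday transfers might be:
# 0 10 * * 3 /home/backup/bin/check_fresh /home/backup/remote
```

Piping the WARNING line into mail turns it into an alert rather than a log entry nobody reads.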
I'm still going to be downloading the occasional full backup to our own HQ and hotel, and also to that mail and secure server I mentioned (which is in California). However, if the world's hit by a disaster big enough to wipe out data centres in London, Koblenz and Fremont, I expect that Lisa and I wouldn't be doing very much IT training thereafter! (written 2013-04-13, updated 2013-04-20)
Associated topics are indexed as below, or enter http://melksh.am/nnnn for individual articles
A162 - Web Application Deployment - Backups and File System Management
Extracting data from backups to restore selected rows from MySQL tables - (2015-05-01) 
Backup procedures - via backup server - (2015-01-24) 
Commenting out an echo killed my bash backup script - (2015-01-19) 
Checking MySQL database backups have worked (not failed) - (2015-01-10) 
More or less back - what happened to our server the other day - (2013-06-14) 
An overpractical test of our backup strategy! - (2013-03-30) 
How much space does my directory take - Linux - (2009-07-20) 
Some Linux and Unix tips - (2008-11-18) 
Will your backups work if you have to restore them? - (2008-09-18) 
Dialects of English and Unix - (2008-08-21) 
The tourists guide to Linux - (2008-05-20) 
Linux / Unix - layout of operating system files - (2007-11-20) 
Linux run states, shell special commands, and directory structures - (2007-08-03) 
Finding public writeable things on your linux file system - (2007-01-06) 
Copy multiple files - confusing error message from cp - (2006-12-30) 
tar, jar, war, ear, sar files - (2006-06-10) 
Boys will be boys, saved by Ubuntu - (2006-05-27) 
Copying files and preserving ownership - (2006-04-28) 
Finding where the disc space has gone - (2006-02-06) 
What backup is adequate? - (2006-01-04) 
Symbolic links and hard links - (2005-06-02) 
Linux - where to put swap space - (2004-12-16)