I should note that our home network is a mixed network: the data is saved on Linux, but it is also used by several Windows computers. The Windows computers themselves are not currently backed up, but the network drives they use are. Backups of our computers at home have always been a bit haphazard. I have tried to do periodic backups, but it has always been such a bother. It's probably more important than we like to admit, and since I host the network drives on my desktop, it is doubly important. It's not especially complicated:
- Run the backup scripts in cron. That is easy. For my home network, monthly should be quite adequate.
- Copy the backups to external media.
Remembering to copy the backups to
external media has always been the issue.
At first, the media was CD-ROM, then DVD-ROM. When memory sticks started to climb in capacity and drop in price, I switched to them. Now, with terabyte hard disks relatively cheap, I use those. At best, I was probably doing quarterly backups. I have also found that USB hard disks are not always the most reliable media; I have had several of them go bad.
One thing I have always done is make
sure I have a good backup before doing any system upgrades, such as
upgrading one of my Linux Distros.
With the Raspberry Pi being cheap, low-power and readily available, I am starting to use it as a backup server. This way, backups happen whether I remember them or not.
I have set up four separate scripts in
Cron:
- Dump the MySQL databases.
- Run dumps of the home directories.
- Move the dumps to my home account.
- Use ssh to transfer the home account dumps to the Raspberry Pi.
The MySQL script is just something I
found on the internet:
#!/bin/bash
# sonia 16-nov-05
# backup each mysql db into a different file, rather than one big file
# as with --all-databases - will make restores easier
set -x
echo '/**************************************************/'
echo '/**Job Backs up the MySQL Databases to flat       **/'
echo '/** Files                                         **/'
echo '/**************************************************/'
date
USER="root"
PASSWORD="MyPass"
OUTPUTDIR="/home/myacct/Documents/MysqlBackups"
MYSQLDUMP="/usr/bin/mysqldump"
MYSQL="/usr/bin/mysql"

# clean up any old backups - save space
rm -f "$OUTPUTDIR"/*bak > /dev/null 2>&1

# get a list of databases
databases=`$MYSQL --user=$USER --password=$PASSWORD \
 -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`

# dump each database in turn
for db in $databases; do
    echo $db
    $MYSQLDUMP --force --opt --user=$USER --password=$PASSWORD \
    --databases $db > "$OUTPUTDIR/$db.bak"
done
date
I added the two date commands just so I could see how long it runs. I also added rm commands to this script to delete my old backup files. (I have removed those lines from this example for security reasons.) I scheduled it to run at 00:10 on Sundays, and it runs in less than a minute. There isn't a lot of data out there, but I do rely on MySQL a lot, and I would hate to lose that data.
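Because each database lands in its own .bak file, restoring one is just a matter of feeding the dump back into mysql. Here is a minimal sketch, assuming a hypothetical database named mydb and the OUTPUTDIR used above; since the dump was made with --databases, the file already contains its own CREATE DATABASE and USE statements:

# Restore a single database from its flat-file dump
mysql --user=root --password='MyPass' < /home/myacct/Documents/MysqlBackups/mydb.bak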
At 00:15 on Sunday Morning, the backups
run and the script looks like this:
#!/bin/bash
set -x
date
echo '/**************************************************/'
echo '/**Job Starts Full Backup of home directories     **/'
echo '/**************************************************/'
date=$(date +'%Y-%m-%d %H:%M:%S')
read Y M D h m s <<< ${date//[-: ]/ }
if [ $D -lt 8 ] ;then
    echo "Run Full Backups"
    # commented-out leftovers:
    # who -u | grep terry | awk '{printf("kill -9 %d", $2)}' | sh
    # ping 1.1.1.1 -n 1 -w 5000 > NUL
    at -f /home/myacct/Documents/bash/User1Bkp now
    at -f /home/myacct/Documents/bash/User2Bkp now
    at -f /home/myacct/Documents/bash/User3Bkp now
    at -f /home/myacct/Documents/bash/User4Bkp now
    at -f /home/myacct/Documents/bash/User5Bkp now
fi
date
The if statement checks the day of the month and, if it is less than 8, starts the backups. There is no need to run full backups every week on my home network; the data just doesn't change that fast. Each user's backup is a separate script started via the at command, so they run concurrently rather than in sequence (running atq afterwards confirms the five jobs were queued). Each user backup script is similar to the following:
#!/bin/bash
echo '********************************'
echo '**   Backup of User1 Acct    **'
echo '********************************'
date
tar -czvf /home/user1bkp /home/user1
date
It is just a basic tar script to back up that specific user's home account, which is also their network drive via Samba. I have added date commands at the beginning and end of the script so I can tell how long they run; if I started the transfer too soon, the backups would be useless. The output from the scripts is going to be huge, and all we really need to look at are the begin and end run times. Since I am running this from root's cron, the output is sent to root's local Linux/Unix mail account and can be read using the mail program. I have found, though, that reading it with mutt is even better, since pressing the '|' key pipes the mail message to a shell command, and then all that has to be typed in is:
grep 'Sun '
and then something like the following
should be listed:
Sun Jan 6 00:15:00 CST 2013
Sun Jan 6 01:58:53 CST 2013
So, in this case, the job ran for about 1.75 hours. I checked the run times on all of the backups and adjusted the start times of the move and copy scripts accordingly. Currently, all of my backups complete before 2:30 am.
The third script runs at 3:10 am and takes only a few seconds:
#!/bin/bash
echo '********************************'
echo '** Copies backup files to    **'
echo '** my home account           **'
echo '** and makes me the owner    **'
echo '********************************'
date
mv /home/*bkp /home/user1
chown user1 /home/user1/*bkp
date
Finally, the last script runs at 3:20. For this to work, I first had to install the “sshpass” program, which is not included with the ssh utilities. It's probably not the most secure approach, but since this is running on a home network with only family members having access, I am not too concerned. If this were for a business, I would look for something more secure. I considered installing an FTP server on the Raspberry Pi, but then I remembered that ssh's secure copy (scp) is already built in. The -c arcfour option selects about the weakest cipher ssh allows; again, I don't feel security is a big issue on a home network, and if I could have turned encryption off, I would have.
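For what it's worth, key-based authentication would avoid keeping the password in the script at all. A rough sketch of the one-time setup, assuming the same pi account on 192.168.0.50 and that it is done as the user the cron job runs under:

# Generate a key pair; leave the passphrase empty so cron can use it unattended
ssh-keygen -t rsa
# Install the public key on the Pi (asks for the password one last time)
ssh-copy-id pi@192.168.0.50
# After that, scp works without sshpass:
scp /home/user1/user1bkp pi@192.168.0.50:/media/usbhdd/Month1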
The let "Remainder=$M%3" line does a modulus divide, setting the $Remainder variable to the remainder of the current month's number divided by 3 (January gives 1, February 2, March 0, and so on). This way, the backups rotate through three monthly directories, so if I need to go back to a previous month's backup, I can. This job will run for hours, but speed shouldn't be an issue.
#!/bin/bash
set -x
echo '/**********************************************************/'
echo '/**Job Copies Backup Files to my Raspberry Pi at:        **/'
echo '/** 192.168.0.50                                         **/'
echo '/**********************************************************/'
date
date=$(date +'%Y-%m-%d %H:%M:%S')
read Y M D h m s <<< ${date//[-: ]/ }
if [ $D -lt 8 ] ;then
    let "Remainder=$M%3"
    if [ $Remainder = 0 ] ;then
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user1bkp pi@192.168.0.50:/media/usbhdd/Month1
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user2bkp pi@192.168.0.50:/media/usbhdd/Month1
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user3bkp pi@192.168.0.50:/media/usbhdd/Month1
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user4bkp pi@192.168.0.50:/media/usbhdd/Month1
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user5bkp pi@192.168.0.50:/media/usbhdd/Month1
    fi
    if [ $Remainder = 1 ] ;then
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user1bkp pi@192.168.0.50:/media/usbhdd/Month2
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user2bkp pi@192.168.0.50:/media/usbhdd/Month2
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user3bkp pi@192.168.0.50:/media/usbhdd/Month2
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user4bkp pi@192.168.0.50:/media/usbhdd/Month2
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user5bkp pi@192.168.0.50:/media/usbhdd/Month2
    fi
    if [ $Remainder = 2 ] ;then
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user1bkp pi@192.168.0.50:/media/usbhdd/Month3
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user2bkp pi@192.168.0.50:/media/usbhdd/Month3
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user3bkp pi@192.168.0.50:/media/usbhdd/Month3
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user4bkp pi@192.168.0.50:/media/usbhdd/Month3
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour /home/user1/user5bkp pi@192.168.0.50:/media/usbhdd/Month3
    fi
fi
date
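The fifteen sshpass/scp lines could also be collapsed into a single loop. The following is only a sketch, assuming the same file names, the same Month1/Month2/Month3 layout and the pi account above; the 10#$M form forces base-10 arithmetic so that the zero-padded months 08 and 09 are not read as octal by the shell:

#!/bin/bash
# Sketch of a loop-based copy job (same assumptions as the script above)
D=$(date +'%d')
M=$(date +'%m')
if [ "$D" -lt 8 ]; then
    # remainder 0 -> Month1, 1 -> Month2, 2 -> Month3
    DIR="Month$(( (10#$M % 3) + 1 ))"
    for u in user1 user2 user3 user4 user5; do
        /usr/bin/sshpass -p 'MyPass' /usr/bin/scp -c arcfour \
            "/home/user1/${u}bkp" "pi@192.168.0.50:/media/usbhdd/$DIR"
    done
fi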
Lastly, the crontab format is kind of goofy; I have to look it up every time I use it (the cron documentation covers it in detail). The easiest thing to do is to put the run times for the root account and the MyUser account in separate files. The MyUser cron file is set up as follows:
10 00 * * 0 /home/MyUser/Documents/bash/MysqlBackup
20 03 * * 0 /home/MyUser/Documents/bash/CopyBkp
And then the Root Cron File is set up
as:
15 00 * * 0 /home/MyUser/Documents/bash/FullBackUp
10 03 * * 0 /home/MyUser/Documents/bash/RtCmd
In these, the first column is the minute to start, the second column is the hour to start, and the fifth column is the day of the week (0 meaning Sunday).
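Lined up against the field names, the first MyUser entry reads like this:

# min  hour  day-of-month  month  day-of-week  command
  10    00        *          *         0       /home/MyUser/Documents/bash/MysqlBackup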
Then, to replace the current crontab with one of these, just sign in on that user's command line and type:
crontab FileName
where FileName is the name of
that user's crontab file.
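To double-check what is now scheduled, crontab -l prints the active table for whichever user runs it:

# List the crontab that is currently installed for this user
crontab -l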