Automate backups from Linux servers to Amazon cloud

The aim of this article is to show how I automated backing up a web hosting server, host by host, including MySQL data. Each backup is compressed and protected with strong encryption before being transferred to Amazon S3. To follow the script you will need basic knowledge of Linux commands, plus the s3cmd tool and 7zip, whose installation is covered below. My setup ran on CentOS 6.5 with an existing Amazon S3 account, so make sure you get one.
 
A good question was raised to me on a Google+ discussion board: is it safe to upload your backups to the cloud? My answer is yes, as long as you know what you are doing. I took the risk after confirming that Amazon provides server-side encryption. Furthermore, to enforce the TNO (Trust No One) concept, if you check my script you will see that two different encryption layers are applied before the upload phase: the first uses AES-256 via 7zip, the second GnuPG with the CAST5 algorithm. With this in place I believe nobody can read the backup contents unless they can guess my gibberish 128-character random passwords across the three different methods of encryption.
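To make the second layer concrete: the GnuPG/CAST5 step is what `s3cmd -e` applies just before upload (older GnuPG versions default to CAST5 for symmetric encryption). Here is a standalone sketch of that layer on its own; the file name and passphrase are throwaway examples, not the ones used in the real script, and the step is skipped silently if gpg is not installed:

```shell
# Sketch of the GnuPG/CAST5 layer on its own; file name and
# passphrase are throwaway examples.
f=/tmp/layer-demo.txt
echo "backup payload" > "$f"
if command -v gpg >/dev/null 2>&1; then
    # symmetric (passphrase-based) encryption with the CAST5 cipher
    gpg --batch --yes --pinentry-mode loopback --symmetric \
        --cipher-algo CAST5 --passphrase 'example-pass' -o "$f.gpg" "$f"
fi
```

The resulting `.gpg` file is what actually travels to the bucket when `-e` is used; the plaintext never leaves the server.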

 

Installation:

 

First you will need to install the s3cmd tool:
# You will be asked to accept a new GPG key – answer yes (perhaps twice).

cd  /etc/yum.repos.d
wget http://s3tools.org/repo/RHEL_6/s3tools.repo
yum install s3cmd 

To configure s3cmd:
# Make sure you set https on with gpg enabled

s3cmd --configure
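The answers you give end up in `~/.s3cfg`. The snippet below is illustrative only (the keys are placeholders, and your file will contain many more settings); `use_https` and the gpg-related entries are the ones worth double-checking after configuration:

```
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
use_https = True
gpg_command = /usr/bin/gpg
gpg_passphrase = your-gpg-passphrase
```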

Run s3cmd ls to list all your buckets:

s3cmd ls

Make a bucket with s3cmd mb s3://my-new-bucket-name:

s3cmd mb s3://digi77backups

List the contents of the bucket:

s3cmd ls s3://digi77backups

Upload a file into the bucket (plain):

s3cmd put chatcleaner s3://digi77backups/chatcleaner

Upload a file into the bucket (PGP encryption):

s3cmd -e put chatcleaner s3://digi77backups/chatcleaner

Retrieve the file back and verify that it hasn’t been corrupted:

s3cmd get s3://digi77backups/chatcleaner chatcleaner-2 
md5sum chatcleaner chatcleaner-2
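The comparison above is done by eye; a small check like the following fails loudly when the hashes differ. The file names here are throwaway examples standing in for the original and the downloaded copy:

```shell
# Compare the hash of a local file against its downloaded copy.
# File names are throwaway examples.
echo "payload" > /tmp/md5-demo-a
cp /tmp/md5-demo-a /tmp/md5-demo-b
h1=$(md5sum /tmp/md5-demo-a | awk '{print $1}')
h2=$(md5sum /tmp/md5-demo-b | awk '{print $1}')
if [ "$h1" = "$h2" ]; then
    echo "OK: hashes match"
else
    echo "ERROR: download corrupted" >&2
fi
```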

Clean up: delete the object and remove the bucket:

s3cmd del s3://digi77backups/chatcleaner
s3cmd rb s3://digi77backups

For more commands, check the s3cmd documentation.
Now that you can control your bucket from the server, let us move to the next step.

Install 7zip tool:

rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt
# Check your machine architecture first with: uname -i
wget http://pkgs.repoforge.org/rpmforge-release/rpmforge-release-0.5.3-1.el6.rf.x86_64.rpm
rpm -K rpmforge-release-0.5.3-1.el6.rf.x86_64.rpm
rpm -i rpmforge-release-0.5.3-1.el6.rf.x86_64.rpm
yum -y install p7zip

To create an encrypted, password-protected zip archive with 7za:

7za a -tzip -pMY_SECRET -mem=AES256 secure.zip doc.pdf doc2.pdf doc3.pdf 

To extract a password protected file:

7za x thefile.zip -pThePassword
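Round-tripping a small archive is a quick sanity check that the password works before you trust it with real backups. A sketch, assuming `7za` is on the PATH (it is skipped silently otherwise); the names and password are examples:

```shell
# Round-trip test: create, encrypt, and re-extract a throwaway file.
# Skipped silently if 7za is not installed; names/password are examples.
if command -v 7za >/dev/null 2>&1; then
    mkdir -p /tmp/7z-demo
    cd /tmp/7z-demo
    echo "hello" > doc.txt
    7za a -tzip -pExamplePass -mem=AES256 secure.zip doc.txt
    rm -f doc.txt
    7za x -y secure.zip -pExamplePass
fi
```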

 

The script:

 

I will explain the script through inline comments marked with the # symbol.

#!/bin/sh
# Shell script written by W. Al Maawali    
# (c) 2014 Founder of Eagle Eye Digital Solutions
# http://www.digi77.com
# http://www.om77.net
# script starts here:

# Script timer variable
start=$(date +%s.%N)

# get the current time and date
fulldate="`date`"

# Set the backup date in Year - Month - Day format
backupdate="`date +%y%m%d`"

echo "Go to /home/backup and remove leftover archives " $fulldate
cd /home/backup

# Remove any leftovers from previous backups
rm -f *.tar
rm -f *.zip

# Back up all 3 hosts (carssite, busisite, personalsite); the names are just examples
cd /home/backup

fulldate="`date`"
domanvalue="carssite"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
# Compress from tar to zip with AES256 protection password used is : tER@klo982S@eu
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to the Amazon s3 
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

# Next site
fulldate="`date`"
domanvalue="busisite"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to the Amazon s3 
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

# Final site
fulldate="`date`"
domanvalue="personalsite"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to the Amazon s3 
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

# Back up MySQL
fulldate="`date`"

# Avoid Cpanel restarting the services
echo "Cpanel TailWatch switching off" $fulldate
/usr/local/cpanel/bin/tailwatchd --disable=Cpanel::TailWatch::ChkServd

fulldate="`date`"
echo "Mysql switching off" $fulldate
# Stop Mysql service
/etc/init.d/mysql stop


# Move to my backupfolder
fulldate="`date`"
cd /home/backup

# Compress the MySQL data directory into a tar file
echo "tar mysql folder at:  /var/lib/mysql " $fulldate
tar -zcf mysql-$backupdate.tar /var/lib/mysql
fulldate="`date`"
echo "start mysql" $fulldate
# Start MySQL service
/etc/init.d/mysql start

fulldate="`date`"
echo "start cpanel TailWatch again" $fulldate

# Start cPanel TailWatch again
/usr/local/cpanel/bin/tailwatchd --enable=Cpanel::TailWatch::ChkServd

# Preparing to Copy mysql to cloud
cd /home/backup

fulldate="`date`"
domanvalue="mysql"
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to the Amazon s3 
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

echo "Backup completed successfully" $fulldate
# Calculate how long the script took to run
end=$(date +%s.%N)
runtime=$(python -c "print ${end} - ${start}")
echo "Runtime was $runtime"
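The three per-site blocks above are identical except for the site name. A hedged sketch of how they could be collapsed into one function; `BACKUP_DIR`, the helper name, and the demo source tree are illustrative assumptions, and the 7za/s3cmd/md5sum steps from the original would slot in where the comment indicates:

```shell
#!/bin/sh
# Sketch: one function instead of three copy-pasted per-site blocks.
# BACKUP_DIR and the demo source tree are illustrative assumptions.
BACKUP_DIR=/tmp/backup-demo
backupdate=$(date +%y%m%d)
mkdir -p "$BACKUP_DIR"

backup_site() {
    site="$1"
    src="$2"
    # archive the site's directory, as the real script does per site
    tar -zcf "$BACKUP_DIR/$site-$backupdate.tar" \
        -C "$(dirname "$src")" "$(basename "$src")"
    echo "archived $site"
    # the 7za, s3cmd put, and md5sum steps from the original would follow here
}

# demo run against a throwaway tree so the sketch works anywhere
mkdir -p /tmp/demo-src/carssite
echo "index" > /tmp/demo-src/carssite/index.html
backup_site carssite /tmp/demo-src/carssite
```

In the real script the loop would simply be `for site in carssite busisite personalsite; do backup_site "$site" "/home/$site"; done`, which keeps the logic in one place when a step changes.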

 

Download script


Here is another version of my backup script; no Amazon cloud is involved, but I have improved the MySQL backup and the MD5 calculation:

#!/bin/sh
# Shell script written by W. Al Maawali    
# (c) 2016 Founder of Eagle Eye Digital Solutions
# http://www.digi77.com
# http://www.om77.net
# script starts here:
start=$(date +%s.%N)
fulldate="`date`"
bakdate="`date +%y%m%d`"
echo "Go to /home/backup and remove leftover archives " $fulldate
# Remove any leftovers from previous backups
OUTPUT="/home/backup"
cd $OUTPUT
rm -f $OUTPUT/*.tar
rm -f $OUTPUT/*.zip


# Backup all hosts

# Digi77 backup
fulldate="`date`"
domanvalue="digi77"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$bakdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$bakdate.tar with 7zip" $fulldate
7za a -tzip -pYourzippasswordgoeshere -mem=AES256 $domanvalue-$bakdate.zip $domanvalue-$bakdate.tar
rm -f $domanvalue-$bakdate.tar
echo "";



# Prepare the MySQL backup
fulldate="`date`"
domanvalue="mysql"
echo "Backing up MySQL databases to: $OUTPUT " $fulldate

# MySQL User
USER='root'
# MySQL Password
PASSWORD='YourDBPasswordGoesHere'
# Backup directory ($OUTPUT, set above) - NO TRAILING SLASH!

cd $OUTPUT

echo "Starting MySQL Backup";
echo `date`;
databases=`mysql --user=$USER --password=$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`
for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != _* ]] ; then
        echo "Dumping database: $db"
        mysqldump --force --opt --user=$USER --password=$PASSWORD --databases $db > $OUTPUT/$domanvalue-$bakdate-$db.sql
        7za a -tzip -pYourzippasswordgoeshere -mem=AES256 $domanvalue-$bakdate-$db.zip $domanvalue-$bakdate-$db.sql
        rm -f $OUTPUT/$domanvalue-$bakdate-$db.sql
    fi
done
echo "Finished MySQL Backup";

 
echo "Getting MD5 Hash from the zip file list:" $fulldate
md5sum /home/backup/*.zip

fulldate="`date`"
echo "Backup completed successfully" $fulldate
end=$(date +%s.%N)
runtime=$(python -c "print ${end} - ${start}")
echo "Runtime was $runtime"
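The script above only prints the hashes; writing them to a manifest lets `md5sum -c` verify the archives later without eyeballing anything. A small sketch, with a throwaway demo directory and file name:

```shell
# Write an md5 manifest for the finished archives and verify it.
# The demo directory and file name are illustrative.
mkdir -p /tmp/manifest-demo
cd /tmp/manifest-demo
echo "data" > site-160101.zip
md5sum *.zip > manifest.md5
md5sum -c manifest.md5
```

Keeping `manifest.md5` alongside the uploaded archives means a restore can be verified with a single `md5sum -c` run.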


 

Download script


 

Notes:

 

To test the script:

. backup.sh

I myself have set a cron job to run it every Sunday, as follows:

15 5 * * 0 bash /home/xxx/week
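If you also want a record of each run, the same crontab entry can redirect its output to a log file; the log path here is just an example:

```
15 5 * * 0 bash /home/xxx/week >> /var/log/weekly-backup.log 2>&1
```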

If the archive is too large, it might not fit into your system's /tmp directory, which gpg uses to store temporary files. To avoid errors in that case, upload without the -e flag:

s3cmd put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip

Instead of:

s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip

 
For cloud backup I personally use Amazon S3 with RRS (Reduced Redundancy Storage) and then move the data to Amazon S3 Glacier, together with CloudBerry using client-side encryption, because I want more control over a reliable service. To learn more about cloud storage please visit this post.

Latest update: I have moved from Amazon services to Google Cloud and I am very happy with it. You can manage it with the CloudBerry Backup tool.


W. AL Maawali is the Founder and Chief Editor of Eagle Eye Digital Solutions from the Sultanate of Oman with over 20 years experience in Security and Digital Forensics. He is also the Founder of om77.net.