
Automate backups from Linux servers to Amazon cloud

02/11/2014

Automating Web Hosting Server Backups with High-Level Encryption

The aim of this article is to show how I automated the process of backing up a web hosting Linux server, host by host, including MySQL data. Each archive is compressed and protected with strong encryption before being transferred to the Amazon cloud. To follow the script you will need basic knowledge of Linux commands, plus the s3cmd tool and 7zip installed. My experiment was conducted on CentOS 6.5, and I had already obtained an Amazon S3 account, so make sure you get one as well.

A good question was raised on the Google+ discussion board: is it safe to upload your backups to the cloud? My answer is yes, as long as you know what you are doing. I took the risk after confirming that Amazon provides server-side encryption. Furthermore, to enforce the TNO (Trust No One) concept, my script applies two levels of encryption before the upload phase: the first uses AES-256 encryption via 7zip, and the second uses GnuPG with the CAST5 algorithm. With these measures, I believe there is no way anyone can access the backup content unless they can guess the 128-character gibberish random passwords used in each encryption layer.
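To make the layering concrete, here is a minimal sketch of the two client-side layers applied by hand; the file names and passwords are placeholders, and in the actual script below the 7zip layer is explicit while the GnuPG layer is delegated to s3cmd's -e switch:

# Layer 1: AES-256 encrypted zip via 7zip
7za a -tzip -pFIRST_PASSWORD -mem=AES256 backup.zip backup.tar

# Layer 2: symmetric GnuPG encryption with the CAST5 cipher (prompts for a passphrase)
gpg --symmetric --cipher-algo CAST5 --output backup.zip.gpg backup.zip

# backup.zip.gpg is the doubly-encrypted file that would be uploaded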

Installation: 

First you will need to install the s3cmd tool:
# You will be asked to accept a new GPG key – answer yes (perhaps twice).

cd /etc/yum.repos.d
wget http://s3tools.org/repo/RHEL_6/s3tools.repo
yum install s3cmd 
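
To confirm that the installation succeeded, check the reported version:

s3cmd --version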

To configure s3cmd:
# Make sure you enable HTTPS and GPG encryption when prompted

s3cmd --configure

Run s3cmd ls to list all your buckets:

s3cmd ls

Make a bucket with s3cmd mb s3://my-new-bucket-name:

s3cmd mb s3://digi77backups

List the contents of the bucket:

s3cmd ls s3://digi77backups

Upload a file into the bucket (plain):

s3cmd put chatcleaner s3://digi77backups/chatcleaner

Upload a file into the bucket (PGP encryption):

s3cmd -e put chatcleaner s3://digi77backups/chatcleaner

Retrieve the file back and verify that it hasn’t been corrupted:

s3cmd get s3://digi77backups/chatcleaner chatcleaner-2 
md5sum chatcleaner chatcleaner-2

Clean up: delete the object and remove the bucket:

s3cmd del s3://digi77backups/chatcleaner
s3cmd rb s3://digi77backups

For more commands, see the s3cmd usage documentation on the s3tools website.
Now that you have learned how to control your bucket from your server, let us move on to the next step.

Install the 7zip tool:

rpm --import http://apt.sw.be/RPM-GPG-KEY.dag.txt
# Check your machine architecture first: uname -i
wget http://pkgs.repoforge.org/rpmforge-release/rpmforge-release-0.5.3-1.el6.rf.x86_64.rpm
rpm -K rpmforge-release-0.5.3-1.el6.rf.x86_64.rpm
rpm -i rpmforge-release-0.5.3-1.el6.rf.x86_64.rpm
yum -y install p7zip

To create an AES-256 encrypted zip archive with 7za:

7za a -tzip -pMY_SECRET -mem=AES256 secure.zip doc.pdf doc2.pdf doc3.pdf 

To extract a password-protected file:

7za x thefile.zip -pThePassword

The Script: 

I will explain the script inline through comments marked with the # symbol.

#!/bin/sh
# Shell script written by W. Al Maawali    
# (c) 2014 Founder of Eagle Eye Digital Solutions
# https://www.digi77.com
# http://www.om77.net
# script starts here:

# Script timer variable
start=$(date +%s.%N)

# get the current time and date
fulldate="`date`"

# Set the backup date in Year - Month - Day format
backupdate="`date +%y%m%d`"

echo "Go to /home/backup and run rm -f * " $fulldate
cd /home/backup

# Remove any leftovers from previous backups
rm -f *.tar
rm -f *.zip

# Back up all 3 hosts (carssite, busisite, personalsite); these names are just examples
cd /home/backup

fulldate="`date`"
domanvalue="carssite"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
# Compress the tar into a zip with AES-256 protection; the password used is tER@klo982S@eu
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to Amazon S3
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

# Next site
fulldate="`date`"
domanvalue="busisite"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to Amazon S3
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

# Final site
fulldate="`date`"
domanvalue="personalsite"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to Amazon S3
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

# Backup MySQL
fulldate="`date`"

# Stop cPanel from restarting the services while they are down
echo "Cpanel TailWatch switching off" $fulldate
/usr/local/cpanel/bin/tailwatchd --disable=Cpanel::TailWatch::ChkServd

fulldate="`date`"
echo "Mysql switching off" $fulldate
# Stop the MySQL service
/etc/init.d/mysql stop


# Move to my backup folder
fulldate="`date`"
cd /home/backup

# Pack the MySQL data folder into a tar file
echo "tar mysql folder at:  /var/lib/mysql " $fulldate
tar -zcf mysql-$backupdate.tar /var/lib/mysql
fulldate="`date`"
echo "start mysql" $fulldate
# Start the MySQL service
/etc/init.d/mysql start

fulldate="`date`"
echo "start cpanel TailWatch again" $fulldate

# Start cPanel TailWatch again
/usr/local/cpanel/bin/tailwatchd --enable=Cpanel::TailWatch::ChkServd

# Prepare to copy the MySQL backup to the cloud
cd /home/backup

fulldate="`date`"
domanvalue="mysql"
echo "Protecting $domanvalue-$backupdate.tar with 7zip" $fulldate
7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
rm -f $domanvalue-$backupdate.tar
echo "Sending $domanvalue-$backupdate.zip to the cloud" $fulldate
# Copy the file to Amazon S3
s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
echo "Calculating md5sum hash" $fulldate
md5sum $domanvalue-$backupdate.zip
rm -f $domanvalue-$backupdate.zip

echo "Backup completed successfully" $fulldate
# Calculate how long the script took to run
end=$(date +%s.%N)
runtime=$(python -c "print ${end} - ${start}")
echo "Runtime was $runtime"

 

Download script
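
Since the three per-site blocks in the script above are identical apart from the host name, the same work can also be written as a loop. This is just a minimal sketch under the same assumptions as the script (same paths, same 7zip password, same bucket); adjust the host list to your own accounts:

#!/bin/sh
backupdate="`date +%y%m%d`"
cd /home/backup

for domanvalue in carssite busisite personalsite; do
    fulldate="`date`"
    echo "tar $domanvalue at: /home/$domanvalue " $fulldate
    tar -zcf $domanvalue-$backupdate.tar /home/$domanvalue
    # Zip with AES-256 protection, then drop the intermediate tar
    7za a -tzip -ptER@klo982S@eu -mem=AES256 $domanvalue-$backupdate.zip $domanvalue-$backupdate.tar
    rm -f $domanvalue-$backupdate.tar
    # Upload with the extra GPG layer, record the hash, then clean up locally
    s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
    md5sum $domanvalue-$backupdate.zip
    rm -f $domanvalue-$backupdate.zip
done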

Here is another version of my backup script. No Amazon cloud is involved, but I have improved the MySQL backup and the MD5 calculation:

#!/bin/sh
# Shell script written by W. Al Maawali    
# (c) 2016 Founder of Eagle Eye Digital Solutions
# https://www.digi77.com
# http://www.om77.net
# script starts here:
start=$(date +%s.%N)
fulldate="`date`"
bakdate="`date +%y%m%d`"
echo "Go to /home/backup and run rm -f * " $fulldate
# Remove any leftovers from previous backups
OUTPUT="/home/backup"
cd $OUTPUT
rm -f $OUTPUT/*.tar
rm -f $OUTPUT/*.zip


# Backup all hosts

# Digi77 backup
fulldate="`date`"
domanvalue="digi77"
echo "tar $domanvalue at: /home/$domanvalue " $fulldate
tar -zcf $domanvalue-$bakdate.tar /home/$domanvalue
echo "Protecting $domanvalue-$bakdate.tar with 7zip" $fulldate
7za a -tzip -pYourzippasswordgoeshere -mem=AES256 $domanvalue-$bakdate.zip $domanvalue-$bakdate.tar
rm -f $domanvalue-$bakdate.tar
echo "";



# Prepare mysql and rest
fulldate="`date`"
domanvalue="mysql"
echo "Backing mysql folder at: /home/mysql " $fulldate

# MySQL User
USER='root'
# MySQL Password
PASSWORD='YourDBPasswordGoesHere'
# Backup directory - NO TRAILING SLASH! (already set in $OUTPUT above)

fulldate="`date`"
bakdate="`date +%y%m%d`"
cd $OUTPUT

echo "Starting MySQL Backup";
echo `date`;
databases=`mysql --user=$USER --password=$PASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`
for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != _* ]] ; then
        echo "Dumping database: $db"
        mysqldump --force --opt --user=$USER --password=$PASSWORD --databases $db > $OUTPUT/$domanvalue-$bakdate-$db.sql
        7za a -tzip -pYourzippasswordgoeshere -mem=AES256 $domanvalue-$bakdate-$db.zip $domanvalue-$bakdate-$db.sql
        rm -f $OUTPUT/$domanvalue-$bakdate-$db.sql
    fi
done
echo "Finished MySQL Backup";

 
echo "Getting MD5 Hash from the zip file list:" $fulldate
md5sum /home/backup/*.zip

fulldate="`date`"
echo "Backup completed successfully" $fulldate
end=$(date +%s.%N)
runtime=$(python -c "print ${end} - ${start}")
echo "Runtime was $runtime"


 

Download script
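
To restore one of the dumps produced by this script, extract the archive and feed the .sql file back into MySQL. A minimal sketch, assuming a hypothetical database named mydb dumped on 01/01/2016 and the placeholder passwords from the script:

# Extract the password-protected archive
7za x mysql-160101-mydb.zip -pYourzippasswordgoeshere

# Replay the dump; it includes CREATE DATABASE statements because
# mysqldump was run with --databases
mysql --user=root --password=YourDBPasswordGoesHere < mysql-160101-mydb.sql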

Notes: 

To test the script:

. backup.sh

For myself, I have set a cron job to run it every Sunday at 05:15, as follows:

15 5 * * 0 bash /home/xxx/week
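
If you also want a record of each run, the same crontab entry can redirect the script's output to a log file (the log path here is just an example):

15 5 * * 0 bash /home/xxx/week >> /var/log/weekly-backup.log 2>&1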

If the archive is too large, it might not fit into your system's /tmp directory, which gpg uses to store temporary files. To avoid errors in that case, use the following:

s3cmd put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip

Instead of:

s3cmd -e put $domanvalue-$backupdate.zip s3://mybackups/$domanvalue-$backupdate.zip
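
Alternatively, you can apply the GnuPG layer yourself and write the encrypted file to a directory with enough free space, then upload it without the -e switch. A minimal sketch, assuming /home/backup has the room that /tmp lacks:

# Encrypt to /home/backup instead of letting s3cmd stage the file in /tmp
gpg --symmetric --cipher-algo CAST5 --output /home/backup/$domanvalue-$backupdate.zip.gpg $domanvalue-$backupdate.zip

# Upload the already-encrypted file plainly
s3cmd put /home/backup/$domanvalue-$backupdate.zip.gpg s3://mybackups/$domanvalue-$backupdate.zip.gpg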

 
For cloud backup, I personally use Amazon S3 RRS (Reduced Redundancy Storage) and then move the data to Amazon S3 Glacier, along with CloudBerry for client-side encryption, because I want more control and a reliable service. To learn more about cloud storage, please visit this post.

In the latest update, I have transitioned from Amazon services to Google Cloud and I am very satisfied with the change. You can manage Google Cloud backups with the CloudBerry Backup tool.

