- Public key (pubring.gpg) – Used to encrypt your data. It doesn’t matter who sees this.
- Private key (secring.gpg) – Used to decrypt your data. This file must be kept safe and only seen by you.
The two files it creates are essentially a pair. Files encrypted by a public key can only be decrypted by the corresponding private key. If you lose your private key, you will not get your files back, ever. So, let’s get to it!
- In your command line, type the following:
gpg --gen-key
You’ll be walked through a few options for your key, select the following:
- Key type – DSA and Elgamal (Default)
- Key size – 2048 bits (Again, the default)
- Expiration – Do not expire (Not necessary for what we’re doing as you won’t be sharing the public key with anyone).
- Name, Comment and Email – You can enter whatever you like here, but do take note of what you enter. For example:
Name: Branz Balugo
Comment: domain.com
Email: myemailaddress@gmail.com
- Password (gpg passphrase) – Make sure you remember whatever you type, there’s no way to get it back if you forget! The password (gpg passphrase) used here was: s3cr3t
- When it talks about “generating entropy” to make the key, it means the server needs to be in use so it can gather some random numbers. Just refresh a webpage on the server a few times, or run some commands in another terminal window (one example is shown just below).
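If you want a ready-made way to create that activity (this command is just a suggestion, not part of the original write-up), walking the filesystem in a second terminal generates plenty of disk I/O for the entropy pool:
sudo find / -type f > /dev/null 2>&1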
Once the key has been generated, you'll see a line in the output similar to:
pub 1024D/2A72DARB 2010-09-16
The 2A72DARB is the part you need. That’s your gpg key ID, and you’ll need it later!
If you do end up forgetting your gpg key ID though, it’s easy enough to get that back. Just type:
sudo gpg --list-keys
That’s our encryption set up and is now ready to use!
NOTE: The keys generated will be stored in your home directory, under the .gnupg sub-directory.
2.) Sign up for AWS S3 (which is already done).
You should have the following:
- username: P!N4K4GWAPONG4T4W0
- password: batiUUkagXv2UEndagwayXWqYc2WS8CVaXpangitK0n6+wm
- bucket_name: com.sym.backups.domain
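For reference (this mapping comes from the backup script below, not from the sign-up step itself), the “username” and “password” are really your AWS access key ID and secret access key, which duplicity reads from environment variables, and the bucket is addressed as an s3+http:// URL:
export AWS_ACCESS_KEY_ID=P!N4K4GWAPONG4T4W0
export AWS_SECRET_ACCESS_KEY=batiUUkagXv2UEndagwayXWqYc2WS8CVaXpangitK0n6+wm
# bucket URL that duplicity will use later: s3+http://com.sym.backups.domain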
3.) Install Duplicity (if it's not installed)
Enter the following command (yum is specific to Red Hat and its derivatives, like Fedora):
sudo yum install duplicity
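NOTE: On some systems the S3 backend of duplicity also needs the Python boto library. If the backup later complains about a missing S3 backend, install it alongside duplicity (the exact package name below is an assumption for yum-based distributions):
sudo yum install python-boto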
Now with it installed, we just have to create a script that tells it how to run.
4.) Our Duplicity backup script (backup_aws3)
Here is how we want to set it up:
- Encrypt with our GPG key.
- Backup to an Amazon s3 “bucket” (a bucket on s3 is like a folder).
- Make an incremental backup every day.
- Make a full backup if it’s been more than 2 weeks since our last full backup.
- Remove backups older than one month.
Now, create a file (that will contain the backup script) in your home directory as,
nano backup_aws3
where backup_aws3 is the script's filename and then enter the following lines:
#!/bin/sh
config_file=$1 # Config file specified at command prompt
if [ -e "$config_file" ]
then
echo "--- Reading config file ($config_file) ---"
username=`grep username $config_file | awk '{ print $2 }'`
password=`grep password $config_file | awk '{ print $2 }'`
bucket_name=`grep bucket_name $config_file | awk '{ print $2 }'`
gpg_passphrase=`grep gpg_passphrase $config_file | awk '{ print $2 }'`
gpg_key=`grep gpg_key $config_file | awk '{ print $2 }'`
backup_folder=`grep backup_folder $config_file | awk '{ print $2 }'`
echo "--- done ---"
echo "--- Exporting environment variables ---"
export PASSPHRASE=$gpg_passphrase
export AWS_ACCESS_KEY_ID=$username
export AWS_SECRET_ACCESS_KEY=$password
echo "--- done ---"
echo "--- Looking for backup files older than 1 month for
deletion ---"
# Delete any backup sets older than 1 month (--force is required to actually delete them)
duplicity remove-older-than 1M --force --encrypt-key=$gpg_key --sign-key=$gpg_key s3+http://$bucket_name
echo "--- done ---"
echo "--- Doing a regular backup (will do a full backup if past 14 days) ---"
# Make the regular backup
# Will be a full backup if past the older-than parameter
duplicity --full-if-older-than 14D --encrypt-key=$gpg_key --sign-key=$gpg_key $backup_folder/ s3+http://$bucket_name
echo "--- done ---"
export PASSPHRASE=
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
else
echo "Configuration file does not exist!"
echo "Usage: $0
fi
and then save it. Now, you need to change its file permission mode so that it becomes
executable, by typing the following command:
chmod a+x backup_aws3
NOTE: This script must be invoked with a configuration file (say, backup_aws3.cfg) as its parameter. So create the configuration file as follows:
nano backup_aws3.cfg
where backup_aws3.cfg is the name of the configuration file, and then enter the following lines:
username: P!N4K4GWAPONG4T4W0
password: batiUUkagXv2UEndagwayXWqYc2WS8CVaXpangitK0n6+wm
bucket_name: com.sym.backups.domain
gpg_passphrase: s3cr3t
gpg_key: 2A72DARB
backup_folder: /home/admin/backups
restore_folder: /home/admin/restore
where 2A72DARB is the gpg key ID you generated above. Save the file, and to run the script on the command line, simply type the following:
./backup_aws3 backup_aws3.cfg
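Because backup_aws3.cfg holds your AWS secret key and gpg passphrase, it's worth making it readable only by you (chmod 600 backup_aws3.cfg). You can also confirm that backups are reaching the bucket with duplicity's collection-status command; this check is an extra suggestion, and it needs the same credentials exported that the script uses:
export AWS_ACCESS_KEY_ID=P!N4K4GWAPONG4T4W0
export AWS_SECRET_ACCESS_KEY=batiUUkagXv2UEndagwayXWqYc2WS8CVaXpangitK0n6+wm
export PASSPHRASE=s3cr3t
duplicity collection-status s3+http://com.sym.backups.domain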
5.) Finally, run the script via crontab to automate backups.
To run a backup at 3:30 AM every day, schedule the script via crontab like this:
export EDITOR=nano
crontab -e
and once the nano text editor comes up, enter the following on one line:
30 3 * * * /home/admin/backup_aws3 /home/admin/backup_aws3.cfg > /home/admin/backup_aws3.log 2>&1
and then save it. The backup script /home/admin/backup_aws3 will now be run by crontab at 3:30 AM every day.
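To double-check that the job is in place and see what the most recent run did, you can list your crontab and look at the log file the entry writes to:
crontab -l
cat /home/admin/backup_aws3.log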
RESTORING THE BACKUP FILES
To restore the backup files stored on AWS S3 to the restore_folder directory (see its value in
backup_aws3.cfg), create the following script as,
nano restore_aws3
and then enter the following lines:
#!/bin/sh
config_file=$1 # Config file specified at command prompt
if [ -e "$config_file" ]
then
echo "--- Reading config file ($config_file) ---"
username=`grep username $config_file | awk '{ print $2 }'`
password=`grep password $config_file | awk '{ print $2 }'`
bucket_name=`grep bucket_name $config_file | awk '{ print $2 }'`
gpg_passphrase=`grep gpg_passphrase $config_file | awk '{ print $2 }'`
gpg_key=`grep gpg_key $config_file | awk '{ print $2 }'`
restore_folder=`grep restore_folder $config_file | awk '{ print $2 }'`
echo "--- done ---"
echo "--- Exporting environment variables ---"
export PASSPHRASE=$gpg_passphrase
export AWS_ACCESS_KEY_ID=$username
export AWS_SECRET_ACCESS_KEY=$password
echo "--- done ---"
echo "--- Restoring backup files to $restore_folder directory ---"
duplicity s3+http://$bucket_name $restore_folder/ --encrypt-key=$gpg_key --sign-key=$gpg_key
echo "--- done ---"
export PASSPHRASE=
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
else
echo "Configuration file does not exist!"
echo "Usage: $0
fi
and then save it. Now, you need to change its file permission mode so that it becomes executable, by typing the following command:
chmod a+x restore_aws3
NOTE: This script must also be invoked with the configuration file (say, backup_aws3.cfg) as its
parameter. To restore your backup files from AWS S3 to the restore_folder directory,
run the following command:
./restore_aws3 backup_aws3.cfg
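If you only need a single file or directory back, or the state from a particular point in time, duplicity also supports partial and point-in-time restores via --file-to-restore and -t. The path below (mysql/dump.sql) is only an illustration, and the same environment variables the script exports (the AWS keys and PASSPHRASE) must be set for this to work:
duplicity -t 3D --file-to-restore mysql/dump.sql s3+http://com.sym.backups.domain /home/admin/restore/dump.sql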
IMPORTANT: Since your backup script will be run by your admin account via crontab and
your restore script will be run manually via command line, make sure that the .gnupg sub-directory,
which contains the keys, is present in admin's home directory. Also, the backup/restore script and its configuration file must be referred to using full path specification as
/home/admin/backup_aws3 (or /home/admin/restore_aws3) and
/home/admin/backup_aws3.cfg.
MOST IMPORTANT: BACKUP YOUR KEYS!!!
The duplicity-encrypted backups that you store will NOT be usable unless you have your gpg secret
key. Therefore, it is important that you keep a copy of that key somewhere other than the system on
which your original data resides. To export your public and private (secret) keys, run these
commands:
- For the Public Key, enter:
gpg -ao public-key-domain.com --export 2A72DARB
- For the Private (Secret) Key, enter:
gpg -ao secret-key-domain.com --export-secret-keys 2A72DARB
where 2A72DARB is your gpg key ID.
and then store the "public-key-domain.com" and "secret-key-domain.com" files that
you created somewhere safe, like a CD-ROM in a safe deposit box, etc.
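If you ever have to restore on a fresh machine, import those exported key files back into GnuPG before running the restore script:
gpg --import public-key-domain.com
gpg --import secret-key-domain.com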