LowEndBox - Cheap VPS, Hosting and Dedicated Server Deals

Encrypted backups with duplicity


Backups are among the most important aspects of managing a server. If anything happens to your server, like data loss, a customer making a mistake (and accidentally removing data), or a full-blown server crash, you’ll be happy to have backups. There are plenty of people who don’t make them, though. For some it may feel too complicated, for some it may feel like too much work and others may just not care.

I’m going to show you how to easily make (and restore) encrypted incremental backups with Duplicity. Duplicity is a tool that uses tar, librsync and GnuPG to back up data. All your data is encrypted, so nobody else can look at it (unless they have your GnuPG private key and passphrase). While Duplicity is _officially_ still in beta, it’s been considered stable enough by many (including me) for the past couple of years. In addition to that, it powers the default backup tool in Ubuntu (Déjà Dup, which is built on Duplicity). Duplicity supports the following protocols for connecting to a remote filesystem: ssh/scp, local file access, rsync, ftp, HSI, WebDAV, Tahoe-LAFS, and Amazon S3.

In this guide, I’m going to assume we’re backing up to a backup server using SFTP. All examples should work on both Ubuntu (or other Debian-based distributions) and CentOS (or RHEL/Fedora).
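Since the examples use SFTP, it helps to set up key-based SSH authentication to the backup server first, so Duplicity can connect without prompting for a login password (this is separate from the GnuPG passphrase we’ll create later). A quick sketch, using the example hostname from this guide:

```shell
# Generate an SSH keypair for the backup user (if you don't have one yet):
ssh-keygen -t rsa -b 4096

# Install the public key on the backup server:
ssh-copy-id user@backupserver.example.net

# Verify that you can now log in without a password prompt:
sftp user@backupserver.example.net
```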

# Installing duplicity

First things first. Let’s install duplicity. It’s simple enough. On Ubuntu:

sudo apt-get install duplicity

On CentOS (you need EPEL for this):

yum install duplicity python-paramiko


# Creating a GnuPG key

Duplicity both encrypts and signs your backups with your GnuPG key. Your backups are encrypted for privacy and security and signed for detecting changes (and thus being able to do incremental backups). Creating a GnuPG key is pretty simple. You do need GnuPG installed, though. On Ubuntu the package is named ‘gnupg’ (sudo apt-get install gnupg) and on CentOS ‘gnupg2’ (yum install gnupg2). In my experience GnuPG is almost always present, even on minimal installations.

Let’s get started and create our key. Run this as the user that’s making your backups (usually root):

gpg --gen-key

This will ask you several questions. It starts with the key type and size:

What kind of key you want: ‘RSA and RSA’
What keysize you want: 4096

The default key type is fine and the most secure (RSA with an RSA subkey). For the key length, I always go for 4096 bits these days, as it’s the most secure option.

How long the key must be valid: 0 (infinity)
If you really want a non-expiring key: Y

Usually, I would not recommend a non-expiring key. However, if you want your backups to last, you’re better off with this option. An expired GnuPG key cannot be used for encryption, just for decryption. So from the moment of expiration, all new backups would be encrypted and signed with a different key. This is really undesirable, because it breaks the incremental chain and thus your history. The alternative would be re-encrypting and re-signing your backups every x years (for example) because your key has expired. If you have terabytes of backups, this is not something you want to do.

Let’s go on with the identification for the key. I usually give the key the name “Duplicity Backup”. That way you can always identify the proper key, but feel free to give it another name. The e-mail address is completely up to you as well, as is the comment. The name, e-mail address and comment won’t be used for anything other than identifying your key in a list of keys. Confirm the information at the end.

Real name: “Duplicity Backup”
Email address: operations@example.net
Comment: (none)
Confirmation: O for Okay

Finally, it will ask you for a passphrase (twice). Pick a very secure one here, as (combined with your GnuPG private key) it is the gateway to your data:

Passphrase: <your very secure passphrase>
Repeat passphrase: <your very secure passphrase>

After that, GnuPG needs to generate random bytes, and it needs entropy for that. On my desktop this was not an issue, but on an idling server it proved challenging. Just run a couple of dd tests or do some other heavy work in a different terminal and you should be fine. When done, GnuPG gives you information about the key you’ve just created:

gpg: /root/.gnupg/trustdb.gpg: trustdb created
gpg: key 1731EA9E marked as ultimately trusted
public and secret key created and signed.

gpg: checking the trustdb
gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
gpg: depth: 0 valid: 1 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 1u
pub 4096R/1731EA9E 2013-05-21
Key fingerprint = 867B 4675 2693 CA7C 34D9 E394 FDF1 0274 1731 EA9E
uid Duplicity Backup <operations@example.net>
sub 4096R/79FA3B1E 2013-05-21

The key ID that it lists before ‘marked as ultimately trusted’ on the second line is very important. That’s the public key ID and we will use it to tell Duplicity which key to use for encryption and signing. In this case it’s: 1731EA9E.
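If you ever need that key ID in a script (to avoid hard-coding it, for example), you can pull it from GnuPG’s machine-readable output. A small sketch; note that this prints the long form of the key ID, of which the short ID above is the last 8 characters:

```shell
# List secret keys in machine-readable (colon-separated) format;
# field 5 of a "sec" record is the key ID.
KEYID=$(gpg --list-secret-keys --with-colons | awk -F: '/^sec/{print $5; exit}')
echo "Using GnuPG key: $KEYID"
```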

# Creating backups

Now that we’ve got our key ready, we can start making backups! What we’re going to do is back up a folder of files to a remote filesystem, using SFTP, and then list the files to see what has happened. As an example, I’m going to backup my mission-critical wallpaper collection.

From the directory which contains the folder that you are going to back up, run:

duplicity --encrypt-key=1731EA9E --sign-key=1731EA9E mission-critical-wallpapers sftp://user@backupserver.example.net/mission-critical-wallpapers

What we’ve done here is tell Duplicity to use our recently-created key for encrypting (--encrypt-key) and signing (--sign-key) the backups. In addition to that, we’ve told it which folder on the local filesystem to back up (mission-critical-wallpapers) and where to put it on the remote filesystem (sftp://user@backupserver.example.net/mission-critical-wallpapers). The path on the remote filesystem is relative to your home directory there. When executed, Duplicity starts running and will ask you for your GnuPG passphrase twice (once for encryption and once for signing):

Local and Remote metadata are synchronized, no sync needed.
Last full backup date: none
GnuPG passphrase:
GnuPG passphrase for signing key:
No signatures found, switching to full backup.
--------------[ Backup Statistics ]--------------
StartTime 1369202758.42 (Wed May 22 08:05:58 2013)
EndTime 1369202758.82 (Wed May 22 08:05:58 2013)
ElapsedTime 0.40 (0.40 seconds)
SourceFiles 10
SourceFileSize 6895171 (6.58 MB)
NewFiles 10
NewFileSize 6895171 (6.58 MB)
DeletedFiles 0
ChangedFiles 0
ChangedFileSize 0 (0 bytes)
ChangedDeltaSize 0 (0 bytes)
DeltaEntries 10
RawDeltaSize 6891075 (6.57 MB)
TotalDestinationSizeChange 6809236 (6.49 MB)
Errors 0

Luckily for me, my collection of mission-critical wallpapers is not that large. However, if you have a lot to back up, now is the time to grab some coffee (and do something else while you’re at it). The initial backup can take quite a while, but the incremental backups afterwards will be a lot faster. Duplicity doesn’t display any progress, so after typing your passphrase it just runs. To see some progress, go to the remote filesystem and run ‘watch du -sh’ on the target folder. This will show you how large the target folder is getting.

Incremental backups are pretty smart. Duplicity only backs up what has changed and keeps track of it, so you will always be able to restore a full and complete copy of your data, exactly the way it was the last time you backed it up (or at another moment in time). So if you add one wallpaper to the folder, it will back up just that one wallpaper. When restoring the backup completely, it will include that one wallpaper as well. You can also choose to restore an older version of your backup, from before that wallpaper was added. Let’s add a wallpaper to the folder and run Duplicity again:

duplicity --encrypt-key=1731EA9E --sign-key=1731EA9E mission-critical-wallpapers sftp://user@backupserver.example.net/mission-critical-wallpapers

The output is now different:

Local and Remote metadata are synchronized, no sync needed.
Last full backup date: Wed May 22 08:05:50 2013
GnuPG passphrase:
GnuPG passphrase for signing key:
--------------[ Backup Statistics ]--------------
StartTime 1369202811.55 (Wed May 22 08:06:51 2013)
EndTime 1369202811.99 (Wed May 22 08:06:51 2013)
ElapsedTime 0.44 (0.44 seconds)
SourceFiles 18
SourceFileSize 14414631 (13.7 MB)
NewFiles 9
NewFileSize 7523556 (7.18 MB)
DeletedFiles 0
ChangedFiles 0
ChangedFileSize 0 (0 bytes)
ChangedDeltaSize 0 (0 bytes)
DeltaEntries 9
RawDeltaSize 7519460 (7.17 MB)
TotalDestinationSizeChange 7456505 (7.11 MB)
Errors 0

Compared to the statistics from the initial run, you now see an increased SourceFileSize. NewFiles and NewFileSize give an overview of what has changed. These statistics can actually be quite useful for verifying that your backup has succeeded.
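One more thing worth knowing at this point: an incremental chain can grow very long over time, and a long chain makes restores slower and riskier. Duplicity can start a fresh chain on demand and prune old ones. A sketch using the same target as above (the 3-month retention period is just an example):

```shell
# Force a new full backup instead of another incremental:
duplicity full --encrypt-key=1731EA9E --sign-key=1731EA9E \
    mission-critical-wallpapers \
    sftp://user@backupserver.example.net/mission-critical-wallpapers

# Delete backup chains whose newest backup is older than 3 months.
# Without --force, duplicity only lists what it would delete:
duplicity remove-older-than 3M --force \
    sftp://user@backupserver.example.net/mission-critical-wallpapers
```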

Congratulations! You’ve just made your first backup with Duplicity! Now let’s restore it…

# Restoring backups

Now that we’ve backed up our files and they are safe, we are going to restore the backup to see if it has actually worked. Before we restore, let’s have a look at what is in the backup:

duplicity list-current-files sftp://user@backupserver.example.net/mission-critical-wallpapers

This produces a list of the files in the backup (the most recent version). It does this based solely on the signature files, so it doesn’t have to download or access the complete backup. You’ll get a list that looks like:

Local and Remote metadata are synchronized, no sync needed.
Last full backup date: Wed May 22 08:05:50 2013
Wed May 22 08:06:40 2013 .
Wed May 22 08:04:13 2013 02140_romanbath_2560x1600.jpg
Wed May 22 08:04:14 2013 02141_auroraborealis_2560x1600.jpg
Wed May 22 08:04:13 2013 02142_lakechapalajaliscomexico_2560x1600.jpg
Wed May 22 08:04:13 2013 02143_sonicboom_2560x1600.jpg
Wed May 22 08:04:13 2013 02144_islate_2560x1600.jpg
Wed May 22 08:04:13 2013 02145_powerfulsunrise_2560x1600.jpg
Wed May 22 08:04:13 2013 02146_newyork_2560x1600.jpg
Wed May 22 08:04:12 2013 02147_thefence_2560x1600.jpg
Wed May 22 08:04:14 2013 02148_sunsetintuscanysaturdaymay23rd2009_2560x1600.jpg
Wed May 22 08:06:40 2013 02330_morainelakepanorama_2560x1600.jpg
Wed May 22 08:06:40 2013 02333_firstlight_2560x1600.jpg
Wed May 22 08:06:40 2013 02334_portomoniz89s_2560x1600.jpg
Wed May 22 08:06:40 2013 02335_alpsteinbeforerain_2560x1600.jpg
Wed May 22 08:06:40 2013 02336_leavesatlynncanyonpark_2560x1600.jpg
Wed May 22 08:06:40 2013 02337_seasonofillusions_1920x1200.jpg
Wed May 22 08:06:40 2013 02338_yellowstonesunset_2560x1600.jpg
Wed May 22 08:06:40 2013 02339_thetimeofsilence_2560x1600.jpg

Now, that looks good! Let’s restore these backups:

duplicity restore sftp://user@backupserver.example.net/mission-critical-wallpapers restored-mission-critical-wallpapers

Duplicity will now restore your backups to the local path you’ve given. Once completed, it will give you a summary of the run:

Local and Remote metadata are synchronized, no sync needed.
Last full backup date: Wed May 22 08:05:50 2013
GnuPG passphrase:

And you’re done!
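Restoring everything at once isn’t the only option. Duplicity can also restore a single file (--file-to-restore) or the state from an earlier point in time (-t). A sketch; the file name is taken from the listing above and the 3-day age is just an example:

```shell
# Restore a single file from the most recent backup:
duplicity --file-to-restore 02140_romanbath_2560x1600.jpg \
    sftp://user@backupserver.example.net/mission-critical-wallpapers \
    restored-wallpaper.jpg

# Restore the whole folder as it was 3 days ago:
duplicity -t 3D \
    sftp://user@backupserver.example.net/mission-critical-wallpapers \
    restored-mission-critical-wallpapers-old
```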

# Final note

I’ve now shown you how to do a basic backup and restore with Duplicity. However, it has a lot more options than I’ve just shown. In addition to that, you may want to automate your backups. I’ll go into detail about that in a future article, as it’s a subject of its own. For now, happy backing up!
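As a small preview of automating this: Duplicity reads the GnuPG passphrase from the PASSPHRASE environment variable (and SIGN_PASSPHRASE for the signing key, falling back to PASSPHRASE when it’s not set), which makes unattended runs from cron possible. A minimal sketch; the script path is an example, and since it contains your passphrase you should keep it readable by root only (chmod 700):

```shell
#!/bin/sh
# /usr/local/sbin/duplicity-backup.sh (example path) - nightly backup
export PASSPHRASE='<your very secure passphrase>'
duplicity --encrypt-key=1731EA9E --sign-key=1731EA9E \
    /root/mission-critical-wallpapers \
    sftp://user@backupserver.example.net/mission-critical-wallpapers
unset PASSPHRASE
```

You would then schedule it from root’s crontab, e.g. `0 3 * * * /usr/local/sbin/duplicity-backup.sh`.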



  1. Nice guide. I’m sure many will find this quite useful.

    May 25, 2013 @ 10:32 am | Reply
  2. Definitely a great guide now that there are so many storage and backup VPS offers around.

    May 25, 2013 @ 11:17 am | Reply
    • Maarten Kossen:

      Thanks! And may I use this opportunity to recommend your storage servers. They’re absolutely awesome!

      May 25, 2013 @ 1:46 pm | Reply
  3. Good tutorial. Keep it coming broer.

    May 25, 2013 @ 11:26 am | Reply
  4. pechspilz:

    Using duply as a frontend for duplicity is so much easier.

    May 25, 2013 @ 2:43 pm | Reply
  5. saltspork:

    rdiff-backup is also another good option. Recovering deleted files from old increments is crazy easy and it doesn’t require separate key setup. Normal SSH keys do the trick.

    May 26, 2013 @ 1:35 pm | Reply
  6. florin:

    nice tutorial, encrypted backups are very nice when dealing with sensitive information (sql databases,etc)

    May 26, 2013 @ 9:45 pm | Reply
  7. I used duplicity, but traded it in for obnam which has additional features like multiple backup generations and data de-duplication in addition to incremental GPG backups.

    May 27, 2013 @ 4:15 am | Reply
    • tdc-adm:

Thank you for introducing me to obnam. I went to its website and read the long, good story of its author. It’s worth a try. Good news: it’s available in Debian Wheezy :)

      May 28, 2013 @ 1:42 am | Reply
  8. W1V_Lee:

    Another good one, thanks. Ran it through quickly on my test setup and it worked smoothly.

    June 1, 2013 @ 9:31 am | Reply
  9. I use Duplicity to do backups into Amazon S3, works quite well. I use it along with Backupninja which handles automating the backups.

    June 2, 2013 @ 9:57 am | Reply
  10. gurabli:

    Excellent guide, many thanks!

    One question related to the sftp server. What happens if the sftp server is down and not available when duplicity wants to connect and do a scheduled backup? Is there a way to define the number of retries, or the job will fail and will try to upload backups when the next backup is scheduled?

    I ask this, since my friend is running an sftp server where I backup my data (actually to 3 different places), and by this I can use it for free and have 3 different places copies of my important data. However, they do not keep their PC on all the time.

    Also, what happens if the sftp server goes down when duplicity us uploading? Will it resume on the next time the sftp server is available or not?


    March 20, 2015 @ 7:58 am | Reply
  11. Eliot:

    I think it would be *very* important to back up the GPG private key used for decryption? E.g. on a usb flash drive, and maybe for last resort print it out and store it offsite… ;)

    April 6, 2017 @ 10:09 pm | Reply
