
Comparison of backup tools

This question exists because it has historical significance, but it is not considered a good, on-topic question for this site, so please do not use it as evidence that you can ask similar questions here. While you are encouraged to help maintain its answers, please understand that "big list" questions are not generally allowed on Ask Ubuntu and will be closed per the help center.

Backup is incredibly important. Obviously there's no best backup tool, but a comparison of the options would be very interesting.

  • Graphical Interface? Command line?
  • Incremental backups?
  • Automatic backups?
  • Install method: In standard repositories? PPA?

38 Answers


Déjà Dup

Déjà Dup has been installed by default since Ubuntu 11.10. It is a GNOME tool intended for the casual desktop user that aims to be a "simple backup tool that hides the complexity of doing backups the Right Way".

It is a front end to duplicity that performs incremental backups, storing only the changes made since the previous backup. It has options for encrypted and automated backups, and it can back up to local folders, Amazon S3, or any server to which Nautilus can connect.

Integration with Nautilus is superb, allowing for the restoration of files deleted from a directory and for the restoration of an old version of an individual file.

Main Window Screenshot

Restore earlier version of file

Note that as of February 2016 this project appears to be almost completely ignoring bug reports with only minor triage activity and the last bugfix dates back to 2014, though there are new releases with minor changes.


Back in Time

I have been using Back in Time for some time, and I'm very satisfied.

All you have to do is configure:

  • Where to save snapshots
  • Which directories to back up
  • When backups should be made (manually, every hour, every day, every week, every month)

And forget about it.

To install (tested on Ubuntu 16.04, GNOME edition):

sudo add-apt-repository ppa:bit-team/stable
sudo apt-get update
sudo apt-get install backintime-gnome

The program's GUI can be opened from the Ubuntu Dash by searching for "backintime".


Project is active as of August 2019.


rsnapshot vs. rdiff-backup

I often refer to this comparison of rsnapshot and rdiff-backup:

Similarities:

  • both use an rsync-like algorithm to transfer data (rsnapshot actually uses rsync; rdiff-backup uses the python librsync library)
  • both can be used over ssh (though rsnapshot cannot push over ssh without some extra scripting)
  • both use a simple copy of the source for the current backup

Differences in disk usage:

  • rsnapshot uses actual files and hardlinks to save space. For small files, storage size is similar.
  • rdiff-backup stores previous versions as compressed deltas to the current version similar to a version control system. For large files that change often, such as logfiles, databases, etc., rdiff-backup requires significantly less space for a given number of versions.

Differences in speed:

  • rdiff-backup is slower than rsnapshot because of its need to calculate delta files. There are ways to speed it up, though, like the --no-fsync and --no-compression options.

Differences in metadata storage:

  • rdiff-backup stores file metadata, such as ownership, permissions, and dates, separately.

Differences in file transparency:

  • For rsnapshot, all versions of the backup are accessible as plain files.
  • For rdiff-backup, only the current backup is accessible as plain files. Previous versions are stored as rdiff deltas.

Differences in backup levels made:

  • rsnapshot supports multiple levels of backup such as monthly, weekly, and daily.
  • rdiff-backup can only delete snapshots earlier than a given date; it cannot delete snapshots in between two dates.

Differences in support community:

  • rdiff-backup has seen a lot of recent development and bugfixing activity. From December 2019 till spring 2020, rdiff-backup was re-worked into version 2, which supports Python 3.

Supported file systems:

  • rdiff-backup supports all unixoid file systems. FAT32, NTFS and HFS+ are supported too. As of today (July 2020), there are still problems with exFAT.
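As a concrete sketch of the rdiff-backup workflow (throwaway placeholder paths; the classic command syntax used by the 1.x and 2.0 series):

```shell
#!/bin/sh
# Sketch of an rdiff-backup run; SRC and REPO are throwaway paths.
SRC="$(mktemp -d)"            # stand-in for the directory to back up
REPO="$(mktemp -d)/repo"      # stand-in for the backup destination
echo "hello" > "$SRC/file.txt"

if command -v rdiff-backup >/dev/null 2>&1; then
    # Incremental backup; the first run is effectively a full backup.
    rdiff-backup "$SRC" "$REPO"
    # To restore the state from 3 days ago (3D) into another directory:
    #   rdiff-backup --restore-as-of 3D "$REPO" /tmp/restored
    STATUS=ran
else
    STATUS=skipped            # rdiff-backup is not installed here
fi
```

As the comparison above notes, only the latest backup in the repository is a plain mirror; older states are reconstructed from the stored deltas at restore time.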

rsync

If you're familiar with command-line tools, you can use rsync to create (incremental) backups automatically, and it can mirror your directories to other machines. There are lots of scripts available online showing how to do it; set one up as a recurring task in your crontab. There is also a GUI front end for rsync called Grsync that makes manual backups easier.

One very useful example is:

rsync -vahP --delete --backup-dir ../$(date --iso-8601=minutes) <source directory> <destination directory>

Among -vahP, the -a flag is important, as this preserves file permissions and recurses into subdirectories. --backup-dir stores changed and deleted files in the specified backup directory, which is conveniently named after the current date and time.

The idea below stores changed/deleted files with a suffix, which carries the current time/date:

rsync -vahP --delete --backup-dir ../backup --suffix .$(date --iso-8601=minutes) <source directory> <destination directory>

Though rsync is very fast and very versatile, only the most recent backup can be restored in an obvious way.

Another way to preserve deleted files would be using hard links.
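One common way to do that is rsync's --link-dest option: each snapshot directory looks complete, but files unchanged since the previous snapshot are hard links rather than copies. A sketch with throwaway paths:

```shell
#!/bin/sh
# Hard-link snapshot sketch; all paths here are throwaway placeholders.
SRC="$(mktemp -d)"              # stand-in for the source directory
BACKUP_ROOT="$(mktemp -d)"      # stand-in for the backup disk
echo "data" > "$SRC/a.txt"

if command -v rsync >/dev/null 2>&1; then
    # First snapshot: a plain copy.
    rsync -a "$SRC/" "$BACKUP_ROOT/snap-1/"
    # Second snapshot: unchanged files are hard-linked against snap-1,
    # so each snapshot looks complete but shares disk space.
    rsync -a --link-dest="$BACKUP_ROOT/snap-1" "$SRC/" "$BACKUP_ROOT/snap-2/"
    STATUS=ran
else
    STATUS=skipped
fi
```

This is essentially what rsnapshot automates, including rotating the snap-N directories for you.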



Duplicity

Duplicity is a feature-rich command line backup tool.

Duplicity backs up directories by producing encrypted tar-format volumes and uploading them to a remote or local file server. It uses librsync to record incremental changes to files, gzip to compress them, and gpg to encrypt them.

Duplicity's command line can be intimidating, but there are many frontends to duplicity, from command line (duply), to GNOME (deja-dup), to KDE (time-drive).
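For simple cases, though, plain duplicity is manageable. A hedged sketch with throwaway paths (--no-encryption only avoids the GPG passphrase prompt for this demo; real backups should stay encrypted):

```shell
#!/bin/sh
# Duplicity sketch; SRC and TARGET are throwaway paths.
SRC="$(mktemp -d)"
TARGET="$(mktemp -d)"
echo "doc" > "$SRC/notes.txt"

if command -v duplicity >/dev/null 2>&1; then
    # Full backup on the first run, incremental on later runs.
    duplicity --no-encryption "$SRC" "file://$TARGET"
    # To restore (source and target URLs swap roles):
    #   duplicity --no-encryption "file://$TARGET" /tmp/restored
    STATUS=ran
else
    STATUS=skipped        # duplicity is not installed here
fi
```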


Dropbox

A cross-platform (proprietary) cloud sync service for Windows, Mac, and Linux. 2 GB of online storage is free, with paid options. Advertised as a way to "store, sync, and share files online", but it can be used for backup purposes too.

Note that even on paid accounts revision history is limited to one year and on free accounts it is only one month.

Note also that restoring a large number of files may be very time-consuming, as Dropbox was not built as a backup tool.

Dropbox in use on Ubuntu


luckyBackup

It hasn't been mentioned before, so I'll pitch in: luckyBackup is a superb GUI front end to rsync and makes taking simple or complex backups and clones a total breeze.

Note that this tool is no longer developed.

The all-important screenshots can be found on the project's website, with one shown below:

luckyBackup

Note: As of 2021-01, the last release of luckyBackup was on 2018-11


BackupPC

If you want to back up your entire home network, I would recommend BackupPC running on an always-on server in your basement/closet/laundry room. From the backup server, it can connect via SSH, rsync, SMB, and other methods to any other computer (not just Linux computers), and back them all up to the server. It implements incremental storage by merging identical files via hardlinks, even if the identical files were backed up from separate computers.

BackupPC runs a web interface that you can use to customize it, including adding new computers to be backed up, initiating immediate backups, and most importantly, restoring single files or entire folders. If the BackupPC server has write permissions to the computer that you are restoring to, it can restore the files directly to where they were, which is really nice.

BackupPC Web Interface - Server Status Page


bup

A "highly efficient file backup system based on the git packfile format. Capable of doing fast incremental backups of virtual machine images."

Highlights:

  • It uses a rolling checksum algorithm (similar to rsync) to split large files into chunks. The most useful result of this is you can backup huge virtual machine (VM) disk images, databases, and XML files incrementally, even though they're typically all in one huge file, and not use tons of disk space for multiple versions.
  • Data is "automagically" shared between incremental backups without having to know which backup is based on which other one - even if the backups are made from two different computers that don't even know about each other. You just tell bup to back stuff up, and it saves only the minimum amount of data needed.
  • Bup can use "par2" redundancy to recover corrupted backups even if your disk has undetected bad sectors.
  • You can mount your bup repository as a FUSE filesystem and access the content that way, and even export it over Samba.
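A minimal bup session follows its index-then-save workflow (a sketch with throwaway paths; BUP_DIR just keeps the demo repository out of $HOME):

```shell
#!/bin/sh
# bup sketch: init a repository, index a tree, save a deduplicated backup.
SRC="$(mktemp -d)"
export BUP_DIR="$(mktemp -d)/bup-repo"
echo "vm image bytes" > "$SRC/disk.img"

if command -v bup >/dev/null 2>&1; then
    bup init                       # create the packfile-based repository
    bup index "$SRC"               # record what to save
    bup save -n mybackup "$SRC"    # chunked, deduplicated save
    # Later index+save runs store only the chunks that changed.
    STATUS=ran
else
    STATUS=skipped                 # bup is not installed here
fi
```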

CrashPlan

CrashPlan is a company providing business backup; it no longer offers plans for individual users.

Features

  • $10/month/device fee
  • Triple destination data storage and protection
  • Silent and continuous
  • Generous retention and versioning
  • Deleted file protection

I had considered a bunch of options and configurations (using rdiff-backup, duplicity, backup-ninja, amazon s3, remote server). What it finally came down to was simplicity.

CrashPlan is cross platform, but not open source.

It's also worth noting that with a (paid) CrashPlan Central 'family' plan you can backup all the computers you own.


Bacula

I used Bacula a long time ago. Although you would have to learn its architecture, it's a very powerful solution. It lets you do backups over a network and it's multi-platform. You can read here about all the cool things it has, and here about the GUI programs that you can use for it. I deployed it at my university. When I was looking for backup solutions I also came across Amanda.

One good thing about Bacula is that it uses its own implementation for the files it creates. This makes it independent from a native utility's particular implementation (e.g. tar, dump...).

When I used it there weren't any GUIs yet. Therefore, I can't say if the available ones are complete and easy to use.

Bacula is very modular at its core. It consists of three configurable, stand-alone daemons:

  • file daemon (takes care of actually collecting files and their metadata in a cross-platform way)
  • storage daemon (takes care of storing the data, be it on HDDs, DVDs, tapes, etc.)
  • director daemon (takes care of scheduling backups and central configuration)

There is also an SQL database involved for storing metadata about Bacula and its backups (PostgreSQL, MySQL, and SQLite are supported).

The bconsole binary is shipped with Bacula and provides a CLI for Bacula administration.


tar

tar, a simple and reliable tool for archiving files, can also be used for backups. Today we have better and faster backup tools with more useful features, but depending on your needs, tar can still be useful.

Create a full backup of your home directory:

cd to the directory where you want to store the backup file, and then:

tar --create --verbose --file backup.tar <path to the home directory>

For subsequent backups, we want to avoid a full backup - because it takes too much time. So we simply update the files in backup.tar:

Again, cd to the directory where the backup file is, and then use --update:

tar --update --verbose --file backup.tar <path to the home directory>

All files that are either new or have been modified will be saved in backup.tar. Deleted files will be kept. To restore the most recent backup, right-click on the file and choose "Extract to...". To retrieve older versions of your files, you have to open backup.tar, and find the files (and versions) you want to restore.

Note: You cannot use --update on a compressed tar file (e.g. .tar.gz).
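GNU tar also has proper incremental support via snapshot (.snar) files, which sidesteps the --update limitations above: each run produces a separate archive containing only what changed. A sketch with throwaway paths:

```shell
#!/bin/sh
# GNU tar incremental sketch using a --listed-incremental snapshot file.
SRC="$(mktemp -d)"
OUT="$(mktemp -d)"
echo "one" > "$SRC/a.txt"

# Level-0 (full) backup; tar records the file state in state.snar.
tar --create --file="$OUT/level0.tar" \
    --listed-incremental="$OUT/state.snar" -C "$SRC" .

# Change something, then take a level-1 backup against a COPY of the
# snapshot file (copying preserves level 0's state for future level-1 runs).
echo "two" > "$SRC/b.txt"
cp "$OUT/state.snar" "$OUT/state1.snar"
tar --create --file="$OUT/level1.tar" \
    --listed-incremental="$OUT/state1.snar" -C "$SRC" .
```

Note this requires GNU tar; the level-1 archive contains only the new b.txt, and restoring means extracting level 0 and then each level 1 in order.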

Simple Backup

Note: As of 2021-01 the last release was in 2013.

Simple Backup is another tool to back up your files and keep a revision history. It is quite efficient (with full and incremental backups) and does not take up too much disk space for redundant data. It gives you historical revisions of files à la Time Machine (a feature Back in Time, mentioned earlier, also offers).

Features:

  • easy to set-up with already pre-defined backup strategies
  • external hard disk backup support
  • remote backup via SSH or FTP
  • revision history
  • clever auto-purging
  • easy scheduling
  • user- and/or system-level backups


As you can see, the feature set is similar to the one offered by Back in Time.

Simple Backup fits well in the Gnome and Ubuntu Desktop environment.


DAR

DAR - the Disk ARchive program - is a powerful command-line backup tool supporting incremental backups and restores. If you want to back up a lot of files, it may be considerably faster than rsync-like (rolling-checksum) solutions.
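A minimal DAR session might look like this (a sketch with throwaway paths; dar writes slice files named like full.1.dar, and -A makes the second run differential against the first):

```shell
#!/bin/sh
# DAR sketch: a full archive, then a differential one referencing it.
SRC="$(mktemp -d)"
OUT="$(mktemp -d)"
echo "v1" > "$SRC/report.txt"

if command -v dar >/dev/null 2>&1; then
    dar -c "$OUT/full" -R "$SRC" -q                    # full backup
    echo "v2" > "$SRC/report.txt"
    dar -c "$OUT/diff1" -R "$SRC" -A "$OUT/full" -q    # changes only
    STATUS=ran
else
    STATUS=skipped                                     # dar is not installed here
fi
```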

Attic Backup / Borg Backup

Note: As of 2021-01 the last release was on 2015.

Attic is a deduplicating backup program written in Python. Its main goal is to provide an efficient and secure way to back up data. The data deduplication technique used makes Attic suitable for daily backups, since only the changes are stored.

Main Features:

  • Easy to use
  • Space efficient storage: Variable block size deduplication is used to reduce the number of bytes stored by detecting redundant data.
  • Optional data encryption: All data can be protected using 256-bit AES encryption and data integrity and authenticity is verified using HMAC-SHA256.
  • Off-site backups: Attic can store data on any remote host accessible over SSH
  • Backups mountable as filesystems: Backup archives are mountable as userspace filesystems for easy backup verification and restores.

Requirements:

Attic requires Python >=3.2. Besides Python, Attic also requires msgpack-python and OpenSSL (>= 1.0.0). In order to mount archives as filesystems, llfuse is required.

Note:

There is also now a fork of Attic called Borg.

SpiderOak

A Dropbox-like backup/syncing service with comparable features.

  • Access all your data in one de-duplicated location
  • Configurable multi-platform synchronization
  • Preserve all historical versions & deleted files
  • Share folders instantly in web ShareRooms with RSS
  • Retrieve files from any internet-connected device
  • Comprehensive 'zero-knowledge' data encryption

Listed supported systems: Debian Lenny, OpenSUSE, RPM-Based (Fedora, etc.), CentOS/RHEL, Ubuntu Lucid Lynx, Ubuntu Gutsy Gibbon, Ubuntu Karmic Koala, Ubuntu Maverick Meerkat, Ubuntu Intrepid Ibex, Debian Etch, Ubuntu Hardy Heron, Slackware 12.1, Ubuntu Jaunty Jackalope



FlyBack

Warning: Unmaintained, last update in 2010.

Similar to Back in Time

Apple's Time Machine is a great feature in their OS, and Linux has almost all of the required technology already built in to recreate it. This is a simple GUI to make it easy to use.

FlyBack v0.4.0


Jungledisk (paid application)

Jungledisk is a winner as far as I'm concerned. It backs up remotely to an optionally encrypted Amazon S3 bucket, it's customisable, and it can run in the background (there are various guides available for setting that up). There's a decent UI, or you can hack an XML file if you're feeling so inclined.

I back up all of my home machines with the same account, no problem. I can also remotely access my backed-up data via myjungledisk.com.

It's not free, but in US terms it's certainly cheap enough (I pay around $8 a month). I feel that's more than acceptable for an offsite backup where someone else deals with hardware and (physical) security etc issues.

I can't recommend it enough.


Areca Backup

Warning: Unmaintained, last release in 2015.

Areca Backup is also a very decent GPL program for making backups easily.

Features

  • Archives compression (Zip & Zip64 format)
  • Archives encryption (AES128 & AES256 encryption algorithms)
  • Storage on local hard drive, network drive, USB key, FTP / FTPs server (with implicit and explicit SSL / TLS)
  • Source file filters (by extension, subdirectory, regular expression, size, date, status, with AND/OR/NOT logical operators)
  • Incremental, differential and full backup support
  • Support for delta backup (store only modified parts of your files)
  • Archive merges: you can merge contiguous archives into one single archive to save storage space.
  • As-of-date recovery: Areca allows you to recover your archives (or single files) as of a specific date.
  • Transaction mechanism: all critical processes (such as backups or merges) are transactional. This guarantees your backups' integrity.
  • Backup reports: Areca generates backup reports that can be stored on your disk or sent by email.
  • Post-backup scripts: Areca can launch shell scripts after backup.
  • Files permissions, symbolic links and named pipes can be stored and recovered. (Linux only)

I run a custom Python script which uses rsync to save my home folder (minus trash etc.) onto a folder labelled "current" on a separate backup HDD (connected by USB), and then the copy (cp) command to copy everything from "current" onto a date-time-stamped folder on the same HDD.

The beautiful thing is that each snapshot holds every file in your home folder as it was at that time, and yet the HDD doesn't just fill up unnecessarily. Because most files never change, there is only ever one actual copy of each such file on the HDD; every other reference to it is a hard link. And if a newer version of a file is added to "current", the existing snapshots still point to the single older version of the original - the file system takes care of that by itself.

Although there are all sorts of refinements in the script, the main commands are simple. Here are a few of the key ingredients:

exclusions="/home/.../exclusions.txt"   # don't back up trash etc.
media_path="/media/..."                 # a long path with the HDD details and the "current" folder
rsync -avv --progress --delete --exclude-from="$exclusions" /home/username/ "$media_path"
current="..."                           # the "current" folder on the HDD
dest="..."                              # the timestamped folder on the HDD
cp -alv "$current" "$dest"

I had some custom needs as well. Because I have multiple massive (e.g. 60GB) VirtualBox disk images, I only ever wish to have one copy of those, not snapshot versions. Even a 1 or 2 TB HDD has limits.

Here are the contents of my exclusions file. The file is very sensitive to missing terminal slashes etc:

/.local/share/Trash/
/.thumbnails/
/.cache/
/Examples/

Duplicati

An open-source, gratis backup application running on Linux, with a GUI, that "securely stores encrypted, incremental, compressed backups on cloud storage services and remote file servers. It works with Amazon S3, Windows Live SkyDrive, Google Drive (Google Docs), Rackspace Cloud Files or WebDAV, SSH, FTP (and many more)".

Version 1.0 is considered stable; version 2, with considerable internal changes, is in development and currently working (though I wouldn't use it for production). There are standard or custom filter rules to select files to back up.

I have been using it for years, though infrequently, on both a Windows laptop and my Ubuntu 14.04 install. (I'm not connected to anyone there, but speaking as a developer I have considered looking at the API to add a backend.)

A fork of duplicity.

Dirvish

Note: As of 2021-01 the last release was in 2005.

A nice command-line snapshot backup tool which uses hardlinks to reduce disk space. It has a sophisticated way to purge expired backups.


BorgBackup is a CLI tool and, with Vorta as its GUI, does everything you need and more. There is even a PPA for BorgBackup itself.

The main difference between BorgBackup and any other backup solution is that it's a deduplicating backup solution:

E.g. if you have multiple copies of a single file, that file will take up space only once.

  1. Install BorgBackup:

    sudo add-apt-repository ppa:costamagnagianfranco/borgbackup
    sudo apt update
    sudo apt install borgbackup
  2. Install Vorta:

    pip install vorta
  3. Initialize the backup repository:

    borg init --encryption=repokey-blake2 /media/ExternalHDD/{user}
  4. Click the Vorta icon to open the GUI and configure your backups.
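Once a repository exists, the actual backups are made with borg create (a hedged sketch with throwaway paths; the repository here is unencrypted only to keep the demo non-interactive - for real use keep repokey-blake2 as above):

```shell
#!/bin/sh
# Borg sketch: initialize a repo, then create a deduplicated archive.
SRC="$(mktemp -d)"
REPO="$(mktemp -d)/repo"
echo "photo" > "$SRC/img.jpg"

if command -v borg >/dev/null 2>&1; then
    borg init --encryption=none "$REPO"
    borg create "$REPO::snapshot-1" "$SRC"    # deduplicated archive
    # Typical follow-ups:
    #   borg list "$REPO"
    #   borg prune --keep-daily 7 --keep-weekly 4 "$REPO"
    STATUS=ran
else
    STATUS=skipped                            # borg is not installed here
fi
```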

PING is a no-nonsense free backup tool that lets you make backups of entire partitions. It is a standalone utility that should be burned to a CD.

What I like about this program is that it copies the entire partition. Imagine this: while modifying your Ubuntu as a superuser, you changed a vital part and Ubuntu won't start up anymore.

You could format the hard disk and reinstall Ubuntu. While backup solutions such as Dropbox, Ubuntu One, etc. might be useful for retrieving your important files, they won't restore your wallpaper, Unity icons, and the other things that made your Ubuntu the way you liked it.

Another option is to ask for help on the internet. But why not just restore the whole system to the way it was a few days ago? PING will do exactly this for you.

Pros:

  • Will not only backup documents, but system files as well
  • It's easy to use
  • It is possible to backup other (non-Linux) partitions as well
  • It will compress the backup in gzip or bzip2 format, saving disk space

Cons:

  • The PC will have to be restarted before being able to backup
  • PING will make a backup of an entire partition, even when only a few files have been modified
  • You'll need an external hard drive or some free space on your PC to put your backups

An excellent Dutch manual can be found here.

S3QL is a more recent option for using Amazon S3, Google Storage, or OpenStack Storage as a file system. It works on a variety of Linux distros as well as Mac OS X.

Using it with rsync, you can get very efficient incremental offsite backups since it provides storage and bandwidth efficiency via block-level deduplication and compression. It also supports privacy via client-side encryption, and some other fancy things like copy-on-write, immutable trees and snapshotting.

See Comparison of S3QL and other S3 file systems for comparisons with PersistentFS, S3FS, S3FSLite, SubCloud, S3Backer and ElasticDrive.

I've been using it for a few days, starting from s3_backup.sh (which uses rsync), and am quite happy. It is very well documented and seems like a solid project.

TimeVault

Warning: unmaintained

TimeVault is a tool to make snapshots of folders and comes with Nautilus integration. Snapshots are protected from accidental deletion or modification, since they are read-only by default.

Can be downloaded from Launchpad.

inosync

A Python script that offers a more-or-less real-time backup capability.

Note that this software is no longer maintained.

"I came across a reference to the “inotify” feature that is present in recent Linux kernels. Inotify monitors disk activity and, in particular, flags when files are written to disk or deleted. A little more searching located a package that combines inotify's file event monitoring with the rsync file synchronization utility in order to provide the real-time file backup capability that I was seeking. The software, named inosync, is actually a Python script, effectively provided as open-source code, by the author, Benedikt Böhm from Germany ()."

Obnam

Warning: Software is no longer maintained, authors recommend not using it

'Obnam is an easy, secure backup program. Backups can be stored on local hard disks, or online via the SSH SFTP protocol. The backup server, if used, does not require any special software on top of SSH.

Some features that may interest you:

  • Snapshot backups. Every generation looks like a complete snapshot, so you don't need to care about full versus incremental backups, or rotate real or virtual tapes.
  • Data de-duplication, across files, and backup generations. If the backup repository already contains a particular chunk of data, it will be re-used, even if it was in another file in an older backup generation. This way, you don't need to worry about moving around large files, or modifying them.
  • Encrypted backups, using GnuPG.'

An old version can be found in the Ubuntu software sources; for the newest version, refer to Chris Cormack's PPA or Obnam's website.

saybackup and saypurge

There is a nice script called saybackup which allows you to do simple incremental backups using hardlinks. From the man page:

This script creates full or reverse incremental backups using the rsync(1) command. Backup directory names contain the date and time of each backup run to allow sorting and selective pruning. At the end of each successful backup run, a symlink '*-current' is updated to always point at the latest backup. To reduce remote file transfers, the '-L' option can be used (possibly multiple times) to specify existing local file trees from which files will be hard-linked into the backup.

The corresponding script saypurge provides a clever way to purge old backups. From the home page of the tool:

Sayepurge parses the timestamps from the names of this set of backup directories, computes the time deltas, and determines good deletion candidates so that backups are spaced out over time most evenly. The exact behavior can be tuned by specifying the number of recent files to guard against deletion (-g), the number of historic backups to keep around (-k) and the maximum number of deletions for any given run (-d). In the above set of files, the two backups from 2011-07-07 are only 6h apart, so they make good purging candidates...

backup2l

Warning: unmaintained, last commit on 2017-02-14

From the homepage:

backup2l is a lightweight command line tool for generating, maintaining and restoring backups on a mountable file system (e.g. hard disk). The main design goals are low maintenance effort, efficiency, transparency and robustness. In a default installation, backups are created autonomously by a cron script.

backup2l supports hierarchical differential backups with a user-specified number of levels and backups per level. With this scheme, the total number of archives that have to be stored only increases logarithmically with the number of differential backups since the last full backup. Hence, small incremental backups can be generated at short intervals while time- and space-consuming full backups are only sparsely needed.

The restore function allows you to easily restore the state of the file system, or of arbitrary directories/files, as of previous points in time. The ownership and permission attributes of files and directories are correctly restored.

An integrated split-and-collect function allows you to comfortably transfer all or selected archives to a set of CDs or other removable media.

All control files are stored together with the archives on the backup device, and their contents are mostly self-explaining. Hence, in the case of an emergency, a user does not only have to rely on the restore functionality of backup2l, but can - if necessary - browse the files and extract archives manually.

For deciding whether a file is new or modified, backup2l looks at its name, modification time, size, ownership and permissions. Unlike other backup tools, the i-node is not considered in order to avoid problems with non-Unix file systems like FAT32.
