Best backup application to use

Just want to see what's out there that might be a good solution for backing up my /home. Background: after an issue/bug/failure, and having only done manual backups (by copying files to a NAS), I'm looking for an automated solution.

Long story short: I deleted an MP3 through Amarok and every file in /home on my HD was deleted. The folders were still there, but no files. I only lost a few things, since most of it was already backed up manually. Good news - I finally updated to Leap 15.1, lol!

Does anyone have any recommendations for an automated backup solution?


Have a look at 'luckybackup'. It's rsync-based and can handle 'snapshots' (not the btrfs ones), i.e. keep previous versions. Some friends use it to back up to their NAS.

I know a few people, both business and home users, who now simply run TAR scripts.
TAR (Tape Archive) is used today for many more backup scenarios than tape machines…
Typically the scripts can be very short, easy to read, easy to understand, and so easy to modify.
And incremental backups are supported, which can be a big space saver if you're making large numbers of frequent backups.

Probably the biggest advantage of backups created with TAR is that you can create and restore them using practically any version of Linux from the past 30 years. TAR is always included in every distro (and exists on other OSes) and hasn't really changed across all that time. In other words, you can easily restore a TAR archive made 10 years ago without worrying about the version of the backup app, or what OS and version the archive was made on.

There are plenty of scripts to choose from, run a few and see what they do…
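To give an idea of how short such a script can be, here is a minimal sketch using GNU tar's `--listed-incremental` option; the function name and all paths are just placeholders to adapt:

```shell
#!/bin/bash
# Sketch of an incremental backup using GNU tar's --listed-incremental.
# tar_backup SRC DEST: full (level-0) backup on the first run, and only
# files changed since the last run on later runs.
tar_backup() {
  local src=$1 dest=$2
  local snar="$dest/backup.snar"          # tar's incremental state file
  local stamp
  stamp=$(date +%Y%m%d-%H%M%S)
  mkdir -p "$dest"
  # If the .snar file does not exist yet, tar makes a full backup and
  # creates it; afterwards it records what changed between runs.
  tar --create \
      --listed-incremental="$snar" \
      --gzip \
      --file="$dest/backup-$stamp.tar.gz" \
      -C "$(dirname "$src")" "$(basename "$src")"
}

# A nightly cron job might then simply call, for example:
#   tar_backup /home /mnt/nas/backups
```

Deleting the `.snar` file forces the next run to be a full backup again; restoring means extracting the full archive first and then each incremental in date order (with `--listed-incremental=/dev/null` on extraction, as the GNU tar manual suggests).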


Knurpht, TSU - thanks for the recommendations. I'll read up on both of them and check them out.

I just rsync to an external USB drive, and some things I spread across multiple machines with Syncthing, though I may go back to using Nextcloud; this seemed simpler. There are a number of backup tools, but many already use rsync as an underlying driver, and some also use hardlinks to stage backups as time-dated snapshots, though I am not interested in dated-snapshot recovery. My only suggestion: do not back up onto the same drive ;).

Some years ago, when using a different distro, I also used luckybackup, but somehow things didn't go the way I was used to with Grsync. Luckybackup did something differently and I didn't like it. Unfortunately it is too long ago to remember exactly what; I think it had something to do with permissions being set to root instead of to my user.

Grsync is a simple-to-use GUI for the powerful rsync command-line package. It simply copies everything you tell it to onto your backup disk (or folder, or wherever you want it). It is a one-to-one backup, meaning the backup has the same folder structure as the original disk. This makes it easy to restore a file when needed: just open the file manager, browse to the correct path, and copy the file you need back to its original position. With a TAR archive, as Tsu2 suggested, this is more difficult; on the other hand, a TAR backup will probably be smaller, which can be an advantage.
I have used Grsync for many years and it is still a very powerful program which simply does what it needs to do: create backups.


  • The 1st RAID system I saw was an IBM mainframe in the early 1980s – it had a “disk farm” of washing-machine-sized drives, but no magnetic tapes and no removable disk packs …
  • Doesn’t fix the “accidental deletion” issue though …

To handle “accidental deletion” you’ll need a separate disk which, assuming it’s permanently attached to the system and permanently powered on, can be used with the other suggestions in this post.

  • For sensitive content, I recommend a DVD-RAM which is attached to the system only when the backups are being executed …

As @dcurtisfra already hints at, you should first make a backup policy. After that you can try to find the technical solution to implement that policy.

The first question is what sort of accidents/disasters you want to cover. A few suggestions to think about (some were already mentioned above):

  • accidental loss of a file;
  • loss of a file system;
  • loss of a disk;
  • loss of a system (burned out);
  • loss of the whole location (burned down house, including backup system/set of off-line mass-storage backup media).

Another question, especially when you offer backups of user files to your users: do you back up only the last version, or the last x versions, and at what frequency? In combination this may lead to keeping seven backups made over the last seven nights, plus 52 backups kept from every Saturday night, and maybe one per year made on 31 December for your tax administration. In short, only your imagination, but also only your responsibility, counts here.
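As an illustration of such a retention policy, here is a rough sketch of a prune function for date-named archives. The file-naming scheme, the directory, and the exact windows (7 daily, a year of Saturdays) are all assumptions; it relies on GNU `date`:

```shell
#!/bin/bash
# Sketch of a retention rule for backup files named backup-YYYY-MM-DD.tar.gz.
# prune_backups DIR: keep everything from the last 7 days, keep Saturday
# backups for a year, and delete the rest.
prune_backups() {
  local dir=$1 f day age_days weekday
  for f in "$dir"/backup-*.tar.gz; do
    [ -e "$f" ] || continue                       # no matches at all
    day=$(basename "$f" | sed 's/backup-\(.*\)\.tar\.gz/\1/')
    age_days=$(( ( $(date +%s) - $(date -d "$day" +%s) ) / 86400 ))
    weekday=$(date -d "$day" +%u)                 # 6 = Saturday
    if [ "$age_days" -le 7 ]; then
      continue                                    # inside the daily window
    elif [ "$weekday" -eq 6 ] && [ "$age_days" -le 365 ]; then
      continue                                    # a Saturday, inside a year
    fi
    rm -- "$f"
  done
}
```

Dedicated tools implement the same idea with options like “keep N daily / N weekly”, but the logic behind them is no more than this.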

Nice contemplating :)


  • accidental loss of a file;

This is a rather delicate issue:

  • For ext3/ext4 Filesystems, there’s a package named “extundelete”.
  • For Btrfs you may, possibly, be saved by “Snapper” …
  • For everything else, from a CLI viewpoint, “rm” can’t be undone …
  • From a GUI view, >99% of the current GUIs have a “Waste bin” or “Rubbish bin” …

I think in this case “accidental loss of a file” does not include a file being moved to the Trash bin of a GUI.
But I think it does include a file that is changed (maybe beyond recognition) by some tool (an editor, LibreOffice) and written back. The case where the user cries: “Oh no, now I have lost three chapters of my book!” The original will not be in the Trash bin, and will not be recoverable by some file-system “undelete” tool. It will probably only be possible to get it back from yesterday’s backup.

Thought I’d pop in here and describe what I do. For my laptops and my home directories, I use borg. It’s in the repos. I prefer it because it supports encryption (save your key!), compression, and deduplication. I can also FUSE-mount the backups and pull files out.
I have a script, copied from their site, that runs from my cron.daily; it handles rotation and removes older backups.
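For anyone curious, a daily borg job boils down to two commands. This is a sketch loosely modeled on the example in borg's documentation; the repository path, archive naming, passphrase handling, and retention numbers are all placeholders to adapt:

```shell
#!/bin/bash
# Sketch of a daily borg backup of /home (assumes the repository was
# created once beforehand with: borg init --encryption=repokey REPO).
borg_daily() {
  local repo=/mnt/backup/borg-repo       # assumed repository location
  export BORG_PASSPHRASE='changeme'      # better: read from a root-only file

  # Create a dated, compressed, deduplicated archive of /home.
  borg create --stats --compression lz4 \
      "$repo::home-$(date +%Y-%m-%d)" /home

  # Rotation: keep 7 daily, 4 weekly, and 6 monthly archives.
  borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 "$repo"
}
```

Calling `borg_daily` from a script in `/etc/cron.daily/` gives the nightly run, and `borg mount` provides the FUSE view for pulling individual files back out.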

A lot of my other stuff is stored on a “data” drive that is shared out to the other home users. They back up their own computers to this share, and I keep pictures, music, videos, etc. on it.
I back this up to a 2nd internal drive: I have a script that mounts the 2nd drive, runs btrfsync, then takes a snapshot and unmounts the 2nd drive. This gives me incrementals and is relatively safe. A 2nd script grabs the most important stuff and ships it off to a cloud server.