What backup programs/plan do you have?

I recently had a hard-drive issue that finally made me respect the idea of a regular backup plan. So I’m looking for recommendations for both good backup programs and good plans. Something that hopefully will let me set up:

  1. Regular backup of:
    1. data
    2. application settings (maybe a backup program lets me run commands/batches?)
    3. packages/repos installed
    4. anything else important?

I divide my backup system into three parts: first, the /-partition; second, the config files in /home (including my mail folders, Firefox settings etc.); third, random data such as music, videos and pictures.

  1. For the /-partition I use Clonezilla, which is very reliable, easy to use and very fast (the 64-bit version clones my 15GB /-partition in less than 5 minutes). I do this about twice a week; every clone is saved on a second internal HD and on an external HD, and I always keep the last three clones.

  2. The hidden files and folders in /home are backed up via rsync, also because it is pretty fast (incremental backup); a minimal sketch follows after this list. In case someone isn’t comfortable with rsync on the command line, I recommend luckyBackup, which offers pretty much every option of rsync and also lets you save settings in several profiles. I use rsync about two or three times a week and keep the last three syncs (also in the two different locations as above).

  3. The rest (videos, music etc.) is burned to CDs or DVDs whenever my HD becomes full.
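
For item 2, a minimal sketch of such an rsync call, assuming a backup disk mounted at /mnt/backup (the destination path is just an example):

# Incrementally mirror only the hidden files and folders of $HOME;
# -a preserves permissions and times, --delete keeps the copy in sync.
rsync -a --delete "$HOME"/.[^.]* /mnt/backup/home-dotfiles/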

While it is always bad to lose data, I think that losing system data is the most painful. It would take ages to set up my system from scratch.

Thanks. One question: how do you schedule those things to run? Do you do them in cron or in the programs or just do it manually?

Just manually, when I have the time - Clonezilla has to be started from a live system anyway. However, it’s pretty easy to set up an rsync cronjob (once you’ve got the rsync options straight).
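
For illustration, such a cronjob can be as simple as one crontab entry (the script path and schedule here are made up):

# crontab -e: run the rsync backup script every night at 02:30
30 2 * * * /home/user/bin/rsync-backup.sh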

I use Back in Time, available in the repos. It’s basically a front end to rsync, allows you to take snapshots at predefined intervals and is easy to use. It uses hardlinks for files that haven’t changed, which minimises the space used for the backup. It works well and I have been using it for over a year now on all my systems.
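
The same hardlink trick can be reproduced with plain rsync via --link-dest; a rough sketch, with made-up snapshot paths:

# The new snapshot hardlinks unchanged files against the previous one,
# so every snapshot looks complete but only changed files take extra space.
prev=/mnt/backup/snapshots/2010-10-01
new=/mnt/backup/snapshots/2010-10-02
rsync -a --delete --link-dest="$prev" /home/ "$new"/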

I’ll look at luckybackup too out of interest.

Hi,

from my personal experience Clonezilla is a monster in the backup/recovery field; it is like the old Ghost from s***ntec. It’s my favorite!

I have been using this for a couple of years and it is really a great project in my opinion; check it out: flyback - Project Hosting on Google Code

rsync, of course, is a major tool.

I’ll check out that Back in Time though…

My strategy is a monthly image backup (and one just before every major change to the system config) with the free version of Macrium, plus a nightly incremental backup of my music, photos etc. with Handy Backup. The first helps me restore my system quickly, the second keeps my important files safe. The best storage is a RAID5 NAS imho, but if you don’t have one, a simple external 2TB HDD is OK.

A lot depends on how much data you have to back up and how many changes you make in /. If you make few changes in /, then the only thing you may need to look out for is backing up MySQL, which I do by using mysqldump to make a dump in /home, and then backing up /home, which I do weekly or fortnightly depending on how much work I have done recently. This, of course, automatically backs up all my settings, emails etc. As I have less than a DVD’s worth of data in /home, a single DVD suffices and I use the child, parent, grandparent system.
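
For example, the mysqldump step might look something like this (the user and target path are placeholders):

# Dump all databases into /home so the regular /home backup picks them up.
mysqldump --all-databases -u root -p > /home/user/backups/mysql-$(date +%F).sql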

Because a backup of / often takes longer to restore than a fresh install, I no longer bother to back up /. I just remember my configuration and do a fresh install. After problems with KDE 4.5.1, I had / restored with KDE 4.4.4 within an hour or so, including all my MySQL data, but that was the first time in four years that I had done that.

Check out SpiderOak. It’s very useful as it will automatically sync your files. You will need a fast upload though.

Thanks but I don’t trust anyone else with my data.

rsync, automated by cron. Backup of /home, /var, /etc and /srv to an internal disk (for quick access), and to USB disks (two of them, one for every other week).
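
A rough sketch of what the cron-driven part could look like (the destination path is an assumption):

# Nightly cron job: mirror the important trees to the internal backup disk.
for dir in /home /var /etc /srv; do
    rsync -a --delete "$dir" /backup/internal/
done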

Plans:

  • One backup in my server (has all data and configs)
  • One next to my server
  • One at the neighbours
  • One at the other side of town
  • One at some relative in Canada
  • and one in orbit

So, if you spot me in orbit, you will know something has gone terribly wrong :wink:

A restore with Clonezilla is definitely much faster than a reinstall, plus it saves you from having to configure the system from scratch again.

If you want online storage, there is CrashPlan, which includes a Linux client.

The handy thing with them is that they allow backing up not only to the online site (encrypted, compressed, yada yada yada) but to other local computers too. So if you have a server, or even just 2 desktops, you can have them back up to each other AND to online. You could even put CrashPlan on a headless server and have everybody back up onto that one locally.

Quick restores from the local copies, or for catastrophic events (house burns down) you have the online version safely tucked away.

I have to admit, though, I haven’t tried it yet.

I also had a person in the computer club who “tested” his online backup by blowing away all 160 GB of data on his system (after backing it all up online of course) and he said it took him 15 days to restore it all (it worked, just took a looong time). For this reason, having a local copy sounds worth it!

@OP: I keep daily backups for seven days. This is fully automatic, triggered by cron. All you need is 2 small bash scripts calling rsync. For a full description see Can you recommend Backup software, post #8.
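
The core of such a script might look like this sketch with seven rotating weekday folders (paths are assumptions; the linked post has the real details):

#!/bin/bash
# Keep seven rotating daily backups, one folder per weekday (Mon, Tue, ...).
day=$(date +%a)
rsync -a --delete /home/ "/mnt/backup/daily-$day/"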

Only wimps use … backup: real men just upload their important stuff
on ftp, and let the rest of the world mirror it :wink:

Linus Torvalds (1996-07-20)
V2.0.8 - linux.dev.kernel | Google Groups

:slight_smile:

real men just upload their important stuff on ftp, and let the rest of the world mirror it

This may be a valid strategy for websites (I did restore a lost website from the wayback machine a long time ago) but not for my confidential letters :open_mouth:

I use a daily cron rsync to a separate hard drive (timed for meal times - the machine is on, but idle). I prefer a periodic rsync over an immediate mirror because a periodic rsync allows me to recover from any mistakes I make during the day. I also have rsync preserve anything it overwrites in folders named by date-time, which is almost as good as a Back in Time-like duplication, but less intensive on I/O. The essentials are:

target="/usr/local"
location=/mnt/backup/userfiles
datestamp=$(date +%Y%m%d-%H%M%S)
past="past-$datestamp"

nice rsync -ax --delete -b -HA -X --sparse --backup-dir="$location/$past" --stats "$target" "$location"

The script recycles the oldest $location/$past folders based on a space threshold.
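
That recycling step could be sketched roughly like this (the 20GB threshold and the paths are made up):

# Delete the oldest past-* folders until at least 20GB is free on the backup disk.
free_gb() { df -BG /mnt/backup | awk 'NR==2 {gsub("G","",$4); print $4}'; }
while [ "$(free_gb)" -lt 20 ]; do
    oldest=$(ls -d /mnt/backup/userfiles/past-* 2>/dev/null | head -n 1)
    [ -n "$oldest" ] || break
    rm -rf "$oldest"
done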

Once a week I run the same script but target an encrypted volume on an external USB drive. The external drives are normally stored off site. Rsync has the advantage of minimising updates to the USB 2 drive, which is quite slow.
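
A hedged sketch of that weekly external run, assuming a LUKS volume on the USB drive and a backup script that takes the destination as an argument (device, mount point and script name are all made up):

# Unlock and mount the encrypted USB volume, run the backup, then lock it again.
cryptsetup luksOpen /dev/sdb1 usbbackup
mount /dev/mapper/usbbackup /mnt/usb-backup
/usr/local/bin/rsync-backup.sh /mnt/usb-backup
umount /mnt/usb-backup
cryptsetup luksClose usbbackup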

I also use rsync to duplicate the operating system to a second root volume on the backup drive - this has the added advantage of peace of mind when upgrading: with a few edits I can make the second copy bootable and get back to a stable state from which to sort out a way forward. In recent times I’ve started keeping three OS-sized partitions available - that way I can easily experiment.
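
A minimal sketch of that OS duplication, assuming the spare root partition is mounted at /mnt/root2:

# Mirror the running root filesystem to the spare root partition;
# -x stays on the / filesystem, so /proc, /sys, separate /home partitions
# and the destination itself are not descended into.
rsync -aAXHx --delete / /mnt/root2/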

Now that drives are inexpensive it’s quite easy to obtain peace of mind and flexibility for backups and upgrades.

Just to put in my tokens.
I don’t back up at all. Why? Too time consuming for the stuff I have. I would rather say, if you are like me, just make a copy of your /home folder every so often. Other stuff like movies, pictures etc. you may secure on a regular basis on a separate hard drive or a DVD/Blu-ray (or any other external media). But keep in mind that media don’t last forever, so even if you made a copy you will need to copy it again after a long period of time, unless it is stored in a dark spot at good temperatures.
Today, I think, the easiest way to back up is to pop in an external HD and store it in your drawer.
But that’s just my 2 cents.

I have the ZenOK Online Backup software. I downloaded the ZenOK Free Antivirus, then upgraded and paid for the Backup service, and so far I’ve had great results with it. I saved my important files with it.

I have a system with 9 file systems (disk partitions) spread over 3 physical disks. I use dar to back up all file systems except /tmp and swap. Some directories are also excluded, such as /proc. I have a dar backup per file system. I use a mixture of full and incremental backups, and the backups cycle around some external disks; eSATA is much better than USB. I set the system up with some file systems in LVM so that I could use a snapshot backup, but I have not found that this is required in my environment. All disk volumes are labeled and are mounted by label.

With dar you can restore an individual file or a complete file system and the system can be at least lightly used while the backup is running. Be careful if there are files that are always being updated during every backup run. You need to be able to recreate the partitions if you have to restore a complete physical disk so I save the partition layout to the external disk by using fdisk.
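
Roughly, a full dar run, a later differential run and the fdisk layout dump could look like this (the archive names and the external-disk mount point are assumptions):

# Full backup of the /home filesystem to the external disk.
dar -c /mnt/ext/home-full -R /home
# Later: a differential backup, using the full archive as reference.
dar -c /mnt/ext/home-diff1 -R /home -A /mnt/ext/home-full
# Save the partition layout so the disk can be re-partitioned after a failure.
fdisk -l /dev/sda > /mnt/ext/sda-partition-table.txt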

I have completely restored the boot disk (/ + /srv + another file system) from the dar backup. This requires something like the openSUSE live CD, preferably the same version as the dead system. In theory there should be a standalone version of dar for the restore; for some reason this is not shipped in openSUSE. However, you only need the dar executable and its associated library that matches the live CD. I am about to build a recovery live CD or DVD which includes dar. After you have restored the data from dar you will have to install grub into the MBR or wherever it was. If you have replaced the disk you may have to edit menu.lst, but if menu.lst is also using mount by volume label then this may not be necessary. When I did the grub install, grub rebuilt device.map.
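
For the grub step, a hedged example with legacy grub, assuming the restored root is mounted at /mnt and the boot disk is /dev/sda:

# Reinstall legacy grub into the MBR of the restored disk from the live CD.
grub-install --root-directory=/mnt /dev/sda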