Backup in Linux?

Dear all,
I have a 500 GB SATA disk that I want to use for backing up files from my
a. Windows system
b. Linux system

I would like to know if there is

  1. any program that can periodically do backups into a compressed archive (from which everything can be restored)

  2. a) any special filesystem that compresses data transparently as it is written, and b) how well this works with files from my NTFS partition

I would like to thank you in advance for your help.

Regards
Alex

You can use rsync with compression turned on and set up a cron job for it; after the first run, rsync only copies the differences.
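A minimal sketch of that setup (the paths are examples, not from the thread; note that rsync's -z compresses the data in transit only, the files land uncompressed on the backup disk):

```shell
#!/bin/sh
# Mirror /home/alex to the backup disk (example paths).
# -a preserves permissions, ownership and timestamps,
# -z compresses the data stream, --delete mirrors removals.
rsync -az --delete /home/alex/ /media/backupdisk/home/

# To run it nightly at 02:30, add a line like this via `crontab -e`:
# 30 2 * * * rsync -az --delete /home/alex/ /media/backupdisk/home/
```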

Use YaST to make the backup.

Let’s say that I want to back up my /home directory and save it to a file called mybackuptest.ext (.ext == your extension). Can you show me how to make rsync store only the differences between the previous snapshots? Will it overwrite the old mybackuptest.ext with a new one, so I will only have one file to save?

Or is the case that I have one file with many ‘patches’ to apply to get to the most recent version?
Regards
Alex

Could you please comment a little bit on the process? I made a new profile and I have to select which folders to include. Which ones do you suggest?

Read this SDB:Home backup - openSUSE.

I use luckyBackup with great satisfaction: easy, very configurable, very good :slight_smile:

I use dar to back up Linux partitions. It will do full and incremental backups with compression. It does not back up the MBR or the partition structure. I always label the disks, so saving the output of fdisk -l gives me the partition layout. Dar will restore to partitions that are smaller than the original and with a different file system. You need to make sure that you have a bootable CD or DVD that can partition a disk if needed, create and format the partitions, and install GRUB. If you are careful you can even use the system while dar is doing its backup. Using a web browser or listening to music should not affect your backup.

For Windows I use Windows-based backup software. For XP I use ntbackup to a Samba share, followed by gzip for compression. I am still looking for a good solution for Windows 7. I am currently cloning Windows 7 partitions, but I don’t like this as it is hardware dependent and the backups (partition images) are not that easy to test. If I have a problem with a disk on a Windows 7 box I may want to replace it with a disk with 4k sectors, and I am not sure that that is compatible with an image of a partition from a 512-byte sector disk.

Whatever you do, you need to test the restore process. This can be difficult and dangerous if you only have 1 computer with 1 hard disk.
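A minimal dar cycle along the lines described above (directory names are examples, not the poster's actual setup):

```shell
#!/bin/sh
# Full backup of /home into slice files homefull.1.dar, homefull.2.dar, ...
# -z compresses with gzip, -Q suppresses the interactive warning.
dar -c /media/backupdisk/homefull -R /home -z -Q

# Later: back up only what changed since the full run, using the
# full backup's catalogue as the reference (-A).
dar -c /media/backupdisk/homeincr1 -A /media/backupdisk/homefull -R /home -z -Q

# Save the partition layout alongside, as suggested above (needs root):
fdisk -l > /media/backupdisk/partition-layout.txt
```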

On 2011-02-25 12:36, alaios wrote:

> Let’s say that I want to backup my /home directory and save it to a
> file called mybackuptest.ext (.ext== your extension). Can you show me
> how to make rsync store only the diferences between the previous

No, rsync does not save an archive. It saves the original file structure,
file by file, one by one.

backup tools:

amanda
dar
rdiff-backup (current copy is a mirror, older ones are rdiffs)
rsnapshot (current copy is a mirror, older ones are hardlinks)
http://www.dirvish.org/
pdumpfs (http://0xcc.net/pdumpfs)
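The "current copy is a mirror, older ones are hardlinks" scheme that rsnapshot uses can be sketched with plain rsync and --link-dest (all paths here are examples):

```shell
#!/bin/sh
# Hard-link snapshot sketch: files unchanged since the previous snapshot
# become hard links into it, so each snapshot looks like a full copy but
# only changed files cost disk space.
SRC=/home/alex/                 # what to back up (example path)
DEST=/media/backupdisk/snaps    # backup disk (example path)
NEW="$DEST/$(date +%Y-%m-%d)"
PREV=$(ls -1d "$DEST"/20* 2>/dev/null | tail -1)   # newest snapshot, if any

if [ -n "$PREV" ]; then
    rsync -a --delete --link-dest="$PREV" "$SRC" "$NEW/"
else
    rsync -a "$SRC" "$NEW/"
fi
```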


Cheers / Saludos,

Carlos E. R.
(from 11.2 x86_64 “Emerald” at Telcontar)

I use rsnapshot for backup.

Back-In-Time is a nice (rsync-based) GUI utility that I’ve used in the past:

Back In Time

I liked it because it was simple to configure and backup manually (or automatically).

Thank you very much all for your contributions

I have a few more questions to ask:

what is the best way to save space when I compress my files?

I would like to thank you in advance for your help
Regards
Alex
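On the space question: with tar-style backups the choice of compressor matters most. A rough sketch comparing the common options (the sample file here is synthetic; real savings depend on your data):

```shell
#!/bin/sh
# Compare the three compressors tar can drive directly.
tmp=$(mktemp -d)
head -c 1000000 /dev/zero > "$tmp/sample"   # highly compressible test data

tar -czf "$tmp/backup.tar.gz"  -C "$tmp" sample   # gzip: fastest, moderate ratio
tar -cjf "$tmp/backup.tar.bz2" -C "$tmp" sample   # bzip2: slower, better ratio
tar -cJf "$tmp/backup.tar.xz"  -C "$tmp" sample   # xz: slowest, best ratio

ls -l "$tmp"/backup.tar.*
```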

I am trying to find a guide for YaST’s module that is called Backup.
I created a test profile to see how things work, but it seems that it is backing up my entire system.

I was mostly interested in backing up only /home, /etc and /root (what else contains system configs?), but I could not find how to exclude files.

best Regards
Alex

One more question.
What are the limitations and drawbacks of simple tar and gzip compression?

I was thinking to try something like:

tar -zcvpf /media/DWORKIN/ubuntu_backup-`date '+%d-%B-%Y'`.tar.gz /media/harsdisk/storeme.tar.gz

Even though my backup is around 150 GB, will I be able to read its contents using a GUI that reads compressed files?
Of course I will not have any differential backups, but it seems that I can afford having 2-3 backups stored, which is still okay for me.

What might not work well with tar and gz compression?

Regards
Alex
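On reading the archive's contents: even a large tar.gz can be listed and selectively extracted without unpacking everything, which is what the GUI archive viewers do under the hood. A sketch (the archive and member names are examples):

```shell
# List the archive's contents without extracting anything:
tar -tzf /media/DWORKIN/mybackup.tar.gz | less

# Extract just one file (by its stored name, without the leading '/')
# into the current directory:
tar -xzf /media/DWORKIN/mybackup.tar.gz home/alex/somefile.txt
```

One caveat: gzip has no index, so listing or extracting from a 150 GB archive means decompressing it from the start; it works, but it is slow.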

I executed the command below:
tar -zcvf /media/disk/apa-caracus`date '+%d-%B-%Y'`.tar.gz /etc /root /home

which after 5 hours returned the following message:
tar: Exiting with failure status due to previous errors

How can I see the error messages to check what happened?
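tar prints its diagnostics on standard error, so one way is to redirect that stream to a log file; a sketch with example paths:

```shell
# -v (verbose) lists files on stdout; errors go to stderr.
# 2>> appends stderr to a log you can inspect afterwards.
tar -zcvf /media/disk/backup.tar.gz /etc /root /home 2>> /tmp/backup-errors.log

# Afterwards:
tail /tmp/backup-errors.log
```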

This description fits dar or duplicity, among others. I back up with duplicity and end up with a series of archive files each 650 MB in size (encrypted, but that’s optional) that are then easy to upload to an online storage service or copy onto a USB stick or disk.

Duplicity is very good with differentials: it only backs up the data that has changed between backups. Dar is slightly worse in that it backs up every changed file in its entirety. Both programs support compression transparently, so you don’t need a compressed filesystem or anything else.
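A minimal duplicity round trip along those lines (paths are examples; --no-encryption skips the GPG step the poster mentioned as optional):

```shell
#!/bin/sh
# The first run produces a full backup; later runs are incremental automatically.
duplicity --no-encryption /home/alex file:///media/backupdisk/dup

# Restore the latest state into a fresh directory (it must not already exist):
duplicity --no-encryption file:///media/backupdisk/dup /tmp/restored
```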

If you’re not so comfortable with the command line, I’d try the GUI options people suggested, though: YaST or that Timeline one (whatever it’s called).

These are the errors my 2>> captured

tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: Removing leading `/' from member names
tar: /home/ap/.gvfs: Cannot stat: Permission denied
tar: /home/ap: file changed as we read it

I do not think they are so crucial. Are they?

I will reply to the rest later on again

I tried many solutions over the years, and quite frankly, the simplest one is the best (IMHO).
Large external disks are quite cheap, so I have one of 500 GB.

I do the backup with the following command (put it in a script): rsync -av --delete /home /media/transcend/backup/home/

So, I just plug in the drive and execute the command. Something like a “shoot & forget” system.
The first time took a while, but now the backup is generally over in a few seconds and the files are guaranteed to be in sync.

This is tar doing you a favor: it removes the leading / from the archive so that when you extract the tarball you won’t overwrite files off the root.

If you tar up /etc/passwd it is archived as etc/passwd. So if you were to extract your tarball in the /tmp directory, you would get /tmp/etc/passwd.
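You can see this behaviour directly with a tiny throwaway demonstration:

```shell
#!/bin/sh
tmp=$(mktemp -d)
echo hello > "$tmp/file.txt"

# Archive using an absolute path; tar warns that it strips the leading '/'.
tar -cf "$tmp/demo.tar" "$tmp/file.txt"

# The listing shows the stored member name is relative (no leading '/').
tar -tf "$tmp/demo.tar"
```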

You don’t have permission to read /home/ap/.gvfs

/home/ap changed during the creation of the tarball.

Good luck,
Hiatt

Expanding on what Hiatt said, the /home/<yourname>/.gvfs is some bizarre virtual filesystem thing used by GNOME and GTK applications. Any errors about not being able to back it up can safely be ignored.