An openSUSE 15.1 server with some tens of users.
I want to make an incremental backup of the user data to an external share (on a Raspberry Pi with an external disk) via a cron script every night.
I'm not sure what the best way to do it is.
*Which program to use?
I'm thinking of storeBackup; it is simple, although a bit outdated.
*Which remote filesystem?
I have formatted the external disk on the Raspberry Pi as ext4, but now I have to share it and mount it from the server to do the backup.
I use rsync. The target is remote, thus rsyncd on the Raspberry Pi and rsync on the server.
Only what has changed is copied, thus after the first complete backup, backups do not take much time.
Several instances (in my situation 10) are kept. Files that do not change are stored only once on the backup system; each instance then just has a hard link to them.
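For the remote side, a minimal rsyncd module on the Raspberry Pi could look like the sketch below. Module name, disk path and the server's address are assumptions for illustration, not my real setup:

```
# /etc/rsyncd.conf on the Raspberry Pi (illustrative values)
uid = root
gid = root

[backup]
    path = /mnt/usbdisk/backup
    read only = no
    hosts allow = 192.168.1.10   # only the openSUSE server may connect
```

The server then addresses the share in daemon syntax, e.g. `rsync -a /home/ pi::backup/`, without needing to mount anything.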
OK, I guess that means using hard links as explained here, but without removing the latest link.
If I create a FIRST backup it will contain a complete copy of the source …
and then if I create a SECOND backup using --link-dest=FIRST it will contain a copy of all new files and hard links to the unchanged files in the FIRST one, true?
and then a THIRD backup linking to the SECOND and so on will keep N backups, but only the new files use storage, all unchanged files will be “backed up” only as hard links, right?
I got my idea from https://rsnapshot.org/
Rsnapshot is written in Perl, thus you can install it (it is in the openSUSE OSS repo as package rsnapshot) and study what it does very easily.
I wrote my own script based on it. Below is pseudo code of the essential actions.
Basically I have 10 instances in 10 directories backup0 - backup9.
When making a new backup:
remove the oldest one
rm -rf backup9
move all up one number
mv backup8 backup9 ; ..... ; mv backup1 backup2
create a new backup1 as a hard-link copy of the newest
cp -al backup0 backup1
backup with rsync to backup0
A lot of variations are possible. Rsnapshot offers a wrapper that makes it easy to configure around this concept, with e.g. an hourly backup of 24 instances and a daily loop around that, etc., etc.