I’m looking for a backup strategy and software to back up both my personal files and system files.
What I need:
1- A specific cloud platform that offers storage for backups AND a multi-OS client. (E.g. I was using Google Drive, but they don’t offer a Linux client.)
2- A multi-OS backup client that allows excluding file extensions and folders (mainly to exclude “node_modules”).
3- A multi-OS encryption tool that encrypts my data before sending it to the cloud (I hear Cryptomator does this?).
Re. 1: I have used Amazon S3 and Glacier, but have switched over to Wasabi. The first terabyte is charged at a fixed price, and beyond that the cost is proportional ($0.0059 per GB/month, i.e. $5.99 per TB/month, in US$); there are no upload, download, or access charges. It is an S3-compatible service and works with most clients that support S3. They have data centers in the US and Europe.
If you search the restic forum you will find posts from people who have created GUI front ends, but I haven’t used any of them. I am a Linux-only user, so I use bash. It’s not difficult to configure. Like here, the restic community is extremely helpful, and the developers are also active in the forum.
Also, you can use rclone as a backend, which opens up many more cloud services.
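To make that concrete, here is a minimal sketch of a restic workflow against a Wasabi bucket (the bucket name and credentials below are hypothetical placeholders). Since restic encrypts everything client-side before upload and supports exclusion patterns, it would cover points 2 and 3 as well:

```bash
# Credentials for the S3-compatible Wasabi endpoint (hypothetical values).
export AWS_ACCESS_KEY_ID="my-wasabi-key"
export AWS_SECRET_ACCESS_KEY="my-wasabi-secret"
export RESTIC_PASSWORD="my-repo-passphrase"   # used for client-side encryption

# One-time: initialize the encrypted repository in the bucket.
restic -r s3:s3.wasabisys.com/my-backup-bucket init

# Back up $HOME, skipping node_modules everywhere.
restic -r s3:s3.wasabisys.com/my-backup-bucket backup \
    --exclude node_modules \
    "$HOME"

# Alternatively, go through a configured rclone remote instead of native S3:
# restic -r rclone:wasabi:my-backup-bucket backup "$HOME"
```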
An old-timer in my local Linux User Group is a big proponent of going “old school” when it comes to backups…
- Standard *NIX commands have barely changed in decades; if they worked 20 years ago and still work today, they probably will 20 years from now.
- No worry about backup application versions. With some backup software, you need to install the same version to restore a backup, or the backup may not be recognized.
- You have full control over what happens. The individual commands are usually simple to understand, so even a beginner can read a script and follow each step.
- Numerous scripts implementing different features are posted publicly. And they probably all work.
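As a flavor of that approach, a dated tar archive of /home can be produced with nothing but standard tools; a minimal sketch, with a hypothetical destination path:

```bash
#!/bin/bash
# Old-school backup: one dated tar archive per run, standard tools only.
set -euo pipefail

DEST=/mnt/backup                      # hypothetical mount point of the backup disk
STAMP=$(date +%Y-%m-%d)
ARCHIVE="$DEST/home-$STAMP.tar.gz"

# Archive /home, skipping caches and node_modules trees.
tar --exclude='.cache' --exclude='node_modules' -czpf "$ARCHIVE" /home

# Record a checksum so the archive can be verified at restore time.
sha256sum "$ARCHIVE" > "$ARCHIVE.sha256"
```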
Hi
At the end of the day, a backup can be done any number of ways. The real questions are what restore strategy you are after, and how you ensure data integrity for the backup.
Personally, I have zero interest in backing up the operating system. I only look at configs and a backup of the disk partition information (sgdisk, plus efibootmgr output for information only); then I can reload the partition information to recreate the disk in one operation…
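For instance, sgdisk can dump a GPT partition table to a file and reload it later; a quick sketch, with hypothetical device and file names:

```bash
# Save the GPT partition table, plus the EFI boot entries for reference.
sgdisk --backup=/backup/nvme0n1-gpt.bin /dev/nvme0n1
efibootmgr -v > /backup/efibootmgr.txt    # info only, as noted above

# Later, recreate the layout on a replacement disk in one operation.
sgdisk --load-backup=/backup/nvme0n1-gpt.bin /dev/nvme0n1
```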
The second question is when and how you want to access the saved data. If it is historical, it may be better left on the backup media. Note the plural: keep something offsite.
Then test, test, test… put in a fresh drive and go through your restore process. How long does it take? Did it work? If not, refine and test again.
I use small system disks; it forces me to clean up and archive off to backup disks any data I don’t really need.
All I use these days is tar and cp, so everything can be done without any GUI if needed; some things are scripted as well.
```
erlangen:~ # btrfs filesystem show
Label: 'TW-20200515'  uuid: e7ad401f-4f60-42ff-a07e-f54372bc1dbc
        Total devices 1 FS bytes used 20.69GiB
        devid    1 size 51.69GiB used 30.05GiB path /dev/nvme0n1p2

Label: 'Tumbleweed'  uuid: 204f7d0f-979a-41e1-a483-a597d0357e0b
        Total devices 1 FS bytes used 25.21GiB
        devid    2 size 60.00GiB used 29.03GiB path /dev/sdc5

Label: 'Leap-15.2'  uuid: 69774d55-8da2-4599-9c27-766b1012771d
        Total devices 1 FS bytes used 15.99GiB
        devid    1 size 28.13GiB used 17.30GiB path /dev/sdc8

erlangen:~ #
```
Currently ‘TW-20200515’ is active. I cloned it from ‘Tumbleweed’ some time ago using rsync, then reinstalled GRUB. That procedure worked without annoyances. Cloning the system and running the clone without encountering any problems is the ultimate verification of your backup.
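A clone-with-rsync of that kind might look roughly like the sketch below (mount points and device names are hypothetical; the GRUB commands assume an openSUSE-style grub2 install):

```bash
SRC=/               # the running system
DST=/mnt/clone      # freshly formatted target partition, already mounted

# Copy everything, preserving ACLs, xattrs and hard links, while skipping
# pseudo-filesystems and the target itself (bash brace expansion).
rsync -aAXH --exclude={'/dev/*','/proc/*','/sys/*','/run/*','/tmp/*','/mnt/*'} \
    "$SRC" "$DST"

# Reinstall the boot loader from inside the clone.
for fs in dev proc sys; do mount --bind /$fs "$DST/$fs"; done
chroot "$DST" grub2-install /dev/sdc
chroot "$DST" grub2-mkconfig -o /boot/grub2/grub.cfg
for fs in dev proc sys; do umount "$DST/$fs"; done
# (If the clone's fstab references UUIDs, adjust them for the new partition.)
```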
I did full backups of /home to external disks every few weeks for several years. When those HDDs were decommissioned, I added a daily rsync to an HDD.
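Such a daily rsync job might look like the following sketch (purely illustrative; the script path, source, and target are hypothetical, not the actual setup):

```bash
#!/bin/bash
# Hypothetical /etc/cron.daily/home-rsync: mirror /home to a backup HDD.
set -euo pipefail

# -aAXH preserves permissions, ACLs, xattrs and hard links;
# --delete keeps the mirror exact (drop it to retain deleted files).
rsync -aAXH --delete /home/ /backup-hdd/home/
```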