I am working on a bash shell script that backs up the user's $HOME directory, backup-light - Google Code. One thing that I want it to do, but am not sure how to go about doing, is making it so that every time it backs up, it will log the date and backup type in a file in the user's home directory; this I can do easily. The part that I don't know how to do is make it so that when the user calls up a certain function, somewhat like a recovery mode, it will read this text file for the latest full backup and use that as a starting point. Then it will ask the user for the date that they want to stop at, assuming that there was a backup done on that date. Is this possible, and if so, how can it be done?
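The logging half, which the poster says is easy, could be as small as this (the log path `$HOME/.backup.log`, the one-line-per-run format, and the type names are assumptions for illustration, not part of the poster's script):

```shell
#!/bin/bash
# Append one "YYYY-MM-DD type" line per run; the log location and the
# type names ("full"/"incremental") are assumptions for illustration.
log_backup() {
    printf '%s %s\n' "$(date +%F)" "$1" >> "$HOME/.backup.log"
}

# e.g. at the end of a full backup run:
#   log_backup full
```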
No offense, but why reinvent the wheel? There are tons of tools that do this… if not something heavyweight like Bacula or AMANDA, what about just using rsync or something?
To tell the truth, I have never had any luck with any of the heavyweight backup solutions like Bacula or AMANDA. I have not tried rsync; I know it can do backups, but it doesn't do everything I want a backup solution to do. Which is why I am trying to reinvent the wheel, so I can have something small that will do everything that I want.
I can only hope someone else will chime in. I’m sure there’s an existing lightweight solution to do what you want.
If there isn’t, can you explain a little more about the logic that you want for this program that the user runs to restore? Are you going to keep multiple versions of the backup (on whatever your backup media/storage is)? Multiple types?
When I have to back up something this simple, I usually just use rsync to back up a whole directory (recursively) to a remote host, and then if I need to restore it, I just scp -r the whole directory from the remote machine back to the client.
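That workflow is roughly the following (the hostname `backuphost` and the remote path are illustrative, not from the thread):

```shell
# Back up the home directory recursively to a remote host,
# preserving permissions and timestamps:
rsync -a --delete "$HOME/" "backuphost:backups/$USER/"

# Restore it later by copying the whole directory back:
scp -r "backuphost:backups/$USER/" "$HOME/"
```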
As to network backup, I’m sold on Bacula. My installation currently backs up 12 hosts, plus one legacy machine over NFS to another host, and 2 of the 12 are remote and done over VPN. It’s wonderful - pretty much fully automated, I get nightly differentials and weekly full backups to tape. And if there’s a problem, Nagios tells me about it.
One of the systems that I run is a multiuser system, and I can have all the backups put in the same directory and still be able to tell which user a backup is for, and when it was made, just by looking at the filename; I'm not sure that rsync can do that without a different configuration for each user. Also, if I want to save a backup to a DVD or CD, I can easily do so without having to change any of the script's settings.
I guess a better way to explain what I want this restore feature to do: if the HDD fails in a computer, then when they install a new one, instead of having to restore all the backups (a full backup and any incremental backups done after that) by hand, the program can read from this "log", determine the latest full backup done, restore it, and then restore any incremental backups done after it up to the point that the user specifies.
The "log" file will have the date and type of each backup. So the restore would run like this:
* List all dates there are backups for
* Get end date from the user (from the dates listed in the last step)
* Read the "log" and determine the latest full backup
* Restore any and all backups between the full backup and the user-specified backup
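The steps above can be sketched in bash. Everything named here is an assumption for illustration: the log lives at `$HOME/.backup.log` as one "YYYY-MM-DD type" line per backup, and `restore_archive` is a placeholder for whatever the script actually does to unpack one archive:

```shell
#!/bin/bash
# Sketch of the restore flow listed above. Assumed log format, one line
# per backup, e.g.:
#   2009-05-01 full
#   2009-05-03 incremental
LOG="${LOG:-$HOME/.backup.log}"

restore_archive() {
    # Placeholder: unpack the archive for the given date/type here
    echo "restoring $2 backup from $1"
}

restore_to_date() {
    local end_date="$1" full_date
    # ISO dates compare correctly as plain strings, so awk can pick
    # the latest full backup on or before end_date
    full_date=$(awk -v end="$end_date" \
        '$1 <= end && $2 == "full" { d = $1 } END { print d }' "$LOG")
    if [ -z "$full_date" ]; then
        echo "No full backup found on or before $end_date" >&2
        return 1
    fi
    # Restore the full backup, then every backup between it and the
    # user-chosen end date
    awk -v start="$full_date" -v end="$end_date" \
        '$1 >= start && $1 <= end' "$LOG" |
    while read -r d t; do
        restore_archive "$d" "$t"
    done
}

# Interactive use, following the four steps:
#   cut -d' ' -f1 "$LOG"                       # list all backup dates
#   read -r -p "Restore up to which date? " end_date
#   restore_to_date "$end_date"
```

The string comparison trick is why a sortable date format like `date +%F` matters here; with it, awk needs no date arithmetic at all.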
Bacula will do all of that for you (once configured) automagically…
Just a hunch… but are you backing up to a different disk on the same machine? I know I may not be on the cutting edge, but I always used DVD/CD for single-user (manual) desktop backups… anything with more than one user usually goes to disk and tape…
rsync would work fine. If you want everything in different directories, each user can (via cron) execute a script that dumps the backups in their homedir on the remote machine. Or, if you want system-wide, the script can just read a list of users, cd to their homedir, and dump it into a directory named according to their username…
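The system-wide variant could be sketched like this; the UID cutoff of 1000 for "regular" users and the destination name are assumptions for illustration:

```shell
#!/bin/bash
# Read the list of users from the passwd file, then rsync each home
# directory into a per-user directory on the backup destination.

list_users() {
    # Print "user:homedir" for each regular account (UID >= 1000,
    # real login shell) in the given passwd file
    awk -F: '$3 >= 1000 && $7 !~ /nologin|false/ { print $1 ":" $6 }' \
        "${1:-/etc/passwd}"
}

backup_all_homes() {
    local dest="$1"    # a local directory or a host:path for rsync
    list_users | while IFS=: read -r user home; do
        [ -d "$home" ] || continue
        rsync -a "$home/" "$dest/$user/"
    done
}

# e.g. from root's crontab:
#   backup_all_homes backuphost:backups
```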
Well, I don’t do network backups and don’t have any tape drives, so AMANDA and Bacula are just overkill. My machines back up to different drives: my laptop uses an HDD in an external enclosure, and my desktop backs up to a second hard drive within it. I also want all the backups stored in a single directory, so that I don’t have to go through and set up rsync, or any other program, to save each backup in a different directory. I see that as added steps; with my script, all I have to do is place it somewhere it can be run, /usr/bin/ for example, then set cron jobs to run it. That is all.
I’m not wanting to seem arrogant or anything; it’s just that I would much rather improve my backup script than give up on it after all the time I spent just to get it to work.
What about dar?
If you are asking whether I tried dar: yes, a year or so ago, and all it would do was spit out an empty archive, which pissed me off. That is why I started my backup script, so I could have something that I know will work (and it does work), something that does only what I want it to do: full backups, incremental backups, restoring from the backups it creates, and burning backups to CDs and DVDs. I figured that it would be cool to have one advanced feature, the restore system that I am asking about here. If no one knows whether this can be done with bash, or how to do it, then thanks for your time; I’ll just go with what I have. I don’t want this to be seen as a request for a backup solution and moved, as it is not; it is a true programming question.
Again thanks for your time.
I see you’re perfectly fine with your backup script. For all the others out there: try rdiff-backup. It just rocks. Use the latest “unstable”. I’ve been using it to back up ~650 backup paths on ~60 hosts for one year now and it never failed. It does full and incremental backups, stores an uncompressed version of the current backup path like rsync but keeps a compressed version of the diff of every file. It can tell you when each file has changed and restore every timestamp you want. Does nice
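Typical rdiff-backup usage (old-style CLI; the paths here are illustrative) looks like this:

```shell
# Back up a directory; rdiff-backup keeps the current tree as plain
# files plus compressed reverse diffs under rdiff-backup-data/
rdiff-backup /home/user /mnt/backup/user

# List the increments (timestamps) available to restore from
rdiff-backup --list-increments /mnt/backup/user

# Restore the state as of ten days ago into a new directory
rdiff-backup -r 10D /mnt/backup/user /tmp/user-restored
```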