Basic BASH FTP script

You’ll have to forgive me, but I know basically nothing about bash scripting.

I’m setting up a PC for my mother, and I need to automate a basic backup system for her since she isn’t going to back up her files herself.

Can someone help me with a basic script to back up her:

/home/cheri/Desktop/
/home/cheri/Pictures/
/home/cheri/Documents/

folders? I have bzip2, gzip and p7zip installed. I assume p7zip is going to give the best compression.

If I can compress these folders and then ftp them, that would be great. I’m about to change webhosts because my current one is getting unbearably slow, but if you can help with the script and just put foo in various places, I can fill in the details for the ftp server, login, password, etc.

Thanks!

First of all, don’t use ftp unless it’s just within a LAN; use scp instead.

And if you use scp, you can specify the login, etc. on the command line. You can even use key authentication and avoid sending passwords.
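
For example (the host and user below are placeholders, and this is just a rough sketch), the one-time key setup and an unattended copy would look roughly like:

# one-time setup: generate a key with no passphrase and install it on the server
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
ssh-copy-id cheri@backup.example.com

# after that, a cron job can copy files without any password prompt
scp -r /home/cheri/Documents /home/cheri/Pictures cheri@backup.example.com:backups/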

I’m not familiar with scp, but if I can use it to get the files off her box, that would be great. My main concern is that she isn’t willing to back up to DVD or anything, yet somehow it is my issue when her HDD dies. Her files aren’t necessarily anything I’m overly worried about security-wise. I assume the password would have to be in the script to be fully automated via cron. An encrypted password file is fine, but again I’m not overly paranoid here. The script would reside on her PC and just transfer files to an FTP server.

enderandrew wrote:
> I’m setting up a PC for my mother, and I need to automate a basic
> backup system for her since she isn’t going to back up her files
> herself.

Backup should always be somewhat automatic.

> Can someone help me with a basic script to backup her:
>
> /home/cheri/Desktop/
> /home/cheri/Pictures/
> /home/cheri/Documents/
>
> folders? I have bzip2, gzip and p7zip installed. I assume p7zip is
> going to give the best compression.

Why not just back up all of /home? Also, backup is only the second most important part. The most important part is restoring.

> If I can compress these folders, and then ftp them that would be great.
> I’m about to change webhosts because my current one is getting
> unbearably slow, but if you can help with the script and just put foo
> in various places, I can fill in the details for the ftp server, login,
> password, etc.

I use http://en.opensuse.org/StoreBackup
One advantage is that restoring is as easy as copying a file. Another advantage is incremental backups.
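
A typical invocation (I’m going from memory here, so treat the option names as approximate and check the storeBackup documentation; the paths are only examples) is something like:

storeBackup.pl --sourceDir /home/cheri --backupDir /mnt/backup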

Another advantage, I find, with using an already existing script is that
you won’t forget anything.

Otherwise, the easiest program to use for ftp in a script would be ncftp.

houghi

This was written under the influence of the following:
| Artist : Anouk
| Song : Help
| Album : Hotel New York

Another program that does pretty much all you want is curl. You can store the passwords in a .netrc file, just like ncftp.
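
For example (untested; host and paths are placeholders), an upload with the credentials kept in ~/.netrc would look something like this:

# ~/.netrc (chmod 600) contains:
#   machine ftp.host.com
#   login username
#   password secret
curl --netrc --upload-file backup.tgz ftp://ftp.host.com/pub/backups/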

Here is a simple little script that does what you need.
Install ncftp and change the values for your ftp server.


#!/bin/bash

# change the username, password, ftp host and remote directory
ftp="/usr/bin/ncftpput -u username -p password ftp.host.com /pub/backups/"

# here you can write the list of directories you want to back up
DIRS="/home/cheri/Desktop /home/cheri/Pictures /home/cheri/Documents"

DATE=$(/bin/date +%Y%m%d)
/bin/tar czf backup.$DATE.tgz $DIRS 2> /dev/null
/usr/bin/md5sum backup.$DATE.tgz > backup.$DATE.MD5SUM
$ftp backup.$DATE.tgz backup.$DATE.MD5SUM
exit 0
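
If you save that as, say, /home/cheri/bin/ftpbackup.sh (name and path are just examples), make it readable only by her, since the password is embedded in it, and add a crontab entry so it runs unattended:

chmod 700 /home/cheri/bin/ftpbackup.sh
# added via "crontab -e": run every night at 02:30
30 2 * * * /home/cheri/bin/ftpbackup.sh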

So many possibilities . . .

I’m not entirely clear on where you want the script to run, the client or the server? Assuming the former, it is an upload; can you configure the server as needed? (If the latter, it is trivial to set up vsftp in YaST on the client.)

If you do have control of the server, rsync is a very good alternative. It will only transfer new or changed files, whereas ftp or a copy method will re-write all the data every time. And rsync is faster, too. So the backup time will be considerably less. Rsync can write to the ftp server directory so that the files can be later accessed that way, if desired. (There is also an rsync derivative which enables incremental backups, but it appears that you only want a single snapshot of the files.) Rsync can run from either client or server, logging on to the other machine. But it’s preferred that the receiving machine have the rsync daemon running; that gives you better control and is more secure.

If interested in this approach, reply back and we can provide some pointers.
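
For reference (only relevant if you control the receiving machine; the module name, path and user below are made up), a minimal rsync daemon setup is only a few lines:

# /etc/rsyncd.conf on the receiving machine
read only = no
[backups]
    path = /srv/backups
    auth users = cheri
    secrets file = /etc/rsyncd.secrets    # contains "cheri:somepassword", mode 600

# then from her PC:
rsync -av /home/cheri/ cheri@server::backups/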

I don’t run my own server. I pay for a webhost, which in turn gives me an FTP server, or I’d totally go with something like rsync.

Thanks everyone!

lftp may be what you’re after.

From man

lftp has builtin mirror which can download or update a whole directory tree. There is also reverse mirror (mirror -R) which uploads or updates a directory tree on server.
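
For example (untested; host, credentials and paths are placeholders), a one-shot upload of a directory tree could be:

lftp -u username,password -e "mirror -R /home/cheri/Pictures /pub/backups/Pictures; quit" ftp.host.com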

Webpin

To create an md5sum wouldn’t you need to use “create_md5sums” instead of “md5sum”?

This from the man page for md5sum:

Print or check MD5 (128-bit) checksums.

So it is the right program to use. If you want to check a md5sum you would use the -c option.
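
For example, after downloading the backup again you could verify it like this (filename taken from the script above; the date is just illustrative):

md5sum -c backup.20240101.MD5SUM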

I guess the name is misleading. “create_md5sums --help” has this to say:

Creates MD5SUMS files in each subdirectory.

You don’t really need to recursively create checksums; you’re assuming that your backup is good, and you’re just MD5-ing that. :-)

Not that the other responses aren’t good solutions, but I think the best solution is rsync.

rsync

Prereqs: You must have rsync installed on the local system and have ssh access to the remote system (as with scp).

My first rsync example is simple. If you put this in a script, all of the files in the local directory will be copied to the remote directory. Then, each subsequent time you run the command, it will update the files on the remote side.

/usr/bin/rsync -ru /path/to/dir/ remote:/path/to/dir

The next example adds preservation of timestamps, permissions, uids/gids, symlinks, etc. (you need to be root for this, though).

/usr/bin/rsync -aru /path/to/dir/ remote:/path/to/dir

Here is a neat little trick. Let’s say she deleted her xmas pictures directory by accident and you need to restore:

/usr/bin/rsync -ru remote:/path/to/dir/pics/xmas /path/to/dir/pics

Note: by default, rsync will never delete files on the destination; it will only add or overwrite files. What I typically do for deleting files is use two rsync scripts: one that runs as often as hourly for some of my production SMB fileservers at work, and another that runs maybe weekly (after the tape backups). What this line will do is delete files on the remote side that have been deleted locally.

/usr/bin/rsync -ru --delete /path/to/dir/ remote:/path/to/dir

Finally, some logging so that somebody knows what has been going on. I typically use this for backing up filesystems or for syncing my backup dumps to a central server, so I want the verbose option to put what it did in a log message and shoot it to me via email. I’m pretty sure the command below works; I’d have to check one of my backup scripts to be sure the syntax is perfect, but you get the idea.

/usr/bin/rsync -ruv /path/to/dir/ remote:/path/to/dir | mail -s "Subject Line Message" myemail@mydomain.com