I’m looking for a bash script that can do the following. My employer does nightly backups of our clients’ data, which are rarred for compression reasons (a 5 GB db compresses down to ~400 MB). What we would like to do is recursively search through the backup folder and unrar the data, since when clients call in on an issue we load their data to see what is going on. The FTP has a directory structure like this:

/backup/client1/010109/
/backup/client1/010309/
/backup/client1/010509/
/backup/client2/010109/
/backup/client2/010309/
/backup/client2/010509/
In each of the *day folders there are split rar files, and the first rar file is always .rar.
What we want to do is run a script that will unrar those files on Tuesday, Thursday and Saturday. The cron part I know how to do. We would like the extracted data to fall into the client’s root directory, for example:

/backup/client1/010109.db
/backup/client1/010309.db
/backup/client1/010509.db
/backup/client2/010109.db
/backup/client2/010309.db
/backup/client2/010509.db

and then delete the Monday/Wednesday/Friday folders if a rar file was found and extracted successfully.
Now there are roughly 500 clients that have this backup done, so one magical script that would do it for all the client directories in the /backup directory would be great. I should also note that the subfolders in /backup are not always called client1, client2, etc., but are the names of the clients themselves.
What do you have so far? Which part is catching you up?
Good luck.
deanjo13 wrote:
> Hey all,
>
> I’m looking for a bash script that can do the following. My employer
> does nightly backups of our clients data that are rarred for
> compression
> reasons (a 5GB db compresses down to ~400 meg). What we would like to
> do is recursively search through the backup folder and unrar the data
> as
> when they call in on a issue we load their data to see what is going
> on. The FTP has directory structure like this:
>
> /backup/client1/010109/
> /backup/client1/010309/
> /backup/client1/010509/
> /backup/client2/010109/
> /backup/client2/010309/
> /backup/client2/010509/
>
> In each of the *day folders there are split rar files and the first
> rar
> file is always .rar
>
> What we want to do is to run a script that will unrar those files on
> Tuesday, Thursday and Saturday. The cron part I know how to do. We
> would like the extracted data to fall into the clients root directory
> example and then delete the Monday/Wed/Friday folders if a rar file was
> found and extracted successfully :
>
> /backup/client1/010109.db
> /backup/client1/010309.db
> /backup/client1/010509.db
> /backup/client2/010109.db
> /backup/client2/010309.db
> /backup/client2/010509.db
>
> Now there are roughly 500 clients that have this backup done so one
> magical script that would do it for all the clients directories in the
> /backup directory would be great. I should also note that the
> subfolders in /backup are not always called client1, client2, etc etc
> but are the names of the clients themselves.
>
> Any help would be appreciated
>
> Dean
>
>
Right now I got some help on the mailing list:

for clientdir in /backup/*; do
    cd "$clientdir" || continue
    for daydir in "$clientdir"/*; do
        unrar x "$daydir"/*.rar
        rm -rf "$daydir"
    done
done
After some browsing through the FTP’s directories it seems some have the first rar as .r01.
So ideally I would like it to search for the .rar extension first and, if that’s not found, look for .r01 and try extracting that. I would like it to delete the directories only if they were successfully extracted, and if no archive was found, to leave that directory as is.
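For what it’s worth, here is a sketch of that whole flow in one script: prefer the .rar volume, fall back to .r01, extract into the client’s root, and delete a day folder only when unrar reports success. It is untested against real archives; the function name, the root-directory parameter, and the -o+ (overwrite) switch are my own choices, so adjust to taste:

```shell
#!/bin/bash
# Sketch: walk every client directory under a backup root, find the
# first volume of each day folder's split-rar set (.rar preferred,
# .r01 as a fallback), extract it into the client's root directory,
# and delete the day folder only if extraction succeeded.
extract_client_backups() {
    local root="$1"   # e.g. /backup
    local clientdir daydir first
    for clientdir in "$root"/*/; do
        for daydir in "$clientdir"*/; do
            # Prefer the .rar volume; fall back to .r01 if none exists.
            first=$(find "$daydir" -maxdepth 1 -name '*.rar' | head -n 1)
            [ -z "$first" ] && first=$(find "$daydir" -maxdepth 1 -name '*.r01' | head -n 1)
            # No archive found: leave the directory as is.
            [ -z "$first" ] && continue
            # Extract into the client's root; delete only on success.
            if unrar x -o+ "$first" "$clientdir"; then
                rm -rf "$daydir"
            fi
        done
    done
}
```

From cron you would then call `extract_client_backups /backup` (or hard-code the root inside the script); the Tuesday/Thursday/Saturday schedule stays in the crontab as planned.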
You may want to check to be sure, but last I heard the .rar file is always the first one, and the .r01, .r02, .r## files are just used so that the individual files do not get over a certain, user-desired size. If you have .r01 files on their own, do they even work? I’ve seen them on their own once or twice, but only when the .rar file that went with them was lost. Extraction of some of the data may be possible, but I can never get all of it.
Anyway, that’s where I’d start. You could easily add conditionals in there to check for .rar and then, if it’s not there, check for .r01, but without knowing that standalone .r01 sets are actually valid, it won’t help you much.
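That conditional could be factored into a small helper, as a sketch (the function name is made up here): print the .rar volume if one exists, otherwise fall back to .r01, and fail when the folder holds no archive at all.

```shell
#!/bin/bash
# Print the first volume of a split set in a directory:
# prefer a .rar file, fall back to .r01, fail if neither exists.
first_volume() {
    local dir="$1" f
    for f in "$dir"/*.rar "$dir"/*.r01; do
        if [ -e "$f" ]; then
            echo "$f"
            return 0
        fi
    done
    return 1   # no archive found; caller should leave the dir alone
}
```

A caller can then write `first_volume "$daydir" || continue` to skip folders that contain no archive.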
Good luck.
deanjo13 wrote:
> Right now I got some help on the mailing list.
>
>
> Code:
> --------------------
>
> for clientdir in /backup/*; do
>     cd "$clientdir" || continue
>     for daydir in "$clientdir"/*; do
>         unrar x "$daydir"/*.rar
>         rm -rf "$daydir"
>     done
> done
>
> --------------------
>
>
> After some browsing through the ftp’s directories it seems some have
> the first rar as .r01.
>
> So ideally I would like to be able for it to search for the .rar
> extension first, if not found look for .r01 and try extracting it. I
> would like for it to only delete the directories if they were
> successfully extracted. If no archive was found to leave that directory
> as is.
>
>
I have a script for you which will unrar in the same directory where the files were archived. I can’t tell from your explanation where you want the unarchived files to be created. As it works now, the script will place the extracted files in the same directory and delete the archives (.rar and .r01). If you give me more details I will modify it the way you need it.
Cheers
On 2009-10-19, dmera <dmera@no-mx.forums.opensuse.org> wrote:
>
> I have a script for you which will unrar in the same directory where the
> files were archived. I cannot understand from your explanation where you
> want the unarchived files to be created. The way it works now the script
> will place in the same directory the files and will delete the
> archives(rar and r01). If you give me more details I will modify it the
> way you need it.
Would that script be something like this?
#!/bin/bash
unrar x "$1"
rm "$1"
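Roughly, yes, though since the goal earlier in the thread was to delete only after a successful extraction, and to clean up the .r## volumes too, a slightly safer sketch might look like this (the -o+ switch and the volume glob are my guesses at the intent):

```shell
#!/bin/bash
# Extract one split-rar set in place, then remove all of its volumes
# (the .rar plus any .r00-style parts), but only if unrar succeeded.
unrar_and_clean() {
    local archive="$1"
    if unrar x -o+ "$archive" "$(dirname "$archive")/"; then
        rm -f "${archive%.rar}".r[0-9][0-9] "$archive"
    fi
}
```

Called as `unrar_and_clean /backup/client1/010109/010109.rar`, it leaves everything in place if unrar returns a non-zero exit status.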
-- 
Any time things appear to be going better, you have overlooked
something.