> Now I do not trust the tar+gzip process any more.
That you’re using tar tells me you are not new to computers, so you are
probably familiar with the random problems that can creep into things.
‘tar’ and ‘gzip’ are both made to work on streams of data, which has at
least one huge advantage: size doesn’t matter (at one point tar had an
8 GB per-file limit, though I believe that was overcome a decade ago or
so). Anyway, huge chunks of the world use this same combination
for all of their storage, and you will have better luck with tar+gzip or
tar+bzip2 than with zip for large files for one large reason: the
classic zip format stores sizes in 32-bit fields and so cannot hold
files of 4 GiB or more (many older tools even choked at 2 GiB; zip64
lifts the limit, but not sure how common support for it is now).
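As an illustration of that stream friendliness, creating a compressed
archive is just one tool feeding the other (the paths here are
placeholders, of course):

  tar -cf - /path/to/data | gzip > backup.tar.gz

That is equivalent to the usual shorthand of
‘tar -czf backup.tar.gz /path/to/data’.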
The long and short of this is that before you rule something in or out
you should test it. If creation of the tar file works then extraction
should work barring corruption that happened in the meantime (obviously
not the fault of ‘tar’ since the data are (or should be) at rest).
Having a checksum of the data before/after is a good way to see if
anything has changed. You could run this test at any time to see if
tar+gzip can handle your data, since creating the archive and then
immediately extracting it quickly proves the technology one way or the
other. You could even do this without using any disk space:
tar -zcf - /path/to/data | tar -ztf - > /dev/null; echo $?
If the last line printed is ‘0’ then all was well. The above commands
create an archive but pipe the output directly to a ‘tar’ command
decompressing and testing the contents (without writing data anywhere…
just reading everything from disk and then basically throwing it away
while testing it).
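For the before/after checksum idea mentioned above, a minimal sketch
(sha256sum is just one option, and the paths are placeholders; run the
check from the same directory the list was created from so the relative
paths resolve):

  cd /path/to && find data -type f -exec sha256sum {} + > before.sha256
  # … later, from /path/to again, compare the current state:
  sha256sum -c before.sha256

Any line not reporting ‘OK’ points at data that changed in the
meantime.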
> The tar+gzip process is very convenient to save space and archive
> folders, that’s why I used it. Probably by “Why archive?” you mean
> that I could have used
The person asking probably meant: why create an archive in the first
place?
The downside of any type of archive is that a single random one-time
hardware problem (hardware, again, is my vote in your case since you
seem to know what you’re doing with the tar command in general) can
corrupt the entire archive. It happens… look at all of the download
failures people have of the OpenSUSE ISOs for some reason on
otherwise-reliable Internet connections… bigger just makes it more
obvious, which is why checksums exist. Storing files individually
removes the risk of a single bad byte affecting gigabytes of data. It’s
a tradeoff, but one that I make as well (rsync backs up my stuff… no
archiving with ‘tar’ because it just means more work to create/extract).
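In case it is useful, the shape of rsync backup I mean is roughly this
(a sketch; the paths are placeholders, and ‘--delete’ is optional and
worth understanding before using, since it removes files from the
backup that no longer exist in the source):

  rsync -a --delete /path/to/data/ /path/to/backup/

The trailing slash on the source matters to rsync: with it, the
contents of /path/to/data land directly inside /path/to/backup. Every
file stays individually readable in the backup, which is the whole
point versus one big archive.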
> zip -r …
Huge files are not the strength of the ‘zip’ format. Chances are that
most of your sensitive data sitting out online somewhere is saved with
‘tar’ rather than ‘zip’; it’s worth going with ‘tar’ if possible
because its track record is just that good.