> Now I do not trust the tar+gzip process any more.
That you’re using tar tells me you are not new to computers, so you are
probably familiar with the random problems that can creep into things.
‘tar’ and ‘gzip’ are both built to work on streams of data, which has at
least one huge advantage: size doesn’t matter (at one point tar had an
8 GB limit on individual files, though that was overcome a decade or so
ago, I believe). Anyway, huge chunks of the world use this same
combination for all of their storage, and for large files you will have
better luck with tar+gzip or tar+bzip2 than with zip, for one big
reason: the original zip format cannot store files of 4 GiB or larger
(and some implementations choke even at 2 GiB; the Zip64 extension
lifts the limit, but I’m not sure how widely supported it is).
The long and short of this is that before you rule in/out something you
should test it. If creation of the tar file works then extraction later
should work barring corruption that happened in the meantime (obviously
not the fault of ‘tar’ since the data are (or should be) at rest).
Having a checksum of the data before/after is a good way to see if
anything has changed. You could do this test simply at any time to see
if tar+gzip can handle your data since creating the archive and then
immediately extracting it proves the technology one way or another
quickly. You could even do this without taking any disk space:
tar -zcvf - /path/to/archive | tar -ztvf -
echo $?
If the last line printed is ‘0’ then all was well. The commands above
create an archive but pipe the output directly to a second ‘tar’
command that decompresses and tests the contents (without writing data
anywhere… just reading everything from disk and then basically throwing
it away while testing it). Note that ‘$?’ reports the exit status of
the last command in the pipeline, i.e. the testing ‘tar’; in bash,
‘set -o pipefail’ will also surface a failure on the creating side.
> The tar+gzip process is very convenient to save space and archive
> folders, that’s why I used it. Probably by “Why archive?” you mean
> that I could have used
The person asking probably meant: why create an archive in the first place?
The downside of any type of archive is that a single random one-time
hardware problem (again, hardware is my vote in your case, since you
seem to know what you’re doing with the tar command in general) can
corrupt the entire archive. Such glitches happen… look at all of the
download failures people have with the openSUSE ISOs on
otherwise-reliable Internet connections; bigger files just make
corruption more obvious, which is why checksums exist. Storing files
individually removes the risk of a single bad byte affecting gigabytes
of data. It’s a tradeoff, but one that I make as well (rsync backs up
my stuff… no archiving with ‘tar’, because that would just mean more
work to create/extract the archive).
> zip -r …
Handling huge files is not the strength of the ‘zip’ format. Chances
are that most of your sensitive data stored online somewhere is being
saved with ‘tar’ rather than ‘zip’; it’s worth going with ‘tar’ if
possible because its track record is just that good.
Good luck.