I have a test home website with a document root at /home/webname/public_html.
The test site is also an ftp server and is accessible via vsftpd.
I’m using wget to back up the files with a command like this from a remote client:
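Code:
--------------------
wget -r --ftp-user=username --ftp-password=password ftp://testdomain.blah.com
--------------------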
That nicely downloads the entire contents of the document root and subdirectories recursively, EXCEPT it always hangs midway through.
The process seems to just stop. Each time I try it the process stops on the same file. If I delete the file from the server (just to see what happens) and repeat the “wget” command from the client, the process proceeds further than the last time and then hangs on a different file.
I don’t know how to answer that, not knowing much about ftp / vsftpd. I looked in vsftpd.conf and this line is commented out: #anon_max_rate=7200. I don’t know if that implies some other default value is active. The actual contents are:
Also, if you are running vsftpd under xinetd, xinetd has its own rate-limiting to protect the system from too many ftp requests in a given period. If so, you might want to run vsftpd standalone. Have a look in /var/log to see if any log file offers clues.
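For example, the rate-limiting attributes to look for in the xinetd service file are cps, instances and per_source, and vsftpd runs standalone when the xinetd entry is disabled and listen=YES is set in vsftpd.conf. This is only a sketch; the file name and the limits shown are illustrative, not taken from your system:
Code:
--------------------
# /etc/xinetd.d/vsftpd (illustrative values)
service ftp
{
        socket_type = stream
        server      = /usr/sbin/vsftpd
        cps         = 25 30   # max 25 connections/sec, then back off for 30 sec
        instances   = 50      # max concurrent vsftpd processes
        per_source  = 10      # max connections from a single client IP
        disable     = no      # set to "yes" if you switch to standalone mode
}

# /etc/vsftpd.conf, needed for standalone mode
listen=YES
--------------------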
I add -T 3 to all of my wget commands these days in case the server is stupid. See the man page for info, but basically after three seconds of stupidity wget tries again. I hit this a lot because of bad configurations (now fixed, I believe) with some NAM/iChain implementations, but that’s still useful for timeouts that can be pushed past with a retry.
Give it a shot.
Good luck.
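Something like this, using the command from the post quoted below (3 seconds is just the value suggested above; pick whatever suits your link):
Code:
--------------------
# -T 3 sets wget's DNS, connect and read timeouts to 3 seconds,
# so a stalled transfer is retried instead of hanging indefinitely
wget -r -T 3 --ftp-user=username --ftp-password=password ftp://testdomain.blah.com
--------------------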
swerdna wrote:
> Hi
> AB has suggestted that I experiment with wget to back up websites
> remotely. The suggestion is here: ‘SOLVED: How do I connect ftp in one
> command - Page 2 - openSUSE Forums’ (http://tinyurl.com/lt38lk)
>
> I have a test home website with a document root at
> /home/webname/public_html.
> The test site is also an ftp server and is accessible via vsftpd.
> I’m using wget to backup the files with a command like this from a
> remote client:
> Code:
> --------------------
> wget -r --ftp-user=username --ftp-password=password ftp://testdomain.blah.com
> --------------------
>
> That nicely downloads the entire contents of the document root and
> subdirectories recursively EXCEPT it always hangs mid way through.
> The process seems to just stop. Each time I try it the process stops on
> the same file. If I delete the file from the server (just to see what
> happens) and repeat the “wget” command from the client, the process
> proceeds further than the last time and then hangs on a different file.
>
> Here’s an example of the dialogue when it stops:
> 100%[==============================================>] 16,912      --.-K/s   in 0.007s
>
> 2009-06-25 14:19:20 (2.25 MB/s) - `domain.name.com/webname/public_html/dynamic/snapshot3.png’ saved
You may have to create it manually; the server might not create log files automatically. Make sure it is writable by the account running vsftpd, though that would probably be root anyway.
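For instance (assuming the common default path /var/log/vsftpd.log; use whatever path your vsftpd.conf points at):
Code:
--------------------
touch /var/log/vsftpd.log
chown root:root /var/log/vsftpd.log   # vsftpd's logging normally runs as root
chmod 600 /var/log/vsftpd.log
--------------------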
The log file was not being written because I had omitted to switch logging on in vsftpd.conf. Once I did that and restarted vsftpd, it recorded, but this didn’t throw any light on the problem.
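For reference, these are the vsftpd.conf directives involved (a sketch; the log path shown is the usual default, not necessarily the one used here):
Code:
--------------------
xferlog_enable=YES                    # switch transfer logging on
xferlog_std_format=NO                 # use vsftpd's own log format...
vsftpd_log_file=/var/log/vsftpd.log   # ...written to this file
log_ftp_protocol=YES                  # also log every FTP command/response (debugging aid)
--------------------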
However, I found that shortening the recursion depth made the problem disappear. So there are two ways to work around it: add a timeout with wget’s -T option (ab’s suggestion), or limit the recursion depth.
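A sketch of the depth-limited form (the depth of 2 is just an example; wget’s default is 5 levels):
Code:
--------------------
wget -r -l 2 --ftp-user=username --ftp-password=password ftp://testdomain.blah.com
--------------------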
It may not be wget or your server. I’ve seen a situation where connections to public servers inside an enterprise LAN would break in the middle of transfers (ssh, http, media streaming, you name it). The catch was that it only happened at high speed, so only the fraction of people with cable or ADSL2+ would experience the problem. I looked everywhere and finally concluded it could be some router dropping connections on congestion. I was wondering how to collect evidence of this when it came good and stayed good. Coincidentally, infrastructure was moved/upgraded that weekend. So maybe not coincidence. I guess I’ll never know, but I’m not complaining.
So black magic workarounds like these are sometimes necessary.
swerdna wrote:
> ab@novell.com;2004216 Wrote:
>> I add -T 3 to all of my wget commands these days in case the server is
>> stupid. See the man page for info but basically after three seconds
>> of
>> stupidity wget tries again. I hit this a lot because of bad
>> configurations (now fixed I believe) with some NAM/iChain
>> implementations
>> but that’s still useful for timeouts that can be pushed past with a
>> retry.
>> Give it a shot.
>>
>> Good luck.
> That fixed it – and the prize goes to ab@novell.com
> Many thanks
>
>