Slow downloads. 6 Mbit connection. Should take 2 hours

What can be done? The download starts fine but then loses speed.

I’m asking my friends to help me download the ISO :s

TY for your attention. BD

Hi
Find a mirror close to your location from the following URL and use that.
http://mirrors.opensuse.org/list/13.1.html

Hi
Sorry, thought you were after the openSUSE dvd. So it’s an image you created on SUSE Studio? How big is the file?

2 GB. I really need an ISO with everything :confused:

Hi
Try wget rather than a browser, e.g.:


wget -c https://susestudio.com/download/<lots_of_numbers_and_characters>/<your_image>

Else I guess their site is just running slow…

Facepalm… Thanks :slight_smile: hahhahaa

wget -r 1000 -T 300 https://susestudio.com/download/...

I suppose this results in 1000 tries with a 5-minute timeout; if it fails more than that, then it’s impossible.
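A side note on the flags: in wget, -r actually means recursive download, not retries; the retry count is -t / --tries, and -T / --timeout is the network timeout in seconds (300 s = 5 minutes). A sketch of the command as presumably intended, with the URL left as the placeholder from this thread:

```shell
# -c / --continue : resume a partially downloaded file instead of restarting
# -t / --tries    : retry up to 1000 times (-r would mean recursive download)
# -T / --timeout  : give up on a stalled connection after 300 seconds
wget -c -t 1000 -T 300 https://susestudio.com/download/...
```

The corrected command later in this thread uses the equivalent long spelling, --tries=1000.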

On Thu, 26 Jun 2014 19:26:02 +0000, binarydepth wrote:

> Code:
> --------------------
> wget -r 1000 -T 300 https://susestudio.com/download/
> --------------------
>
>
> I suppose this results in 1000 tries with a 5-minute timeout; if it
> fails more than that, then it’s impossible.

FWIW, downloads here are not too bad; when I’m having trouble with a
larger download, I try using aria2 instead and do a parallel download
(aria2 does a good job of segmenting a larger download and retrieving it
in multiple parts, assembling it into the original as it goes -
regardless of protocol, generally).

You might give that a try. Something like:

aria2c --max-connection-per-server=4 --min-split-size=1M

That should cause it to do 4 simultaneous downloads, and download the
file in 1 MB chunks.

There are other options that may help as well, and it can also be used to
restart an aborted download (see ‘-c’ in the help for details) so you
don’t have to start over every time the download fails.
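For what it’s worth, a resumable variant of the aria2c command above might look like this (the <URL> placeholder stands for the actual download link):

```shell
# -c / --continue resumes from the .aria2 control file that aria2 keeps
# next to the partial download, so a failed run can simply be re-run
# with the same command instead of starting over from zero.
aria2c -c --max-connection-per-server=4 --min-split-size=1M <URL>
```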

Jim


Jim Henderson, CNA6, CDE, CNI, LPIC-1, CLA10, CLP10
Novell/SUSE/NetIQ Knowledge Partner

I changed the command to:

wget -c -T 150 --tries=1000 <URL>

That’s a great program. It’s the way to go when downloading large files. :smiley:

Thanks!

It’s done now. I didn’t take note of the “try” number :p, sorry if you are curious. Will be testing tomorrow at most.
BD

I used 12 connections with 10 MB pieces in aria2 while racing it against wget, and aria2 won. :stuck_out_tongue:

aria2c --max-connection-per-server=12 --min-split-size=10M <URL>

(1024*2)/64 = 32 MB per split.

(1024*A)/(4*B), where A = file size in GB, and B = max connections allowed.

What do you think of that model?

aria2c --max-connection-per-server=16 --min-split-size=32M
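Assuming the model means “split size in MB = file size in MB divided by four times the connection count”, the arithmetic can be checked with plain shell arithmetic (the 2 GB file size and 16 connections are the values from this thread):

```shell
# Model: split size (MB) = (1024 * A) / (4 * B)
# where A = file size in GB and B = max connections allowed.
filesize_gb=2
connections=16
split_mb=$(( (1024 * filesize_gb) / (4 * connections) ))
echo "--min-split-size=${split_mb}M"   # prints --min-split-size=32M
```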

Cheers :slight_smile:

On Fri, 27 Jun 2014 20:06:01 +0000, binarydepth wrote:

> What do you think of that model ?

Ultimately, I think if the model maxes out your connection speed, it’s a
good model. :slight_smile:

Jim



Aria should tell you that if it doesn’t. Don’t you think?

Of course, many CLI users have enough common sense. I’m biased toward precision, but Aria would stop if the download had already finished.

On Sat, 28 Jun 2014 03:56:02 +0000, binarydepth wrote:

> Aria should tell you that if it doesn’t. Don’t you think?

It’s difficult to judge how much bandwidth is available. TCP doesn’t
work like that - it has some built-in dynamic throttling based on whether
or not a request is sent and a response isn’t received, but there’s no in-
built way for the software to ask “how fast is my connection?”. That’s
why (for example) torrent speed throttling by the client is inexact; it
operates by denying the inbound data a response, so the sender will
throttle back on how fast the data is being sent in order to cut down on
retransmissions.

It also depends on how many concurrent connections the server is
configured to permit overall and per client. You might want to open 12
connections to the server, but if the server only permits 4 per client,
then you’re not going to get an optimal speed over 12 connections.
Similarly, if the server is configured to limit the amount of outbound
data being sent to an individual connection or to a specific client,
that’s also a factor.

As are the network links between you and the server. I guarantee you
that if you have a 30 Mbps connection (as I do) and you connect to a
server that’s got 10 Mbps or 100 Mbps worth of bandwidth available to it,
but there’s a high-latency 56 Kbps link between you and the server,
you’re not going to max out your connection. :wink:

The same is true even if you have a full 30 Mbps between you and the
server, unless you have a dedicated connection to that server, because
other people are using that bandwidth as well.

It’s not such a simple problem to solve, because networks aren’t simply
constructed. :slight_smile:

> Of course, many CLI users have enough common sense. I’m biased toward
> precision, but Aria would stop if the download had already finished.

Naturally it would stop if the download was done - there would be no more
data to send. :wink:

Jim

