Is there a download manager like this available on openSUSE?

hi,
I use a download manager (in Windows) with the following three features:

1> It lets me PAUSE downloads, and resume them.
2> It doesn’t stop downloading the file if I close my browser.
3> It creates multiple connections to the server and downloads the file in pieces, and then when finished, it re-combines those pieces into the real file.

Example: IDM or Download-Accelerator-plus

Is there something like this I can have in openSUSE?
Thanks …

On Fri 10 Apr 2015 02:36:02 PM CDT, johnwinchester wrote:


Hi
Use wget with the -c -nc options from the command line…?

There are lots of options, check out the man page… :wink:
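For example (the URL here is just a placeholder):

```shell
# -c resumes a partial download instead of starting over
wget -c https://example.com/big.iso

# -nc (no-clobber) skips the download entirely if the file already exists
wget -nc https://example.com/big.iso

# run it under nohup so it keeps going after you close the terminal
nohup wget -c https://example.com/big.iso &
```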

Or maybe uGet or kget, both in the release, if you’re wanting a GUI.


Cheers Malcolm °¿° LFCS, SUSE Knowledge Partner (Linux Counter #276890)
SUSE Linux Enterprise Desktop 12 GNOME 3.10.1 Kernel 3.12.39-47-default
If you find this post helpful and are logged into the web interface,
please show your appreciation and click on the star below… Thanks!

Besides uGet, which Malcolm mentioned, a very popular cross-platform d/l manager is FileZilla.
You can probably get more recommendations using a Google search like “linux download manager”

I personally use JDownloader to download video files from YouTube; it can probably be used for other file types as well. As a Java app it has extremely heavy overhead, but it “just works” for every scenario I’ve needed.

Meta-downloaders like aria2 have the added capability of downloading files over multiple protocols simultaneously, when available.
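A minimal sketch of what that looks like with aria2 (the URLs are placeholders; `-s` splits the file into segments, `-x` sets the maximum connections per server):

```shell
# split the download into 8 segments, up to 8 connections to one server
aria2c -x 8 -s 8 https://example.com/big.iso

# pull the same file from two mirrors at once (both must serve identical files)
aria2c https://mirror1.example.com/big.iso https://mirror2.example.com/big.iso
```

If a download is interrupted, re-running the same command resumes it from the `.aria2` control file aria2 keeps next to the partial download.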

TSU

On 2015-04-10 18:36, tsu2 wrote:

> I personally use JDownloader to download video files from YouTube and
> can probably be used for other file types. As a Java app, it has
> extremely heavy overhead, but it “just works” for every scenario I’ve
> needed.

I use it on an old and small laptop, and it runs quite fine. You can minimize
the GUI and it uses fewer resources. Actually, Firefox is heavier.


Cheers / Saludos,

Carlos E. R.

(from 13.1 x86_64 “Bottle” (Minas Tirith))

I would think the issue is that there really are no sites anymore where you can restart a download.
Just try it on:
mega
rapidshare
k2c
and so on

It will not work.

Now, if the site has an FTP server, use FTP. Or, if the site has an indexed HTML view of the FTP tree, like, say, JPL/NASA or Fedora 21:
http://mirror.chpc.utah.edu/pub/fedora/linux/releases/21/Workstation/x86_64/iso/

wget is a great tool, but just try to use it on MEGA!

uGet is great: it has a nice clean GUI and lots of bells and whistles, and it’s small. You should really try it out (they have a Windows port too).
https://software.opensuse.org/package/uget
You can install it with sudo zypper in uget.
A nice download manager for Firefox, built with JavaScript so it’s multiplatform, is DownThemAll:
https://addons.mozilla.org/en-US/firefox/addon/downthemall/

About sites like Rapidgator or Uploaded: those are premium hosting sites, and you need an account to use a download manager with them. MEGA uses strong encryption and does not send decoded files; the decoding is done by your client (most likely a browser), and that’s why you can’t use a download manager with it.

Depending on where you live, your download speed, and your surfing habits, you might need a download manager. I use DownThemAll from time to time, but in principle I don’t have the need for a real download manager.

Thanks for your help, guys, but all those download managers are MISSING an important requirement, the third one:
It should split the download into pieces, download those pieces in parallel, and when done recombine them into the original file.
I searched for “IDM alternative for opensuse”, and came across THIS page.
It downloaded and installed fine; here’s the option I’m talking about:
http://s14.postimg.org/tbmg6mixd/snapshot19.jpg
MAXIMUM NUMBER OF SEGMENTS PER DOWNLOAD: 8
This would have been great, but although I downloaded the free version, it tells me “You have 10 days of trial left”.
What does that mean? Will it stop working after 10 days? Is there something else you can suggest that offers this option and is not trialware?

http://s28.postimg.org/ywm9gbw59/snapshot20.jpg
“Enter key”? It’s like I’m using Windows software again…
If there’s a free open source alternative, please suggest.
Thanks.

What are you talking about? From uGet’s home page:
http://ugetdm.com/features
Multi-Connection* (aka Multi-Segment): up to 20 simultaneous connections PER download - uGet’s Multi-Connection feature also utilizes adaptive segment management which means that when one segment drops out then the other connections pick up the slack to ensure optimal download speeds at all times. This also applies to segments that become drastically slow due to server limitations.

Even DownThemAll splits a file for multi-segment download.

I keep forgetting about it, but KDE has a nice built-in download manager called KGet; it even supports torrents.

Thanks I_A,
I have a high-speed limit of only 10 GB per month; after I’m past the first 10 GB my connection is throttled and downloads get terribly slow, and it does speed up a little if the software creates multiple connections.
You’re right, the documentation link you provided does indeed say that uGet creates up to 20 segments per file, but I don’t see a visual representation of that in the UI.
Take a look at this download:
Scenario 1, downloading with uGet:
http://s23.postimg.org/w4uefqo8r/snapshot21.jpg
There’s no information about how many segments this file is broken into.
Download speed is 7.8 KB/s.

Scenario 2, the SAME file downloading with flareget:

http://s29.postimg.org/iueba47jr/snapshot22.jpg
There is a clear indication that the file has been broken into two segments, and the download speed is 13.7 KB/s.

It looks like uGet is not working. I selected the option to allow 4 connections in uGet, but it behaves the same: no indication of how many segments the file has been broken into, and the download speed is about half that of FlareGet, which makes me feel like it’s not working…
http://s2.postimg.org/8kj7xuq3t/snapshot23.jpg
Is there something I’m doing wrong???

On Sat 11 Apr 2015 07:36:02 AM CDT, johnwinchester wrote:


Hi
You need to look at it at the interface level (snmp, iptraf) and with things
like netstat (to check connections) to verify the real data at the interface…

Then check which application correlates with what the data shows against the
GUI. Just because program X says this and program Y says that doesn’t mean
it’s accurate…
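As a sketch, assuming a reasonably current system where `ss` replaces the older `netstat`, you can count the connections a downloader has actually opened:

```shell
# list established TCP connections together with the owning process
ss -tnp state established

# or the older netstat equivalent; filter for the downloader by name
netstat -tnp 2>/dev/null | grep -i uget
```

If uGet really opened 4 segments, you should see 4 simultaneous connections to the same server, regardless of what its GUI reports.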



On 2015-04-11 07:36, I A wrote:

> Multi-Connection* (aka Multi-Segment): up to 20 simultaneous
> connections PER download […]

aria2c does that, too. It spreads the download over several servers.

However, downloading several simultaneous chunks from the same server is
considered aggressive, so some servers take measures to block this usage
(limiting connections per IP).
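With aria2c that politeness can be tuned directly: `-x` (`--max-connection-per-server`) caps connections to one host, while `-s` still splits the file (the URL is a placeholder):

```shell
# only 2 connections per server, even though the file is split into 8 pieces
aria2c -x 2 -s 8 https://example.com/big.iso
```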



FileZilla and JDownloader should satisfy all your requested feature requirements.
Both support configuring:

  • Multiple streams
  • Downloading file parts out of order and re-assembling. When I studied this years ago, it was accomplished simply by recognizing individual “chunks” by their byte offset from the beginning of the file. So it doesn’t require any special server configuration; it’s all done client-side, unlike torrents, in which files have pre-determined chunks.
  • Pause/Restart. This is based on the same principle: the client knows which data chunks are still missing from their offsets, and so can request just those parts of the file.
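The offset idea can be sketched locally: split a file into byte ranges, then reassemble the pieces in offset order, which is essentially what a segmented downloader does with HTTP Range requests (e.g. `curl -r 0-4999`). The file names here are made up for the demo:

```shell
# make a test file and note its size
printf 'segmented download demo' > original.bin
size=$(wc -c < original.bin)
half=$((size / 2))

# "download" two segments by byte offset
dd if=original.bin of=part1 bs=1 count="$half" 2>/dev/null
dd if=original.bin of=part2 bs=1 skip="$half" 2>/dev/null

# reassemble in offset order and verify the result is identical
cat part1 part2 > rebuilt.bin
cmp -s original.bin rebuilt.bin && echo "reassembled OK"
```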

TSU

I would recommend aria2.
By itself it’s not a good choice for regular download tasks.

But when you equip it with a web UI like this: https://github.com/ziahamza/webui-aria2
it becomes one of the most powerful and flexible downloaders you can have without messing with deep configs.
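A minimal sketch of that setup (6800 is aria2’s default RPC port; serving the UI checkout with any static web server is one option, though depending on the repo layout you may need to serve a subdirectory):

```shell
# start aria2 as a background daemon the web UI can talk to over RPC
aria2c --enable-rpc --rpc-listen-port=6800 --daemon=true

# fetch the web UI and serve it locally, then open http://localhost:8000
git clone https://github.com/ziahamza/webui-aria2
cd webui-aria2 && python3 -m http.server 8000
```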