Copy website - KDE

I installed httrack. It opens a new browser tab, then stalls and won’t continue.

Also, I found you can do this with wget.

Need help here.

View>View Document Source (Ctrl-U) allows you to save the source code of a page.
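Saving a single page can also be done from the command line. A minimal sketch, assuming GNU wget is installed (the function name and example.com URL are placeholders, not from the thread):

```shell
# Hypothetical helper: save one page plus the CSS/images it needs,
# rewriting links so the local copy works offline.
save_page() {
    wget --page-requisites \
         --convert-links \
         --adjust-extension \
         "$1"
}
# Usage: save_page https://example.com/somepage.html
```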

FYI, this site hasn’t been visited in 6 months.

I need to download the whole site at once.

Are you able to post details?

When running httrack, have you tried modifying the options? E.g. more or fewer simultaneous threads, resource allocation, etc.

If you’ve tried wget and run into a problem, describe it in detail.

I’ve used both for different circumstances.
If I can’t download the entire website in one session, or need fine-grained control of the options, I’ll use httrack, although it can take more time to set up and run than wget.
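For what it’s worth, httrack can also be driven entirely from the command line, bypassing the browser interface. A sketch, assuming the httrack package is installed (the wrapper name, URL, and output directory are placeholders):

```shell
# Hypothetical wrapper around a typical httrack invocation:
#   $1 = start URL, $2 = output directory.
# -O sets the output path; the "+" filter keeps the crawl on the
# starting URL's subtree; -v prints progress to the terminal.
httrack_mirror() {
    httrack "$1" -O "$2" "+$1*" -v
}
# Usage: httrack_mirror "https://example.com/" ./mirror
```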

I use wget if I want something simple and fast, particularly if I’m using the same configuration against multiple URLs.
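For reference, the usual one-shot wget mirroring flags look something like this. A sketch, assuming GNU wget (the helper name and example.com are placeholders, not from the thread):

```shell
# Hypothetical helper wrapping the common wget mirroring flags:
#   --mirror            recursive download with timestamping
#   --convert-links     rewrite links for local browsing
#   --adjust-extension  append .html where the server omits it
#   --page-requisites   also fetch the CSS/images each page needs
#   --no-parent         never ascend above the start directory
wget_mirror() {
    wget --mirror --convert-links --adjust-extension \
         --page-requisites --no-parent "$1"
}
# Usage: wget_mirror https://example.com/
```

The same function can then be reused unchanged against multiple URLs, which is exactly the “same configuration against multiple URLs” case.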


Be aware of the pitfalls here.

Downloading a “whole web site” is possible if the pages are linked to each other by URLs: the download tool can then follow those links (and skip pages it has already found, since many links on many pages can point to the same page).

But nowadays many web sites are very dynamic, and when scripts (e.g. JavaScript) decide which URLs to request from the server, depending on all sorts of parameters, the tool may not be able to trigger those requests.

I installed httrack through YaST > Software Management.
Click on the desktop menu icon, search for httrack, and click on HTTrack Website Copier.
A new Firefox tab opens with the address linux-xp0f:8080, then stalls → connection timed out.

I found a website showing a long sequence of commands to download a whole website.

This looks like the correct way to download a website. However, I’ll ask the owner for permission first before downloading the whole site.

Is the above link correct? Does it need anything more?

I’ll keep that in mind.

Whatever Linux bug was causing it seems to have been fixed.

Thanks to all for the help. :slight_smile: