Page 1 of 2
Results 1 to 10 of 19

Thread: Local copy of web site

  1. #1

    Default Local copy of web site

    Hi

    Does anyone know of an application for making copies of web sites that can be read offline? I've tried using wget, but with very mixed results. Something a bit more reliable would be useful.

    Thanks



    David

  2. #2
    Join Date
    Jun 2008
    Location
    Podunk
    Posts
    26,670
    Blog Entries
    15

    Default Re: Local copy of web site

    Hi
    Firefox should be able to do that for offline reading. Under Preferences > Advanced > Network tab you can configure the cache and offline website data.
    Cheers Malcolm °¿° SUSE Knowledge Partner (Linux Counter #276890)
    SUSE SLE, openSUSE Leap/Tumbleweed (x86_64) | GNOME DE
    If you find this post helpful and are logged into the web interface,
    please show your appreciation and click on the star below... Thanks!

  3. #3
    Join Date
    Jun 2008
    Location
    UTC+10
    Posts
    9,686
    Blog Entries
    4

    Default Re: Local copy of web site

    I remember something called wwwoffle. Fortunately never had to use it.

    Some years ago, I heard or read a talk about some people in South Africa sending cached web content to remote schools on Compact Flash cards by the daily "milk run". That was such a valiant effort. Hopefully they don't have to do much of that nowadays.

  4. #4

    Default Re: Local copy of web site

    Hi there,

    I've actually had pretty good success with wget, and it offers a fair number of options so you can tune it how you like. Perhaps it's just a matter of getting the syntax correct. If you want to browse the mirror you make, then you have to convert the links, which wget will do for you. You can also specify the max number of directories to descend into, an interval between pulling items (so as not to hit the site too hard), etc. What syntax did you try with wget? Maybe we can just tune it so it works well for you.
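
    To make the options above concrete, here is a sketch of the kind of invocation being described (not the original poster's actual command; the URL and numbers are placeholders to tune per site):

    ```shell
    # A sketch of a wget mirroring run; example.com is a placeholder.
    #   --recursive        follow links within the site
    #   --level=5          descend at most 5 directory levels
    #   --convert-links    rewrite links so the local copy is browsable offline
    #   --page-requisites  also fetch CSS, images, etc. needed to render pages
    #   --wait=2           pause 2 seconds between requests (be gentle on the server)
    #   --no-parent        never climb above the starting directory
    cmd="wget --recursive --level=5 --convert-links --page-requisites --wait=2 --no-parent http://example.com/"
    echo "$cmd"   # printed here for illustration rather than executed
    ```

    The `--wait` and `--level` values are the knobs to adjust first if the mirror comes out incomplete or hammers the site.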

    I'm assuming you don't have ftp access, and you just need to pull the content via http. If you do have ftp access, a pretty handy tool I've used is Mirror FTP Tool - Lyceum

    Cheers,
    LewsTherin

  5. #5
    Join Date
    Jun 2008
    Location
    Earth - Denmark
    Posts
    10,730

    Default Re: Local copy of web site

    On 02/05/2011 12:06 AM, billingd wrote:
    > I've tried using wget, but with very mixed results.


    i've not done it with wget in *years* but when i did it was sometimes
    a good bit of work to figure out how to get the command line switches
    *just*right* for any particular site/task but once done it was just
    let'er rip...done, every time perfect--until something changed..

    but, i must admit that that was before sites were mostly just a series
    of static pages....now they are so very complex, with css and (for
    example) database driven pages built "on the fly" by scripts and black
    magic...sometimes with double magic bouncing data in from several
    different locales all at the same time...and scripts galore (not to
    mention php and ruby in the sky with diamonds! ;-)

    otoh, i'm surprised that wget has not kept up with the advances in web
    site trickery....if they have, setting it up must take some thought
    and skill....and, luck.

    --
    DenverD
    CAVEAT: http://is.gd/bpoMD
    [NNTP posted w/openSUSE 11.3, KDE4.5.5, Thunderbird3.0.11, nVidia
    173.14.28 3D, Athlon 64 3000+]
    "It is far easier to read, understand and follow the instructions than
    to undo the problems caused by not." DD 23 Jan 11

  6. #6
    Join Date
    Nov 2009
    Location
    West Virginia Sector 13
    Posts
    15,703

    Default Re: Local copy of web site

    Depends on how the HTML is written. There should not be many problems if relative paths are used and the content is static. But if the content is generated on the fly with PHP or other scripts, or portions or all of the data come from a database or multimedia server, it may not be possible to get a locally running HTML-only setup.
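
    The "relative paths" distinction above is exactly what a mirroring tool has to reason about. A minimal sketch (the sample hrefs are invented for illustration) of how links can be classified:

    ```python
    # Classify hrefs as relative (safe to serve from a local mirror)
    # or absolute (may point back to the live server).
    # Uses only the Python standard library.
    from urllib.parse import urlparse

    def is_relative(href: str) -> bool:
        """True if the link has no scheme and no host, i.e. it is site-relative."""
        parts = urlparse(href)
        return not parts.scheme and not parts.netloc

    # Invented sample links:
    for href in ["style.css", "/images/logo.png", "http://example.com/page.php?id=3"]:
        kind = "relative" if is_relative(href) else "absolute"
        print(f"{href}: {kind}")
    ```

    Tools like wget's `--convert-links` effectively rewrite the absolute cases into relative ones; dynamically generated URLs (the PHP/database case) are what break this.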

  7. #7

    Default Re: Local copy of web site


    If nothing else Firefox can do this with right-click: Save Page As. Open
    the page from your Desktop (or wherever you saved it).

    I've also had family use this Firefox Add-On for longer-term saving and
    categorizing, though it wasn't necessarily made for this purpose:

    http://amb.vis.ne.jp/mozilla/scrapbook/

    Good luck.

    On 02/04/2011 06:36 PM, gogalthorp wrote:
    >
    > Depends on how the HTML is written. There should not be much problems if
    > relative paths are used and the content is static. But if the content is
    > generated on the fly and uses PHP or other scripts, portions or all the
    > data is from a database or multimedia server. It may not be possible to
    > get a locally only running HTML setup.


  8. #8
    Join Date
    Jun 2008
    Location
    Netherlands
    Posts
    25,002

    Default Re: Local copy of web site

    I used HTTrack for this purpose. The package httrack is in the OSS repo for openSUSE 11.2 (and most probably also for 11.3).
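
    For reference, a basic HTTrack run looks something like this (a sketch, not from the poster above; example.com and ~/mirror are placeholders):

    ```shell
    # Sketch of a simple httrack invocation:
    #   -O sets the output directory for the mirror
    #   the "+example.com/*" filter keeps the crawl on the original site
    cmd='httrack "http://example.com/" -O ~/mirror "+example.com/*"'
    echo "$cmd"   # printed here for illustration rather than executed
    ```

    HTTrack also ships an interactive wizard (run `httrack` with no arguments), which may be easier than remembering the filter syntax.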
    Henk van Velden

  9. #9
    Join Date
    Jul 2010
    Location
    Adelaide, Australia
    Posts
    963

    Default Re: Local copy of web site

    Quote Originally Posted by hcvv View Post
    I used HTTrack for this purpose. The package httrack is in the OSS repo for openSUSE 11.2 (and most probably also for 11.3).
    httrack is in the packman repo for 11.3.
    Desktop: Gigabyte GA-Z270-HD3 - Core i7 7700K - openSUSE Leap 42.2 KDE
    Laptop: HP EliteBook 8770W - Core i7 3940XM - openSUSE Leap 42.2 KDE

  10. #10
    Join Date
    Jun 2008
    Location
    Netherlands
    Posts
    25,002

    Default Re: Local copy of web site

    Quote Originally Posted by ah7013 View Post
    httrack is in the packman repo for 11.3.
    For 11.2 it is also in Packman and not in OSS.
    Sorry for the confusion.
    Henk van Velden

