wget with subdirectories

  • 07-01-2008 9:38pm
    #1
    Registered Users, Registered Users 2 Posts: 719 ✭✭✭ Fionn101


    Can someone tell me the command-line option, please, to get the following site downloaded to my local machine:
    http://www.linux.org/lessons/beginner/toc.html

    I'd like it to download the HTML pages linked from this page, but when I try just
    wget http://www.linux.org/lessons/beginner/toc.html it only downloads the one HTML file.

    Should I first be trying to get the links from the HTML page into a txt file to feed into wget (a rough sketch of that approach is at the end of this post), or can I download the site from a directory (with all files) point of view?

    thanks in advance

    Fionn
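
    A rough sketch of the list-file approach mentioned above, assuming the links on toc.html are plain relative href="..." attributes pointing at pages in the same lessons/beginner/ directory (neither assumption verified against the site):

    # Grab the table of contents, pull out the .html links, turn them into
    # absolute URLs, and feed the list to wget.
    wget -q -O toc.html http://www.linux.org/lessons/beginner/toc.html
    grep -o 'href="[^"]*\.html"' toc.html \
        | sed 's/href="//; s/"$//; s|^|http://www.linux.org/lessons/beginner/|' > links.txt
    wget -i links.txt

    The recursive approach suggested in the replies below is simpler, but this is roughly what the list-file idea would look like.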


Comments

  • Registered Users, Registered Users 2 Posts: 2,426 ✭✭✭ressem


    wget -rl 4 http://www.linux.org/lessons/beginner/toc.html

    Downloads the page and all files on it to a depth of 4 links.
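
    If the crawl starts wandering up into other parts of linux.org, adding --no-parent (a standard wget option, not something mentioned in the thread) keeps it inside the lessons/beginner/ directory. A variant of the command above, not tested against this particular site:

    wget -r -l 4 --no-parent http://www.linux.org/lessons/beginner/toc.html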


  • Registered Users, Registered Users 2 Posts: 719 ✭✭✭Fionn101


    Thanks for the reply, ressem. I tried this but I get a permission denied error (fair enough).

    Today I was thinking it would perhaps be best if I visit each page on their site and save off the text. That way I get the end result I'd like (an offline copy of this howto), and they (linux.org) would get to serve out their fair share of Google AdWords and keep the site alive.

    A win-win situation, I reckon. (ahem..)
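
    For what it's worth, permission denied errors from wget are often down to robots.txt exclusions or the server disliking wget's user agent rather than real access restrictions. That is only a guess as to the cause here, but the usual workaround looks like this; -e robots=off and --user-agent are both standard wget options:

    wget -r -l 4 -e robots=off --user-agent="Mozilla/5.0" http://www.linux.org/lessons/beginner/toc.html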


  • Registered Users, Registered Users 2 Posts: 1,109 ✭✭✭Skrynesaver


    If you use the -k (--convert-links) switch, the links in the local copy are rewritten for local browsing.
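
    Putting the suggestions in this thread together, a command along these lines should fetch the howto and rewrite its links for offline reading. It is untested against this particular site, and -p (--page-requisites, which also pulls in images and stylesheets) is an extra option not mentioned above:

    wget -r -l 4 -k -p --no-parent http://www.linux.org/lessons/beginner/toc.html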


  • Registered Users, Registered Users 2 Posts: 719 ✭✭✭Fionn101


    Worked a charm, thanks.

    Only annoying part was stripping out the Google syndication ads.

    Thanks all
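
    For anyone doing the same cleanup, one rough way to strip the ad markup from the saved pages is to delete the lines that reference Google's ad server. This assumes the ad snippet sits on lines that literally mention googlesyndication (typical for the AdSense code of that era, but not checked against these files); the surrounding script tags and google_ad_* variable lines would still need tidying by hand:

    # Remove every line mentioning googlesyndication from the saved HTML files.
    find . -name '*.html' -exec sed -i '/googlesyndication/d' {} +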

