Downloading a large number of files automatically?

  • 23-03-2003 1:27pm
    #1
    Closed Accounts Posts: 10,921 ✭✭✭✭


    I'm trying to save a large number of files that are contained within the subdirectories of a particular website. Problem is that whilst the files are small and freely available, there are literally thousands of them to save, and this takes time (not to mention patience!) when you have to click and save every single one of them by hand.

    So I was wondering if there's a program out there that you can tell to save EVERY SINGLE FILE past a particular point in a web URL.

    ******************************************************
    e.g. if you have:

    www.takethis.com/directories/dir1/file1.etc
    www.takethis.com/directories/dir1/file2.etc
    ...
    www.takethis.com/directories/dir2/file1.etc
    ...
    www.takethis.com/directories/dirN/fileN.etc

    then save everything past the root address that is available for download (and, if possible, copy the directory structure onto my hard drive).
    ******************************************************

    So basically what I want is for everything in 'directories' and all its subdirectories to be saved to my hard drive without me having to do the 'legwork'.




    Any suggestions gratefully received.


Comments

  • Closed Accounts Posts: 14,483 ✭✭✭✭daveirl


    This post has been deleted.


  • Closed Accounts Posts: 20,346 ✭✭✭✭KdjaCL


    Try and find those little files in one large file and download that.;)



    kdjac


  • Registered Users Posts: 2,077 ✭✭✭parasite




  • Closed Accounts Posts: 16,339 ✭✭✭✭tman


    copy the URL, open GetRight, press Ctrl+B... sorted.

    use the Ctrl key to select multiple individual files & the Shift key to select all the files between 2 points


  • Registered Users Posts: 683 ✭✭✭TenLeftFingers


    Teleport Pro probably does the same job. You just paste in the address, tell it how many links deep you want to go, and off it goes. You can also tell it to ignore or filter certain file types. You can either create a browsable duplicate of the site on your computer or just get the files.
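
    (Don't have the exact Teleport settings to hand, but the same two knobs - link depth and a file-type filter - exist in command-line mirroring tools too; e.g. with wget, which comes up further down the thread:)

    # -r             recurse into links
    # -l 3           follow links at most 3 levels deep
    # -np            stay below the starting directory
    # -A jpg,pdf     only keep files ending in .jpg or .pdf (-R would ignore types instead)
    wget -r -l 3 -np -A jpg,pdf http://www.takethis.com/directories/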


  • Registered Users Posts: 928 ✭✭✭jabberwock


    WebReaper is like that. Give it an address and it pulls down everything, as far in as you want, onto your machine.


  • Closed Accounts Posts: 5,025 ✭✭✭yellum


    Use the Windows port of wget.

    Use the command wget -r -np www.website.com/directory
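
    For example, something along these lines (using the made-up address from the first post; all the flags are standard wget options):

    # -r      recurse into links
    # -np     never ascend above the starting directory ("no parent")
    # -l inf  no depth limit (plain -r stops at 5 levels deep by default)
    # -w 1    wait a second between requests, to go easy on the server
    # wget recreates the directory layout under /directories/ on your disk
    wget -r -np -l inf -w 1 http://www.takethis.com/directories/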


  • Registered Users Posts: 1,393 ✭✭✭Inspector Gadget


    Do any of these packages support pulling URLs from JS/VB scripts?

    I know it's something that's almost impossible to implement completely, but even if it looked for stuff like "src=" in scripts it would be handy for most rollovers, etc... I know Teleport Pro doesn't bother with any of this, but maybe, just maybe, one of the above does?
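
    (For what it's worth, the "look for src=" heuristic can be faked by hand after the fact - a rough sketch, assuming you already have a wget mirror like the one above plus GNU grep/sed; it only catches absolute URLs, relative ones would still need resolving against the page they came from:)

    # pull anything that looks like src="..." out of the mirrored .js files
    grep -rhoE --include='*.js' "src=['\"][^'\" ]+['\"]" mirror/ \
      | sed -E "s/^src=['\"]//; s/['\"]$//" \
      | grep -E '^https?://' > extra-urls.txt
    # fetch the extracted URLs, recreating their directory structure (-x)
    wget -x -i extra-urls.txt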

    Gadget


  • Registered Users Posts: 6,949 ✭✭✭SouperComputer


    I don't know, but you could try NetAnts, from www.netants.com

    You can right-click on a webpage and select "download all by netants"


  • Closed Accounts Posts: 10,921 ✭✭✭✭Pigman II


    Thanks everyone for your suggestions! I tried http://www.httrack.com/index.php and it did the trick for me.
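
    (For anyone else trying this: HTTrack also comes with a command-line version, and something along these lines seems to do the same job - again using the made-up address from the first post:)

    # -O sets the output folder; the "+..." filter keeps the crawl inside
    # /directories/ so it doesn't wander off around the rest of the site
    httrack "http://www.takethis.com/directories/" -O ./mirror "+www.takethis.com/directories/*"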


  • Registered Users Posts: 2,077 ✭✭✭parasite


    what do i win ? :/

