I have a large list of image URLs and I need to download them automatically. The URLs look like:
www.samplesite.com/image_1.jpg
www.samplesite.com/image_2.jpg
www.samplesite.com/image_3.jpg
www.samplesite.com/sub_category/image_1.jpg
www.samplesite.com/sub_category/image_2.jpg
www.samplesite.com/sub_category/image_3.jpg
etc. I have around 8,000 URLs, so downloading each one manually is not an option. Can someone suggest software or another way to download them all, preferably keeping the catalog structure (i.e. for the last three rows it would create a folder "sub_category" and put those images there)?
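If a short script is an acceptable answer, here is a minimal Python sketch of the idea: it reads the URLs one per line from a text file and recreates each URL's path under a local output folder. The file name urls.txt and the downloads folder are just placeholder assumptions, and there is no retry or rate limiting here.

```python
import os
import urllib.parse
import urllib.request

# Read the URL list, one URL per line (urls.txt is an assumed file name).
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # urlparse needs a scheme; prepend one if the list omits it.
    if not url.startswith(("http://", "https://")):
        url = "http://" + url
    # e.g. "sub_category/image_1.jpg" -> downloads/sub_category/image_1.jpg
    path = urllib.parse.urlparse(url).path.lstrip("/")
    local_path = os.path.join("downloads", path)
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    try:
        urllib.request.urlretrieve(url, local_path)
        print("saved", local_path)
    except Exception as e:
        print("failed", url, e)
```

Run it from the folder containing urls.txt; the sub_category folders are created automatically because the directory part of each URL path is kept.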
DownThemAll for Firefox might be worth a look: https://addons.mozilla.org/en-US/firefox/addon/201/ ---- Those are example links.
I use http://www.httrack.com/ - basically it's a reverse-FTP-style bot/scraper, and I think there's an image retrieval setting built in. I mainly used it to retrieve all the IMDb movie and TV show pages listed on that site (all 300k URLs, lol) for a movie/TV engine I'm currently programming. The software is free, stable and well supported. ROOFIS
It came to my mind as well, though I don't remember if it handles multiple links. If memory serves, it doesn't allow downloading images from multiple links at once. Plus, regarding the OP's question, if you have to choose images it can be a lot more time consuming to identify the exact image you want once the software has downloaded everything from a site in one go. Remember that there are images for everything, from buttons and home screens up to banners and whatnot. And since you want to save images, their URLs won't be stable (or they'd all be the same image), so I don't think you can automate this process.
Another downloader worth mentioning is Download Accelerator Plus (DAP), which has a free version as well as a paid one. The free version does a good job: just input your links and it queues them up. It also detects broken links, can be used to upload and send files, and has many more functions. Give it a try, you may just like it!