I want to write a desktop programme which will automatically crawl and save certain webpages from the internet for me. I want to ask which programming language I should write this kind of software in. I want to run the software from my desktop computer, not on my website server. So what language should I write it in...?
I'd definitely go with Visual Basic... I believe IE comes with a programmable interface object (the InternetExplorer automation object) which can be incorporated into a Visual Basic project...
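For illustration, here is roughly what driving that automation object looks like. This is a sketch in Python via the pywin32 package (quick to try without setting up a VB project); from Visual Basic you would use the same InternetExplorer.Application COM object. The URL and output filename are placeholders.

```python
# Sketch: drive the IE automation object to load a page and save its HTML.
# Assumes Windows with IE available and pywin32 installed.
import time
import win32com.client

ie = win32com.client.Dispatch("InternetExplorer.Application")
ie.Visible = False
ie.Navigate("http://example.com")             # placeholder page to save
while ie.Busy or ie.ReadyState != 4:          # 4 == READYSTATE_COMPLETE
    time.sleep(0.5)
html = ie.Document.documentElement.outerHTML  # rendered markup of the page
with open("saved_page.html", "w", encoding="utf-8") as f:
    f.write(html)
ie.Quit()
```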
Visual Basic != good enough. You might wanna use PHP from a localhost, or you could try out some regex with C/C#/C++ or Assembly. My choices. I would've told you to go with binary coding, but you're probably gonna die young anyways.
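If you do go the roll-your-own route, the fetch-plus-regex idea looks something like this. Sketched in Python for brevity (the same approach carries over to C#/C++); example.com is a placeholder, and for anything serious a real HTML parser beats regex.

```python
# Sketch: fetch a page and pull out links with a (naive) regex.
import re
import urllib.request

html = urllib.request.urlopen("http://example.com").read().decode("utf-8", "replace")
links = re.findall(r'href="([^"]+)"', html)  # crude link extraction
print(links)
```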
If you go with Visual Basic, you can download the Express Edition from the Microsoft website; it's basically a cut-down version of Visual Studio. For a desktop PHP application, the best place to start is http://gtk.php.net/. Hope this helps.
Simply wrap the GNU utility wget in a BASH script, or run it as a cron job if you wish to automate it. Otherwise, just run wget from the command line. If you're stuck with the common legacy OS everyone loves and respects, see the Windows port of wget. cheers, gary
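For what it's worth, a rough sketch of wrapping wget in a script is below. It's written in Python here rather than BASH so the same script also works with the Windows port of wget; the URLs and save directory are placeholders. Point cron (or Task Scheduler on Windows) at it to automate.

```python
# Sketch: wrap the wget command so it can be run by hand or on a schedule.
# Assumes wget is installed and on the PATH.
import subprocess

PAGES = [
    "http://example.com/page1.html",  # placeholder URLs of pages to save
    "http://example.com/page2.html",
]

for url in PAGES:
    # -p  fetch page requisites (images, CSS) so the copy displays offline
    # -k  convert links so the saved copy browses locally
    # -P  directory to save into
    subprocess.run(["wget", "-p", "-k", "-P", "saved_pages", url], check=True)
```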