I am planning to start a website built around a collection of useful, regularly updated information: for example, freelance projects for SEO professionals. So far I have been gathering the information through ordinary search-engine queries, but that is time-consuming and not always effective. The site's key value is certainly the information itself. My question is: what tools, manuals, or tips are there for efficiently finding the required information on the Internet? Is there any way to make the process more or less automated? Any help is much appreciated.
Oh man, you have got to get the book Perl & LWP (O'Reilly). It's a thin book that walks you through writing a Perl program that basically behaves like a person surfing the Web. As it surfs whatever site you specify, it can save the pages, parse them, log data to a database, and so on. You could run that program as a cron job every hour, and it would do all the work for you (though there is a substantial up-front investment of time). The other good option -- one that a lot of price comparison sites use -- is to get an agreement with the sites you need and pull an XML feed from each. -Tony
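To give a feel for what Tony is describing, here is a minimal sketch of that kind of LWP program: fetch a page, extract its links, and print them, ready to be wired up to a database and a cron entry. This assumes the LWP::UserAgent and HTML::LinkExtor modules are installed; the URL and file paths are placeholders, not real sites.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;
use HTML::LinkExtor;

# Placeholder: substitute the page you actually want to monitor.
my $url = 'http://example.com/projects.html';

my $ua = LWP::UserAgent->new(
    agent   => 'Mozilla/5.0',   # present ourselves as an ordinary browser
    timeout => 30,
);

my $response = $ua->get($url);
die 'Fetch failed: ', $response->status_line, "\n"
    unless $response->is_success;

# Collect every <a href="..."> on the page, resolved against $url.
my @links;
my $extor = HTML::LinkExtor->new(
    sub {
        my ($tag, %attr) = @_;
        push @links, $attr{href} if $tag eq 'a' && $attr{href};
    },
    $url,
);
$extor->parse( $response->decoded_content );

print "$_\n" for @links;

# To run it every hour as a cron job, a crontab entry like this
# (paths are hypothetical) would do:
#   0 * * * * /usr/bin/perl /home/you/fetch_projects.pl >> /home/you/projects.log
```

From there, the parsing callback is where the site-specific work goes: instead of just collecting links, you would match the page fragments you care about and insert them into a database (for example via DBI).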