Hi, I am looking for someone who can code a script that does the following: search Google for a keyword, then save all the URLs the search returns into a MySQL database. For example, if I searched for the term "toys", all the URLs returned for that search would be stored. The script would also have to let me specify how many pages deep to go into the search results. I would like the following information saved into the database: URL, Date Added. Is this something that could be done? Is there an easier way of doing this? If you could do this, let me know and we can work something out.

EDIT: After looking around a bit, something like this is what I am after, except not displaying the results, just dumping them into a database or even a flat text file: http://goohackle.com/scripts/google_parser.php
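The storage side of this is the easy part. Here's a rough Python sketch of the parse-and-store half only, under my own assumptions: the link regex is a stand-in (Google's real markup changes often and would need a proper parser), and I use sqlite3 so the sketch runs standalone; for MySQL you'd swap in a connector like mysql.connector and use `INSERT IGNORE` instead of `INSERT OR IGNORE`. Actually fetching the result pages, and Google's limits on doing so, is the hard part, as the replies below explain.

```python
import re
import sqlite3
from datetime import datetime, timezone

# Assumption: a simple regex for absolute http(s) links in result-page HTML.
# Google's real markup is messier and changes without notice.
LINK_RE = re.compile(r'<a\s+href="(https?://[^"]+)"', re.IGNORECASE)

def extract_urls(html):
    """Pull absolute http(s) URLs out of a chunk of result-page HTML."""
    return LINK_RE.findall(html)

def save_urls(conn, urls):
    """Insert each URL with a date-added timestamp; skip duplicates."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS results ("
        " url TEXT PRIMARY KEY,"
        " date_added TEXT NOT NULL)"
    )
    now = datetime.now(timezone.utc).isoformat()
    for url in urls:
        # sqlite syntax; MySQL equivalent is INSERT IGNORE
        conn.execute(
            "INSERT OR IGNORE INTO results (url, date_added) VALUES (?, ?)",
            (url, now),
        )
    conn.commit()

# Usage: parse a fake one-link results page and store what we find.
sample = '<a href="http://example.com/toys">Toys</a> <a href="/nav">skip</a>'
conn = sqlite3.connect(":memory:")
save_urls(conn, extract_urls(sample))
```

Going X pages deep would just mean repeating the fetch with Google's paging parameter and feeding each page through `extract_urls`, but that's exactly the part the rate limits make unreliable.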
I could do this, but you need to keep in mind that Google will limit you to a certain number of searches a day. As a believer in white-hat practices, I will not be providing a solution (proxy or otherwise) that gets around this. Dan
Google's webmaster guidelines prohibit automated scripts/software, and a site could get banned for this.
They USED to have a SOAP API to do this with, but they dropped it without warning. The Alexa API (which charges a small fee per search) will give you the same thing, but obviously without Google's algorithm. I wish I had the time; I'd say PM me for a quote, but I'm swamped right now. But if you don't need Google's precision, the Alexa search is pretty good. My Amazon web service fee is something like 1 cent per month.
Of course they do, that's WHAT they do. They scrape and analyze, etc. They just take measures to keep us from doing the same thing... and of course, they're the billionaires, and I'm still happy I got my car paid off. BUT they will block scrapers if they detect them, and that makes scraping Google unreliable. So I pay the penny to Amazon. It's easier and more reliable.