Is there any way to quickly scrape the first 1000 results of a search query and export all the URLs? For example, I would search "Digitalpoint" and all 1000 URL results for that query would be exported. Is there any software that does this?
I have been looking for something similar for a long time. I would like to get the titles of all sites listed in a [G] search engine for a certain keyword and export them to a txt file. For example, from http://www.google.com/search?hl=en&rlz=1T4GGIH_enUS241US242&q=allintitle:cars&btnG=Search all of the titles would be exported to a list.txt file. If you are a programmer and can do this, PM me with a quote. Thanks,
You should grab a freelance programmer. This would be an extremely easy thing to write in PHP using cURL, as long as G allows the connection and search. cURL can grab the entire source of the page, and with PHP you can strip out whatever you want from it. It's also much lighter than you'd think. I can't imagine this would cost much at all.
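To give a rough idea of the "grab the source, strip out what you want" approach described above, here is a minimal sketch in Python rather than PHP (same idea: fetch the page, then parse links and titles out of the raw HTML). Note the assumptions: search engines change their markup often and typically block or rate-limit automated queries, so this is illustrative only, not a ready-made scraper for any particular site.

```python
# Sketch: extract (url, anchor text) pairs from raw HTML.
# In a real run you would first fetch the page source, e.g. with
# urllib.request.urlopen(...) -- the fetching step is omitted here.
from html.parser import HTMLParser


class LinkTitleExtractor(HTMLParser):
    """Collect (url, anchor text) pairs from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []            # list of (url, title) tuples
        self._current_href = None  # href of the <a> tag we are inside, if any
        self._current_text = []    # text fragments collected inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Only keep absolute http(s) links, like external search results.
            if href and href.startswith("http"):
                self._current_href = href
                self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            title = "".join(self._current_text).strip()
            self.links.append((self._current_href, title))
            self._current_href = None


def extract_links(html):
    """Return all (url, title) pairs found in the given HTML string."""
    parser = LinkTitleExtractor()
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    sample = '<div><a href="http://example.com">Example Site</a></div>'
    for url, title in extract_links(sample):
        print(url, "-", title)   # writing these lines to list.txt is trivial
```

From there, looping over result pages and appending each title or URL to a text file is a few more lines, which is why a freelancer quote for this should be low.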