hey guys, I have a small amount of experience with PHP and a moderate amount of programming besides. I need to create or find a script that will grab every URL in a SERP. Essentially, I want to enter a particular search phrase and get the URL list dumped into a .csv file so that I can parse it by hand. I can't believe it would be all that hard. Can someone point me in the right direction? thanks, ~5yb
I wouldn't do it with PHP or any web application. Just create a Windows application that does an HTTP request, pulls out the URLs, and creates a CSV.
It shouldn't be a problem to do with PHP, but brentl is correct that an actual application would be the better solution. The reason is that an application gives you much better error handling and file management control. You'd know instantly if it failed, whereas a PHP script served over the web would most likely fail silently or just not spit out the CSV at all on error.
You could try the PHP command-line interface (CLI) if you have shell access on a Linux or FreeBSD server, or even on your own home computer. A CLI script sidesteps the silent-failure problem above, since it can print errors straight to the terminal and exit with a non-zero status.
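Something like this would get you started. It's a minimal sketch, not a drop-in solution: the search endpoint is a made-up placeholder, the href regex is deliberately crude, paging isn't handled, and results.csv is just an assumed output name. Also be aware that the big search engines generally forbid scraping in their terms of service, so check that first or look for an official API.

<?php
// Minimal sketch (PHP CLI): fetch a results page, pull every absolute URL
// out of the HTML, and dump the list to results.csv.
// ASSUMPTIONS: the search endpoint below is a placeholder, the regex is a
// crude catch-all for href attributes, and no paging is handled.

$phrase = isset($argv[1]) ? $argv[1] : 'my search phrase';
$url    = 'https://search.example.com/?q=' . urlencode($phrase); // placeholder endpoint

$html = file_get_contents($url);
if ($html === false) {
    // On the CLI we can report the failure loudly instead of dying silently.
    fwrite(STDERR, "HTTP request failed\n");
    exit(1);
}

// Grab every absolute URL that appears in an href attribute.
preg_match_all('/href="(https?:\/\/[^"]+)"/i', $html, $matches);
$links = array_unique($matches[1]);

// One URL per row in the CSV.
$out = fopen('results.csv', 'w');
foreach ($links as $link) {
    fputcsv($out, array($link));
}
fclose($out);

echo count($links) . " URLs written to results.csv\n";

Save it as, say, grab_urls.php and run it as: php grab_urls.php "my search phrase". You'll still want to filter the CSV by hand, since a catch-all href regex picks up navigation links and ads along with the actual results.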
You could try PHP if you know it better than other languages, though I'd recommend learning some others on the side too. Your application could also be a console one, since it only has to write the output into a file, so you'd only need to learn string handling and the network functions to some extent. -ARonald