I have a script which I'm using on part of my site, and I think it is stopping Googlebot and other search engines from indexing the pages it runs on. Is there any way I can implement a small check so the script only runs for human visitors and is skipped for Google and other spiders? Example: if the user is human, exec the PHP script; else don't exec it. Or maybe the other way round, I'm not sure. Please help. Thanks
What makes you think Google can't index the pages? Have you ever tried Google's Webmaster Tools? You can enter your site's URLs in there and read the responses, etc.
Showing different pages to users and bots is called cloaking, and sooner or later you will get caught and penalized for it by the search engines.
It's a CPA survey, and I think it is stopping Google from indexing the pages I'm using it on. It's not like I'm cloaking to hide malicious scripting or anything; I'm doing it so Google indexes the pages. My new pages get indexed pretty quickly without CPALead.
If it's JavaScript, Google won't see it, but you mentioned PHP, so I'm confused. Anyway, try this link and enter the URL of the page you're talking about: http://www.ongsono.com/spider_test.htm
if (preg_match("/google/i", $_SERVER['HTTP_USER_AGENT'])) {
    // show whatever you want to show to Googlebot here
} else {
    // show whatever you want to show to a normal visitor here
}

Still, be careful about the cloaking issue here. Also, know that Google has some "radar" bots which don't identify themselves as Googlebot in their user agent (or anywhere else), and I honestly believe they also have human reviewers. The point is that if there's a major difference between the page the normal Googlebot sees and the page the rogue bot sees, you may get penalized.
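Since the user agent string can be faked by anyone, a safer check (the one Google itself recommends) is to verify the visitor's IP with a double DNS lookup: reverse-resolve the IP to a hostname, check that it ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it points back to the same IP. Here's a minimal sketch in PHP; the function name `isRealGooglebot` is just an illustration, and it assumes the real client IP is in `$_SERVER['REMOTE_ADDR']` (behind a proxy or CDN it won't be):

```php
<?php
// Sketch: verify a claimed Googlebot visit by double reverse DNS
// instead of trusting the user agent string alone.
function isRealGooglebot($ip)
{
    // Reverse lookup: genuine Googlebot IPs resolve to a host
    // under googlebot.com or google.com.
    $host = gethostbyaddr($ip);
    if ($host === false || $host === $ip) {
        return false; // no reverse DNS record
    }
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false; // hostname isn't Google's
    }
    // Forward lookup: the hostname must resolve back to the
    // same IP, otherwise the reverse record could be spoofed.
    return gethostbyname($host) === $ip;
}

if (isset($_SERVER['REMOTE_ADDR']) && isRealGooglebot($_SERVER['REMOTE_ADDR'])) {
    // genuine Googlebot
} else {
    // normal visitor, or a bot faking its user agent
}
```

Note the DNS lookups add latency, so in practice you'd cache the result per IP rather than resolving on every request.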