Hey, I would like to add a new row to a table every time a visitor comes to my site. I want to store the referral URL, the search-engine keyword (if available), IP, country and more. I built something that does what I need; the problem is that every time Google or other spiders crawl my site, a new row is added. How can I tell a robot apart from a real visitor with PHP? Thanks
You need a text file with spider IPs in it (Google "list of spider IP addresses" or similar). Read that text file and convert it to an array, and get the visitor's IP before the database-insert segment. Then, right before the insert, you can do: if (!in_array($user_ip, $spider_array)) { ... } and don't forget the closing brace AFTER the DB insert segment. This basically says: if the visitor's IP is in the array from your spider IP list, don't add the row. Hope that helps.
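Something along these lines (just a rough sketch; spider_ips.txt and log_visit() are example names standing in for your own list file and your DB-insert code):

<?php
// Load the spider IP list into an array (one IP per line).
$spider_array = array_map(
    'trim',
    file('spider_ips.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
);

// The visitor's IP address.
$user_ip = $_SERVER['REMOTE_ADDR'];

// Grab the referrer if the browser sent one.
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

// Only log visitors whose IP is NOT on the spider list.
if (!in_array($user_ip, $spider_array, true)) {
    log_visit($user_ip, $referrer); // hypothetical wrapper around your INSERT
}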
One way to do it would be to have mod_rewrite redirect "robots.txt" to "robots.php" in the background, so you can start a session when "robots.txt" is requested by well-behaved robots like Google. Then, in the script that does the logging, you just check for the existence of the $_SESSION variable you set in robots.php.
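Roughly like this (a sketch assuming Apache with mod_rewrite enabled; robots.php and the $_SESSION['is_robot'] flag are just example names):

<?php
// robots.php -- served in place of robots.txt via an .htaccess rule such as:
//   RewriteEngine On
//   RewriteRule ^robots\.txt$ robots.php [L]
// Well-behaved crawlers request robots.txt before crawling, so anything
// hitting this script gets flagged in its session.
session_start();
$_SESSION['is_robot'] = true;

// Return a normal-looking robots.txt so crawlers see what they expect.
header('Content-Type: text/plain');
echo "User-agent: *\n";
echo "Disallow:\n";

Then in the script that does the logging:

<?php
session_start();
if (empty($_SESSION['is_robot'])) {
    // No robot flag in this session, so treat it as a real visitor and log it.
}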
joebert, your solution is so much more elegant. Kudos lol - and I just saw your blog post too. I have bookmarked it for future use, many thanks.