Any possible solutions to get rid of these bots permanently? I find around 100+ spiders crawling the entire server at any given time!
They ignored my robots.txt too. I ended up putting this at the top of my sites; it's PHP btw, hope it helps:

$checknaughty = $_SERVER['HTTP_USER_AGENT'];
if ($checknaughty == "psbot/0.1 (+http://www.picsearch.com/bot.html)") {
    echo "PISS OFF PSBOT STOP IGNORING robots.txt, you're clearly disallowed now learn to read";
    die();
}
Code (markup):
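One thing to watch out for: that exact-match check stops working the moment psbot bumps its version string. A rough variation (the bot names in the list are just taken from this thread, adjust them to whatever shows up in your own logs) would be a case-insensitive substring match:

// Block any request whose user agent contains one of these substrings.
// Bot names below are assumptions based on this thread; edit as needed.
$naughtyBots = array('psbot', 'gigabot');
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
foreach ($naughtyBots as $bot) {
    if (stripos($userAgent, $bot) !== false) {   // case-insensitive match anywhere in the UA string
        header('HTTP/1.1 403 Forbidden');
        exit('Access denied: this crawler is disallowed in robots.txt.');
    }
}
Code (markup):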
For Gigablast you could also add this to the page:
<meta name="gigabot" content="noindex,nofollow" />
Code (markup):
Gigabot has completely stopped after being disallowed in robots.txt, but psbot still sucks big time, I should say! Thanks for all your comments above.
You can't ban anyone using robots.txt. That's only a suggestion, like saying 'Please don't go here' when you can't actually stop them. Blocking in the script or in .htaccess are probably the only ways to ban them; see the sketch below for the .htaccess route.
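A minimal sketch of the .htaccess approach, assuming Apache with mod_rewrite enabled (the two user-agent substrings are just the bots mentioned in this thread):

# Return 403 Forbidden to psbot and gigabot based on their user agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} psbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} gigabot [NC]
RewriteRule .* - [F,L]
Code (markup):

This keeps the blocking out of your PHP entirely, so the bots get refused before any of your scripts run.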
Picsearch's spiders always respect robots.txt and we will immediately address any problems that you inform us about. Please send an e-mail to info@picsearch.com and we will see that any problems get handled as soon as possible. Please go to http://www.picsearch.com to try our service. Picsearch takes robots.txt seriously and has a short text about psbot on our website at http://www.picsearch.com/menu.cgi?item=Psbot.
Best Regards,
Carl Sarnstrand
Communications Manager, Picsearch