I created the robots.txt file and placed this code in it:
-----------------------
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /

User-agent: Yahoo-slurp
Allow: /

User-agent: Msnbot
Allow: /

User-agent: Bingbot
Allow: /
-----------------------
Everything is working fine, but robots.txt is purely advisory: if a bot decides to ignore it, the file offers no protection at all against malicious crawlers. What command, file, or script can be used to make ALL bots respect the robots.txt file?
Just found the answer to my own question: block the bad bots at the server level instead of relying on robots.txt. I added a block list with a bunch (over 200) of bad-bot names to my site's directory. Still looking for more...
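For anyone looking for a concrete starting point, here is a minimal sketch of that approach using Apache's mod_setenvif (assuming Apache 2.4 and that the directives are allowed in your .htaccess or vhost config; the bot names shown are placeholders, not a real block list):

```apache
# Hypothetical example: tag requests whose User-Agent matches a
# known bad-bot name, then deny tagged requests.
# "BadBot" and "EvilScraper" are placeholder names -- substitute
# entries from your own block list.
SetEnvIfNoCase User-Agent "BadBot"      bad_bot
SetEnvIfNoCase User-Agent "EvilScraper" bad_bot

<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```

Note that this only stops bots that announce themselves honestly in the User-Agent header; a truly malicious bot can spoof any User-Agent, so rate limiting or IP-based blocking may still be needed on top.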