Well, there was a post today on the Google blog for webmasters stating that some webmasters who try to stop spam bots are unknowingly blocking Googlebot from crawling their websites as well. Have a look at the post here: googlewebmastercentral.blogspot.com/2008/02/7-must-read-webmaster-central-blog.html by Susan Moskwa
Really, this is not a good technique for avoiding spam. Doing it this way will stop your site from being indexed. There is an IP blocking approach you can use instead to keep spam IP addresses out.
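For example, if your site runs on Apache and .htaccess overrides are allowed (that's an assumption about your setup, and the IP below is just a placeholder), you could block a known spam IP with something like:

# .htaccess - block a single spam IP, example address only
Order Allow,Deny
Allow from all
Deny from 192.0.2.1

That way the spammer's IP is refused at the server level and Googlebot is not affected at all.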
Last month I unknowingly blocked Googlebot from accessing my site... I was using these lines in my robots.txt file to keep URLs containing "%" out of the index: Disallow: /*%* and Disallow: /*% ... I also have some URLs containing the "$" symbol, so I thought I would just add another two lines with "$" to block those URLs too. Within 3 days I noticed in Webmaster Tools that Googlebot was blocked... Then I removed those "$" lines...
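That is most likely because "$" has a special meaning in Google's robots.txt pattern matching: it anchors the end of a URL, while "*" matches any sequence of characters. A rough sketch of the difference (the "$" lines are my guess at what you added):

# These only match URLs that actually contain a "%" character
Disallow: /*%*
Disallow: /*%

# But "$" is an end-of-URL anchor in Google's robots.txt syntax,
# so a line like this matches every URL and blocks the whole site:
Disallow: /*$

As far as I know there is no way to escape a literal "$" in robots.txt, so for those URLs you may need a different approach, such as a noindex meta tag on the pages themselves.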