Yahoo indexing bots are really getting annoying. Two days ago there were something like 300 Yahoo bots on my forum at the same time... why aren't 1-2 bots enough, like at Google?
I had around 140 at one time a few days ago. Is it an error or are they really just sending me that many bots? If they are, what a drain on my bandwidth.
Yahoo Slurp is a very greedy spider, and Yahoo usually won't deliver much traffic to a forum anyway. Put the following into your robots.txt:

User-agent: Slurp
Crawl-delay: 15

What this means is that Slurp will still crawl your site, but you'll cut down the number of spiders simultaneously crawling it, and your guest numbers will fall to more reasonable levels. Adjust the Crawl-delay as necessary; 15 is probably a good starting point. Increase it to cut back further on spidering, reduce it to let Slurp crawl faster again. It might take 4-24 hours before you see your robots.txt having an effect on guest numbers.
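For reference, a complete robots.txt along those lines might look something like the sketch below. The file has to sit at the root of your domain (e.g. http://www.example.com/robots.txt, where example.com is just a placeholder), and the second block is optional, it's only there to show that other crawlers keep normal access:

# Throttle Yahoo's crawler; the value is roughly the number of
# seconds Slurp should wait between requests (15 is a starting
# point, tune it up or down for your server)
User-agent: Slurp
Crawl-delay: 15

# All other crawlers: no restrictions
User-agent: *
Disallow: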