I found a very unusual robots.txt file:

User-agent: *
Crawl-delay: 5.00

Can anyone explain what it is?
Crawl-delay sets the minimum delay, in seconds, between successive requests from a crawler. You can use it to reduce the load that crawlers place on your server.
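For reference, here is a minimal sketch of how a well-behaved crawler might honor that directive using Python's standard urllib.robotparser. The site URL, paths, and user-agent string are placeholder assumptions, and note that the standard-library parser only accepts whole-number delays, so a fractional value like 5.00 may come back as None.

import time
import urllib.robotparser
import urllib.request

# Hypothetical site and user-agent, purely for illustration.
SITE = "https://example.com"
USER_AGENT = "MyCrawler"

# Fetch and parse the site's robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# crawl_delay() returns the Crawl-delay value for this user-agent,
# or None if it is absent (or could not be parsed).
delay = rp.crawl_delay(USER_AGENT) or 0

for path in ["/page1", "/page2", "/page3"]:
    if rp.can_fetch(USER_AGENT, SITE + path):
        urllib.request.urlopen(SITE + path)
    # Wait at least the requested delay between successive requests.
    time.sleep(delay)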
Sort of a pointless directive. I mean, how often do you NOT want your site being crawled? You would have to have a really weak server to be worried about having too many spiders on your site.
Or you could have 1,000,000+ pages of content, which would then require a crawl delay if you enjoy server stability.