Crawl Delay using Robots

Discussion in 'Search Engine Optimization' started by akhilesh243, Nov 19, 2007.

  1. #1
    I found a very unusual robots.txt file:
    User-agent: *
    Crawl-delay: 5.00
    Can anyone explain what it is?
     
    akhilesh243, Nov 19, 2007 IP
  2. astup1didiot (Notable Member)
    #2
    Crawl-delay sets the minimum delay, in seconds, between successive requests from a crawler. You can use it to reduce the stress crawlers put on your server.
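    A couple of caveats: Crawl-delay is a non-standard extension, so support varies by bot. As far as I know, Yahoo! Slurp and MSNbot honor it, while Googlebot ignores it (Google's crawl rate is set in Webmaster Tools instead). And the 5.00 in the original post just means 5 seconds.

    For illustration, here is a minimal Python sketch of how a polite crawler might honor the directive (example.com and the URL list are made up; urllib.robotparser's crawl_delay() needs Python 3.6+):

    import time
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # crawl_delay() returns None when no Crawl-delay applies; default to 1 second.
    delay = rp.crawl_delay("*") or 1.0

    for url in ["https://example.com/a", "https://example.com/b"]:
        if rp.can_fetch("*", url):
            # ... fetch the page here ...
            time.sleep(delay)  # wait out the requested gap before the next hit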
     
    astup1didiot, Nov 19, 2007 IP
  3. Styxbowl20v (Active Member)
    #3
    Sort of a pointless directive. I mean, how often do you NOT want your site being crawled? You'd have to have a really weak server to worry about having too many spiders on your site.
     
    Styxbowl20v, Nov 19, 2007 IP
  4. astup1didiot (Notable Member)
    #4
    Or you could have 1,000,000+ pages of content, which would make a crawl delay necessary if you enjoy server stability ;)
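    To put rough numbers on it: with Crawl-delay: 5, one well-behaved crawler is capped at 86,400 / 5 = 17,280 requests a day, so a single full pass over 1,000,000 pages would take roughly 58 days. Without the delay, an aggressive spider can hit you as fast as your server will answer.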
     
    astup1didiot, Nov 19, 2007 IP
  5. Styxbowl20v (Active Member)
    #5
    OK, fair enough :eek:
     
    Styxbowl20v, Nov 19, 2007 IP
  6. astup1didiot (Notable Member)
    #6
    Remember, what's useless to one person is gold to another :D
     
    astup1didiot, Nov 19, 2007 IP