
Googlebot can't access my site (robots.txt)

Discussion in 'robots.txt' started by yourarnav, Sep 25, 2013.

  1. #1
    Hello people, yesterday my site went down for 7-8 hours, and today I got this message from Google:

    Over the last 24 hours, Googlebot encountered 146 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 9.6%.

    Here's my robots.txt - micromkv.com/robots.txt

    Can anyone guide me on what to do? How do I fix this error?

    Thanks!
     
    yourarnav, Sep 25, 2013 IP
  2. webdev007

    #2
    There's nothing to fix on your site. You got the message because your site was down when Google tried to fetch the robots.txt file. Wait a day or two and everything will be back on track.
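    For reference, this is roughly how the HTTP status of the robots.txt fetch maps to Googlebot's documented behaviour (a sketch using only Python's standard library; the helper names are hypothetical, not any Google API):

```python
import urllib.error
import urllib.request

def robots_txt_status(url, timeout=10):
    """Fetch a robots.txt URL and return the HTTP status code a
    crawler would see (hypothetical helper for self-checking)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def googlebot_reaction(status):
    """Map an HTTP status code to Googlebot's documented reaction."""
    if status == 200:
        return "crawl normally, obeying the rules"
    if 400 <= status < 500:
        return "treat the site as having no robots.txt and crawl everything"
    # 5xx or a timeout is exactly the situation described above:
    # Googlebot postpones the crawl rather than risk fetching
    # pages the unreachable robots.txt might have blocked.
    return "postpone the crawl until robots.txt is reachable again"
```

    So a 200 (or even a 404) is fine; only server errors and downtime trigger the "crawl postponed" warning.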
     
    webdev007, Sep 25, 2013 IP
  3. yourarnav

    #3
    All righty, thanks for the help!
     
    yourarnav, Sep 25, 2013 IP
  4. yourarnav

    #4
    Well, after two days GWT now says the crawl was postponed because robots.txt was inaccessible, and I get the yellow exclamation sign in front of robots.txt in GWT.
     
    yourarnav, Sep 27, 2013 IP
  5. shakiba.proba

    #5
    Hi,

    Please put this in robots.txt before Google starts crawling:

    User-agent: *
    Disallow:

    When the crawl is complete, put the stricter rules back in robots.txt:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
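    As a sanity check, the stricter rule set can be parsed and queried locally with Python's standard-library `urllib.robotparser` (a sketch; the URLs are taken from the thread):

```python
from urllib.robotparser import RobotFileParser

# The WordPress rules suggested above, as robots.txt lines.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /wp-includes/",
]

parser = RobotFileParser()
parser.modified()  # mark the rules as freshly loaded so can_fetch() answers
parser.parse(rules)

# The homepage stays crawlable; the WordPress admin area does not.
print(parser.can_fetch("Googlebot", "http://micromkv.com/"))          # True
print(parser.can_fetch("Googlebot", "http://micromkv.com/wp-admin/")) # False
```

    This confirms the rules only block the WordPress system folders, not the content Google should index.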
     
    shakiba.proba, Oct 9, 2013 IP