Robots.txt and Google crawl error

Discussion in 'robots.txt' started by Neil UK, Jul 10, 2010.

  1. #1
    Hi,

    I recently sorted out my robots.txt and I checked it with an online checker and it said everything was ok. Today I was checking in my Google webmaster tools and it shows "URL restricted by robots.txt". This was on a page that was previously showing as a 404 not found error. Is this ok? Does it mean my robots.txt is doing its job, or is it another error I need to fix?
    Thanks
     
    Last edited: Jul 10, 2010
    Neil UK, Jul 10, 2010 IP
  2. manish.chauhan

    manish.chauhan Well-Known Member

    Messages:
    1,682
    Likes Received:
    35
    Best Answers:
    0
    Trophy Points:
    110
    #2
    This is not an error; it just shows that you have restricted some URLs so that crawlers cannot access those pages.
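
    If you want to double-check which URLs your robots.txt actually blocks, here is a quick sketch using Python's standard urllib.robotparser. The rule and URLs below are made-up examples; substitute your own file and paths:

    ```python
    from urllib import robotparser

    # Hypothetical robots.txt contents - replace with your real rules
    rules = """
    User-agent: *
    Disallow: /old-page.html
    """

    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())

    # A disallowed path is reported as not fetchable
    print(rp.can_fetch("Googlebot", "https://example.com/old-page.html"))  # False
    # Anything not matched by a Disallow rule is fetchable
    print(rp.can_fetch("Googlebot", "https://example.com/index.html"))  # True
    ```

    This is the same logic well-behaved crawlers apply, so a False here matches the "URL restricted by robots.txt" message in Webmaster Tools.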
     
    manish.chauhan, Jul 11, 2010 IP
  3. Neil UK

    Neil UK Peon

    Messages:
    20
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Thanks manish.chauhan,

    I thought it might be ok but I was confused why it showed up as an error.
     
    Neil UK, Jul 12, 2010 IP
  4. manish.chauhan

    manish.chauhan Well-Known Member

    Messages:
    1,682
    Likes Received:
    35
    Best Answers:
    0
    Trophy Points:
    110
    #4
    I would not call it an error, but rather a notification that Google shows you about your website's status.
     
    manish.chauhan, Jul 12, 2010 IP
  5. mattstevens40

    mattstevens40 Peon

    Messages:
    2
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I was promoting my website and it was at the top for the "earth towne" keyword, but after Google recrawled the site it now shows a 404 Not Found error on 3 pages and a 401 error on one page.

    Only 4 of my pages are in Google's index now. Can anyone tell me how to promote my website so that Google indexes all my pages again with their earlier rankings?

    Please suggest something.

    Thanks
     
    mattstevens40, Jul 25, 2010 IP
  6. Neil UK

    Neil UK Peon

    Messages:
    20
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    I think the best thing you can do is block Google from indexing the 404 pages by adding them to your robots.txt. Once they are blocked, you can ask Google to remove the bad URLs using the Site Configuration > Crawler Access option in Webmaster Tools. If you add them to robots.txt, Google will remove them. I did the same thing, but it takes time for Google to recrawl your whole site.
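
    For reference, blocking a few dead URLs for all crawlers might look like this in robots.txt (the paths here are made-up examples; replace them with your own 404 URLs):

    ```
    User-agent: *
    Disallow: /old-page.html
    Disallow: /removed-section/
    ```

    Note that a Disallow line matches by prefix, so the trailing slash on /removed-section/ blocks everything under that directory.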

    Good luck
     
    Neil UK, Jul 25, 2010 IP
  7. mattstevens40

    mattstevens40 Peon

    Messages:
    2
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #7
    Thanks for the suggestion, buddy, but the main problem is that Google is showing a 404 error on my website's home page as well as other pages, so I can't block them in robots.txt.

    Do you have any other suggestions? If you can help, please tell me how I can fix this without losing my website.

    Thanks
     
    mattstevens40, Jul 27, 2010 IP