Will robots.txt remove my crawl errors?

Discussion in 'robots.txt' started by LanceT, Aug 9, 2011.

  #1
    Hey guys,

    I am getting a bunch of crawl errors in Google Webmaster Tools. Will blocking those URLs/directories in my robots.txt file remove the crawl errors from Google?
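    For reference, the kind of blocking I mean would be something like this in robots.txt (the paths below are just examples, not my real directories):

        User-agent: *
        Disallow: /broken-directory/
        Disallow: /old-page.html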

    Thank you!
     
    LanceT, Aug 9, 2011
  #2
    I don't think it will. The robots.txt file is only used to stop search engine crawlers from fetching those pages; it won't clear errors that have already been reported. You should set up proper 404 error pages instead. Once the crawlers find the 404s, the pages will be de-indexed from the search engine after some time.
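    A quick way to check what one of those URLs actually returns is to look at the response headers with curl (the URL below is just a placeholder):

        curl -I http://www.example.com/deleted-page/

    If the error page is set up correctly, the response should start with something like:

        HTTP/1.1 404 Not Found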
     
    Chuman, Aug 14, 2011