Hey guys, I am getting a bunch of crawl errors in Google Webmaster Tools. I'm wondering: will blocking those URLs/directories in my robots.txt file remove the crawl errors from Google? Thank you!
I don't think it will. The robots.txt file just tells search engine crawlers not to fetch those URLs; it doesn't clear errors that have already been reported. Instead, make sure those broken URLs return a proper 404 page. When the crawlers keep hitting 404s, the pages get de-indexed from the search engine after some time.
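For reference, blocking a directory in robots.txt looks something like this (I'm using /old-stuff/ purely as a placeholder for whatever paths are throwing the errors):

    User-agent: *
    Disallow: /old-stuff/

That only stops Googlebot from crawling those paths going forward; the URLs it has already discovered can still sit in the crawl error report, which is why letting them return a real 404 is the better fix.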