Hi, I recently sorted out my robots.txt and checked it with an online checker, which said everything was OK. Today I was looking in Google Webmaster Tools and it shows "URL restricted by robots.txt" on a page that was previously showing a 404 Not Found error. Is this OK? Does it mean my robots.txt is doing its job, or is it another error I need to fix? Thanks
This is not an error; it shows that you have restricted a few URLs from crawlers so that they cannot access those pages.
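If you want to double-check which URLs your robots.txt actually blocks, here is a minimal sketch using Python's standard-library urllib.robotparser (the domain and path below are placeholders; substitute your own site):

    from urllib import robotparser

    # Point the parser at your site's robots.txt (placeholder URL)
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()  # fetch and parse the file

    # Ask whether Googlebot may fetch a given path (placeholder path);
    # prints False if the path is Disallowed for that user agent
    print(rp.can_fetch("Googlebot", "https://www.example.com/blocked-page.html"))

If this prints False for the page showing "URL restricted by robots.txt", then your file is working exactly as intended.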
I was promoting my website and it was at the top for the "earth towne" keyword, but after Google crawled my website it shows a 404 Not Found error on 3 pages and a 401 error on one page, and only 4 pages are in the Google index. Can anyone tell me how to promote my website so that Google indexes all my pages with their earlier ranking? Please suggest. Thanks
I think the best thing you can do is block Google from crawling the 404 pages by adding them to your robots.txt. After you have blocked them, you can ask Google to remove the bad URLs using the Site configuration > Crawler access option in Webmaster Tools. Once they are in robots.txt, Google will remove them. I did the same thing, but it takes time for Google to recrawl your whole site. Good luck
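For reference, blocking specific paths looks like this in robots.txt (the file paths here are placeholders for whichever URLs are returning 404 on your site):

    User-agent: *
    Disallow: /old-page-1.html
    Disallow: /old-page-2.html

Keep in mind the Disallow rules stop crawling of those paths; the removal request in Webmaster Tools is the step that asks Google to drop them from the index.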
Thanks buddy for the suggestion, but the main problem is that Google shows the 404 error on my website's home page as well as other pages, so I can't do that now. Do you have any other suggestion? If you can help me, please tell me how I can fix this without losing my website. Thanks