Category: Errors for URLs in Sitemaps (0) | HTTP errors (0) | Not found (0) | URLs not followed (0) | URLs restricted by robots.txt (5)
I'm using Google Webmaster Tools and I have a problem there. The problem is "URLs restricted by robots.txt". Can anyone tell me how I can fix this? I'm looking for your kind solution. Thank you.
__________________
Nazir Hossain
Log into the webserver with an FTP program and download the robots.txt file - it's going to be in the HTML root. Open robots.txt with a program like Notepad and see what it says. Inside that file it will say something like:
User-agent: *
Disallow: /
The Disallow line is the path to the files or directories that you do not want search engines to index. With that example, "Disallow: /", Google would not index anything in the HTML root, which means your site would not get indexed at all.
If you did not make the robots.txt file, you may want to consult whoever made the website before you make changes. Sometimes you do not want search engines indexing certain pages; if the robots.txt file contains restrictions, they were probably put there for a reason.
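If you want to confirm which URLs are actually being blocked before you edit anything, one way (not from the original post, just a sketch) is Python's standard-library robots.txt parser. The domain and paths below are placeholders - swap in your own site and the URLs Webmaster Tools flagged:
from urllib.robotparser import RobotFileParser

# Point the parser at your site's robots.txt (example.com is a placeholder).
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Check a few URLs the same way a crawler identifying as Googlebot would.
for url in ["https://www.example.com/", "https://www.example.com/private/page.html"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked by robots.txt")
Any URL that prints "blocked" is being excluded by a Disallow rule in your robots.txt, so that is where to look first.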
The "restricted by robots.txt" problem is not a big one. Most of the time it is not related to your robots.txt file at all, but to links you have marked as "nofollow". In my experience it will disappear over time.