I have a blog, and when I logged in to Webmaster Tools to work on increasing my traffic, I saw crawl errors that say "Restricted by robots.txt". Please let me know what to do about it.
robots.txt is a plain text file at the root of your site that tells search engine crawlers which parts of your website not to crawl... It is basically meant for pages that you don't want to get indexed or that have some personal stuff on them... Check your blog's robots.txt file (at yourdomain.com/robots.txt) and remove the Disallow rules covering the pages you do want crawled and hence indexed... Best Wishes...
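To give a rough idea (yourblog.com below is only a placeholder for your own address), the "Restricted by robots.txt" error usually comes from a Disallow rule in that file, and a file that lets everything be crawled can be as short as this:

# A rule like this is what blocks crawlers and causes "Restricted by robots.txt":
# User-agent: *
# Disallow: /

# To let everything be crawled, an empty Disallow is enough:
User-agent: *
Disallow:

# Optionally, tell crawlers where your sitemap is:
Sitemap: http://yourblog.com/sitemap.xml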
This might help you: http://www.mwusers.com/forums/showthread.php?9494-Problem-with-sitemap-and-robots.txt
Maybe it is because you do not have one. A wrongly formatted robots.txt (for example, an HTML page returned with HTTP 200 instead of a plain text file) can make your site unindexable.
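If you want to check this yourself, here is a rough sketch (not an official tool, and yourblog.com is just a placeholder) using Python's standard library to see what your server actually returns for robots.txt and whether crawlers are allowed to fetch your homepage:

import urllib.request
import urllib.robotparser

# "yourblog.com" is a placeholder; replace it with your real blog address.
ROBOTS_URL = "http://yourblog.com/robots.txt"

# See what the server returns for robots.txt:
# it should be a plain text file (text/plain), not an HTML page.
with urllib.request.urlopen(ROBOTS_URL) as response:
    print(response.status, response.headers.get("Content-Type"))

# Ask Python's robots.txt parser whether a generic crawler may fetch the homepage.
parser = urllib.robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()
print("Homepage crawlable:", parser.can_fetch("*", "http://yourblog.com/"))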