Hello. Do you have a robots.txt file on your server? It is used to give instructions to search-engine spiders. For instance, you can tell spiders not to crawl certain directories (e.g. an image directory: Disallow: /images), or indicate where your sitemap is located (e.g. Sitemap: http://www.yoursite.com/sitemap.xml). If this file is missing, Google Webmaster Tools will alert you.
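A minimal robots.txt along those lines might look like this (yoursite.com is just a placeholder, use your own domain):

    User-agent: *
    Disallow: /images/
    Sitemap: http://www.yoursite.com/sitemap.xml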
If you have robots.txt in the root of your domain, make sure that you can actually access it at http://yoursite/robots.txt.
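As a quick sanity check you can also fetch the file yourself. Here is a small Python sketch using only the standard library; the URL is a placeholder, so swap in your own domain:

    import urllib.request

    # Placeholder URL: replace with your own domain's robots.txt
    url = "http://www.yoursite.com/robots.txt"
    try:
        with urllib.request.urlopen(url) as response:
            print(response.status)           # should print 200
            print(response.read().decode())  # should show your robots.txt rules
    except Exception as err:
        print("robots.txt not reachable:", err)

If this prints anything other than 200 and your rules, the file is not reachable from outside, which is what Webmaster Tools is complaining about.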
Check your robots.txt at yourdomain/robots.txt. If it loads, check the spelling of the filename reported in Webmaster Tools; if that looks fine, copy the robots.txt file to your server's root folder again.
You can always include a robots meta tag in the HTML of your pages; that tells search engines how to treat each page.
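For example, a robots meta tag placed in the <head> of a page looks like this (adjust index/follow to whatever you actually want):

    <meta name="robots" content="index, follow">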
I have exactly the same problem with one of my sites, http://articles.topaix.com. I have more than 1,000 pages showing "robots.txt unreachable". I have tried checking the robots.txt file, did some tweaking, and waited for about a month, but nothing has changed. If you have any suggestions, please help me resolve this issue.
Are you sure your robots.txt file is in the root directory of your site? If it is, try using Google Webmaster Tools to generate a new robots.txt file, put it in the root directory of your site to replace the original, and try again.