I simply cannot understand why Google is unable to access my robots.txt file. Here it is: http://talkhobby.com/robots.txt I am getting the following error message in Webmaster Tools: "Crawl postponed because robots.txt was inaccessible." But when I try the "Fetch as Google" option in Webmaster Tools, I sometimes get a success and sometimes get a "temporarily unreachable" message. My hosting provider says everything is perfectly fine. Help, please!
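For reference, since the intermittent "temporarily unreachable" result usually points to the server (timeouts, rate limiting, or a firewall blocking Googlebot) rather than to the file's contents, it can still help to rule out a malformed file. A minimal, valid robots.txt that allows all crawlers looks like the sketch below; the actual contents of the talkhobby.com file are not shown in the question, so this is only an illustration:

```
# Minimal robots.txt: allow every crawler to access the whole site.
# Must be served as plain text at the site root, e.g. http://example.com/robots.txt
User-agent: *
Disallow:
```

If this file is reachable in a browser but Googlebot still reports it as inaccessible, the problem is almost certainly intermittent server availability or something selectively blocking Googlebot's requests, which is worth raising with the host despite their assurances.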
We had the same problem with one of our clients. For us, the solution was to delete the robots.txt file and then use a robots.txt plugin to generate it instead; that worked for us. Give it a try.