Hello fellow members. A few days ago my website micromkv(dot)com went down for about 8 hours; the data center blocked it because I forgot to resolve an abuse complaint, but it came back online later in the day. That same day I got a message in Google Webmaster Tools: 146 failed attempts to crawl your site. They also said that once the site came back online, I should fetch my robots.txt with Fetch as Google to let them know the site was working again.

Well, I fetched the robots.txt twice, and it's been about a week now, but it keeps showing "Google couldn't crawl your site because we were unable to access the robots.txt file". Here is my robots.txt file: http://micromkv.com/robots.txt

All help will be appreciated! My website is nowhere to be found in Google, even though it was fine there before the data center block. Screenshot: http://prntscr.com/1urtsa
I was able to load your robots.txt file fine, and it was formatted correctly. When you try to fetch it now, do you still get an error? The site is indexed, so I'm not sure what you mean by it being nowhere to be found in Google...
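If you want to double-check from outside Webmaster Tools, here is a quick Python sketch that fetches the file and prints the HTTP status (the URL is taken from your post; note this only shows the file is publicly reachable, it doesn't test Googlebot specifically):

import urllib.request

# Request the robots.txt and print the HTTP status plus the body.
# A 200 response means the file is publicly reachable; Googlebot
# could still be blocked separately (user-agent or IP filtering).
url = "http://micromkv.com/robots.txt"
with urllib.request.urlopen(url, timeout=10) as resp:
    print(resp.status, resp.reason)
    print(resp.read().decode("utf-8", errors="replace"))

If that prints 200 OK, the server side is fine and the errors in Webmaster Tools are likely just stale.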
That format is correct. If you're still having problems with it, you could try a standard robots.txt like the one below, and change the sitemap address if yours is different. Thanks.
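Just as an illustration (the Sitemap URL here is a placeholder, not your actual sitemap), a minimal robots.txt that allows all crawlers could look something like this:

User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml

The empty Disallow line means nothing is blocked; the Sitemap line just tells crawlers where to find your sitemap.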
That screen shows historical errors, not live ones. This is very typical of Google Webmaster Tools; the data can often be "old". As long as Google can reach your site now, don't worry. Also, you can ignore the advice above; it isn't correct for a WordPress site like yours.