Hello people, yesterday my site went down for 7-8 hours and today I got this message from Google: "Over the last 24 hours, Googlebot encountered 146 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 9.6%." Here's my robots.txt - micromkv.com/robots.txt. Can anyone guide me on what to do? How do I fix this error? Thanks!
There's nothing to fix on your site. You got the message because your site was down when Googlebot tried to fetch the robots.txt file. Wait a day or two and everything will be back on track.
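If you want to confirm the file is reachable again yourself rather than waiting for GWT to update, here's a minimal sketch in Python (the micromkv.com URL comes from your post; swap in whatever domain applies):

import urllib.request

# Googlebot postpones the crawl when robots.txt returns errors or
# times out, so the main thing to verify is a clean 200 response.
url = "https://micromkv.com/robots.txt"
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print("Status:", resp.status)          # you want 200 here
        print(resp.read().decode("utf-8"))     # the rules Googlebot sees
except Exception as exc:
    print("robots.txt still unreachable:", exc)

If this prints a 200 and your rules, the errors in GWT should clear on their own over the next crawl cycle.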
Well, it's been two days and GWT still says the crawl was postponed because robots.txt was inaccessible, and I still get the yellow exclamation mark next to robots.txt in GWT.
Hi, before Google starts crawling, put this in your robots.txt so everything is allowed:

User-agent: *
Disallow:

Once the crawl has completed, put your normal rules back into robots.txt:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
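If you want to sanity-check that the restored rules behave the way you expect, Python's built-in robotparser can read them. A small sketch (the /wp-admin/ and /some-post/ paths are just illustrations, not real pages on your site):

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://micromkv.com/robots.txt")
rp.read()

# With the WordPress rules above, admin paths should be blocked for
# every crawler, Googlebot included, while normal pages stay allowed.
print(rp.can_fetch("Googlebot", "https://micromkv.com/wp-admin/"))   # expect False
print(rp.can_fetch("Googlebot", "https://micromkv.com/some-post/"))  # expect True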