Hello friends… I just received an email titled "Googlebot can't access your site." It says: "Over the last 24 hours, Googlebot encountered 1 error while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 50.0%." How do I fix this error? Here is the website - http://ignoumswcourse.blogspot.com/
Put only the code below in your robots.txt file:
User-agent: *
Disallow:
Your sitemap URL is not valid, so Googlebot shows an error when it tries to crawl your pages through the sitemap. Create a proper sitemap first, and then add its URL to the robots.txt file.
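For reference, once the sitemap is working, the robots.txt with the sitemap line added could look like this. This is a minimal sketch; I'm assuming Blogger's default sitemap location at /sitemap.xml for your blog, so double-check that the URL actually loads before adding it:
User-agent: *
Disallow:
Sitemap: http://ignoumswcourse.blogspot.com/sitemap.xml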
Create a new sitemap for your website and add it to Webmaster Tools. The robots.txt should look like this:
User-agent: *
Disallow:
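After you update the file, you can quickly check that it parses and that Googlebot is allowed through. Here is a small sketch using Python's standard library; the only site-specific parts are your blog's URLs, everything else is stock urllib:
import urllib.robotparser

# Point the parser at the live robots.txt file
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://ignoumswcourse.blogspot.com/robots.txt")
rp.read()  # fetches and parses the file over HTTP

# True means Googlebot is permitted to crawl the homepage
print(rp.can_fetch("Googlebot", "http://ignoumswcourse.blogspot.com/"))
If this prints True and the robots.txt URL loads without errors in your browser, the "Googlebot can't access your site" warning should clear on the next crawl.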