Dear all! Last week, my website was down for about two days due to a problem with my web hosting server. Since I transferred it to another server, it has been back up for the last three days, but in Webmaster Tools I received the following error: "Google couldn't crawl your site because we were unable to access the robots.txt file." Now, what should I do? Any idea or suggestion would be welcome. Regards
Delete and resubmit your robots.txt. And yes, it may take time for Google to recrawl your site, but eventually it will.
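If you need to recreate the file from scratch, here is a minimal sketch of a permissive robots.txt (the sitemap URL is a placeholder, swap in your own domain):

    # Allow every crawler to access the whole site.
    # An empty Disallow value means nothing is blocked.
    User-agent: *
    Disallow:
    Sitemap: https://example.com/sitemap.xml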
You don't need to modify your code, and you don't need to resubmit either. Just keep updating your site and ping all the search engines, and if you have a feed, submit it everywhere you can. If you want to see a well-formed robots.txt file, check this one: http://seriousdatings.com/robots.txt
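Before resubmitting anything, it's worth confirming the new server actually serves the file at all. Here's a quick sketch using Python's standard library (the example.com URL is a placeholder for your own domain):

    import urllib.request
    import urllib.robotparser

    # Placeholder URL -- replace example.com with your own domain.
    url = "https://example.com/robots.txt"

    # Step 1: confirm the new server returns the file with HTTP 200.
    with urllib.request.urlopen(url) as resp:
        print("HTTP status:", resp.status)

    # Step 2: confirm it parses and that Googlebot may crawl the homepage.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(url)
    rp.read()
    print("Googlebot allowed on /:", rp.can_fetch("Googlebot", "https://example.com/"))

If the first check fails or times out, the problem is with the new server's configuration, not with Google, and fixing that should clear the crawl error on its own.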