I don't know why, but every time I submit my sitemap it comes back with the error "Network unreachable: robots.txt unreachable": "We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit." Please tell me what I should do. My robots.txt allows robots, and the sitemap itself works fine, so I don't know why this is happening. One more thing: my site is in a subdirectory. Could that be the cause of the problem? If the sitemap is in a subdirectory, what should I do? Let me know. Thanks, Sam
Make sure the sitemap file actually exists on your server: type the URL of your sitemap into the browser, even if it is in a subdirectory, and submit the full path. It may also be that your .htaccess file is blocking the crawler. If you can access both your sitemap and your robots.txt by typing their paths into the browser's address bar, then I can't say exactly what the problem is... sometimes Google itself is at fault.
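To check both files the way the advice above suggests, but programmatically rather than in a browser, something like this sketch works. It only uses Python's standard library; the example URLs are placeholders, so substitute your own domain and paths:

```python
# Sketch: fetch a URL the way a crawler would and report the HTTP status,
# so you can confirm robots.txt and the sitemap are reachable from outside.
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code for url, or None if the host is unreachable."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as e:
        return e.code   # the server answered, but with an error status (404, 403, ...)
    except (urllib.error.URLError, OSError):
        return None     # DNS failure, refused connection, timeout, ...

# Usage (placeholder URLs -- replace with your site):
#   fetch_status("http://www.example.com/robots.txt")
#   fetch_status("http://www.example.com/subdir/sitemap.xml")
# A 200 means reachable; None means "network unreachable" from this machine.
```

A 200 for both URLs only proves they are reachable from where you ran the script; Googlebot fetches from its own network, so a server-side block (firewall, .htaccess deny rules, rate limiting) can still make them unreachable to Google alone.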
I am getting the same error, and so far I haven't found any way to resolve it. Let me know if you get it resolved, please.
I'm running into the same problem, and I don't think my .htaccess blocks my robots.txt. Any help? -cypher
I had the same problem: "Network unreachable: robots.txt unreachable". After five tests I finally found a solution. I prepared my sitemap for Google and Yahoo with GSiteCrawler, then generated my robots.txt with GSiteCrawler as well and added the sitemap location to it. Here is my robots.txt as an example:

# robots.txt file for http://www.hikenow.net/
# Generated by SOFTplus GSiteCrawler v1.23 rev. 286 / http://gsitecrawler.com/
# 20.2.2008 08:58
User-agent: *
# No exclusions found, disallow nothing.
Disallow:
Sitemap: http://www.hikenow.net/HikeNowSiteMap.xml
# end of file

I put both files in my www root directory, and that was all. From http://www.hikenow.net
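If you want to double-check that a robots.txt like the one above really allows every crawler before resubmitting, Python's standard-library parser can do it offline. This is just a sanity-check sketch; the file contents and URLs below are illustrative, not a live site:

```python
# Parse a robots.txt with the stdlib parser and confirm nothing is blocked.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty "Disallow:" line blocks nothing, so any path should be fetchable
# by any user agent, including Googlebot.
print(parser.can_fetch("Googlebot", "http://www.example.com/any/page.html"))  # -> True
```

Note that this only validates the file's rules; it can't tell you whether Google's crawler can actually reach the file over the network, which is what the "Network unreachable" error is about.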
I know this is an old thread, but I'm getting the same 'network unreachable' error on several of my sites that are hosted on the same account, even though the robots.txt on every one of them returns a 200 (success) status. Should I just assume it's a Google issue and wait for it to work itself out, or is there something else I should check?