If you are getting the error messages (Network unreachable or robots.txt not reachable) in the Google Webmaster's tool center, then you should understand that there is a real problem on your server and not a glitch in the Googlebot system. I had this error come up and ignored it, assuming it was a glitch in Google's system, since Google mentions that possibility in their help section. Googlebot didn't crawl my site for more than 3 weeks, and this got me really worried as I was gradually losing SERPs as the days passed. So today I sat down to find out what the real issue was, and discovered that one of Google's IPs was denied access in the server firewall. Google was pinging my server from that IP for the robots.txt file, but since the IP was denied in iptables, Googlebot could not access robots.txt and I kept getting that message. So here is what one must do to iron this error out of the Webmaster tool center:

1. Dump your current firewall rules: iptables-save > /home/username/temp-ip-tables.txt
2. Check all the IP addresses in /home/username/temp-ip-tables.txt using arin.net/whois and see if any of them belong to Google.
3. If an IP belongs to Google, remove it from your firewall's deny list (the csf.deny file if you use CSF; the exact file varies depending on which firewall you run). A rough sketch of these steps follows below.

This was a nightmare for me, which I finally think is over, but who knows what's next. Hope this mini-guide helps others, as I didn't get an authoritative answer anywhere else.
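Here is a minimal shell sketch of those steps, assuming the whois command-line client is installed and you have root access; the dump path is just an example, not a fixed name:

    # Dump the current firewall rules (needs root).
    iptables-save > /tmp/temp-ip-tables.txt

    # Pull the IPs out of DROP/REJECT rules and whois each one;
    # flag any whose record mentions Google. whois output formats
    # vary between registries, so treat a match as a lead, not proof.
    grep -E 'DROP|REJECT' /tmp/temp-ip-tables.txt \
      | grep -oE '([0-9]{1,3}\.){3}[0-9]{1,3}' \
      | sort -u \
      | while read -r ip; do
          if whois "$ip" | grep -qi 'google'; then
              echo "Blocked IP that may belong to Google: $ip"
          fi
        done

If an IP turns up, remove it with your firewall's own tool, e.g. csf -dr <ip> followed by csf -r to reload on a CSF setup, or iptables -D INPUT -s <ip> -j DROP on plain iptables (adjust the chain and target to match the actual rule).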
Usually it is because the server blocks Googlebot from accessing your website. I was having this problem before, and I asked my hosting provider to disable the security module for my site, and it worked. Googlebot can now access my site without any problem. -cypher.
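If you want some evidence before contacting the host, one rough check is to fetch robots.txt with Googlebot's user-agent string from a machine outside the server (example.com is a placeholder for your domain):

    # Request robots.txt while identifying as Googlebot; a 403 or timeout
    # here, when a normal browser UA works, suggests user-agent filtering.
    curl -I -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" http://example.com/robots.txt

Note this only catches user-agent based blocking; an IP-level firewall block like the one in ketan9's post won't show up, since your test machine isn't on Google's IPs. For that case, the Fetch as Googlebot feature in Webmaster Tools is the more reliable check.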
I had the same problem and asked my host to look into Googlebot access on the server, but they didn't believe me and thought it was my sites. After getting nowhere for 2 weeks, I switched hosts, got a dedicated IP, and bam, Google is now back crawling my sites. The new host understood the problem I'd had when I left the old host. Sucks losing about 2 weeks of Googlebot crawling. ketan9, thanks for posting the guide. A lot of people need this solution.
Some providers do not know that they caused the problem and blame us as the users. BTW, in case you guys want to know, my site was not crawled by Google for about two months, but after I told them to remove the security thing, my site came back into Google. -cypher.