Hi all, I'm working on a site, and in Webmaster Tools I'm trying to add a sitemap. I get the error: "Googlebot is blocked from http://www.adgrenadespeedppcreviews.com/", even though there is no robots.txt file there anymore; I removed it two days ago. When I click on "Analyze robots.txt", it still shows the robots.txt file from two days ago. I have other sites on this server without problems. Any ideas how to fix it? Thanks, Mike
Yes, Google Webmaster Tools is, first of all, not very intuitive. You also have to understand that most of the information in there about your site is gathered little by little by Google's spiders as they visit your site, and this is very slow. In the same way, information that is removed from your site takes time to show up in Webmaster Tools. Check whether the last download time is after your removal. Regarding the robots information: did you already remove the robots.txt from your site? Did you also remove the google**************.html verification file, and the corresponding meta tag?
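For reference, this is roughly how a robots.txt rule ends up blocking Googlebot. The sketch below uses Python's standard-library `urllib.robotparser` on a hypothetical robots.txt (the domain and rules are made up for illustration, not taken from the actual site): a `Disallow: /` under `User-agent: *` denies Googlebot the whole site, and Google keeps applying its cached copy of those rules until it re-fetches the file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks every crawler from the whole site.
blocking_rules = """\
User-agent: *
Disallow: /
"""

blocked = RobotFileParser()
blocked.parse(blocking_rules.splitlines())
# Under these rules, Googlebot may not fetch the homepage.
print(blocked.can_fetch("Googlebot", "http://www.example.com/"))

# With no robots.txt rules at all, everything is allowed.
open_site = RobotFileParser()
open_site.parse([])
print(open_site.can_fetch("Googlebot", "http://www.example.com/"))
```

So removing the file on the server flips the second case for real visitors immediately, but Webmaster Tools keeps evaluating the first case until Google downloads a fresh copy.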
Yeah, I removed everything I thought would cause the problem. What finally solved it was "forcing" Googlebot to come take another peek at my site. Logical, really: nothing changes until Googlebot comes back and re-indexes the site without the robots.txt. Until then it just uses its cached copy. Thanks anyway.
You must wait until Google updates the robots.txt file in WMT before submitting the sitemap again. Slow down on your WMT actions; Google is not real-time, and they will get to it when they want to. Currently Google is very slow for some reason, but getting impatient with them will just cause errors that lead to longer delays for you.