Logged into Sitemaps this afternoon to see a bunch of URLs no longer crawlable by Googlebot. The report reads: "Below are URLs we tried to crawl (found either through links from your Sitemaps file or from other pages) that we didn't crawl because they are listed in your robots.txt file. You may have specifically set up a robots.txt file to prevent us from crawling this URL...<snip>" (my bold). The thing is, my robots.txt doesn't list any of the files that are no longer being crawled. Indeed, when I test the forbidden URLs against the robots.txt file with the tool supplied by Google, they pass without problem! Simply bizarre Google behaviour? Anyone else having problems?
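For anyone who wants a second opinion outside Google's own tool, here's a minimal sketch that checks URLs against a live robots.txt using Python's standard urllib.robotparser; the example.com addresses and the URL list are just placeholders for your own site, and "Googlebot" is the user-agent string being tested.

    # Sanity-check robots.txt rules locally with Python's standard library.
    # The site and URLs below are placeholders -- swap in your own.
    import urllib.robotparser

    robots_url = "http://www.example.com/robots.txt"
    test_urls = [
        "http://www.example.com/some/page.html",
        "http://www.example.com/another/page.html",
    ]

    parser = urllib.robotparser.RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # fetches and parses the live robots.txt

    for url in test_urls:
        allowed = parser.can_fetch("Googlebot", url)
        print(url, "->", "allowed" if allowed else "blocked")

If it reports everything as allowed, at least you know the live robots.txt isn't the culprit and the Sitemaps report is likely showing stale data.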
I've just noticed the exact same problem with my account today! I can't understand why this is happening, because my robots.txt file doesn't block any of those files from being crawled either. I wonder what's going wrong! :-(
Hmmm, I've found this post on the official Google Sitemaps blog that says such 'errors' have now been resolved... I guess they have, except for mine :-(( http://sitemaps.blogspot.com/2006/04/updated-robotstxt-status.html
Yeah, I saw that post too -- the errors are still showing on mine, though I assume it's just until Googlebot tries again.