Hi, I was just looking for sitemap conversations. I have the same kind of sitemap problem. I've created my sitemap many times, but Google can't read it and says it isn't a valid sitemap file. I gave up several times and tried again and again, and Google still doesn't accept it. Can you please help me? My site URL is alparcade.com. Thanks in advance...
VALIDATE your robots.txt with http://tool.motoricerca.info/robots-checker.phtml and study robots.txt at http://www.robotstxt.org/ — then see my post "How to create your sitemapindex.xml for your robots.txt".
Hey hawksrnm1,

Looks like you're having the same issue as Sensaay (different post)... it's an easy mistake to make, because the robots.txt format is kinda dumb IMHO. I would strongly recommend people NOT use the Allow directive in their robots.txt, because AFAIK only Google supports it — which means you are implicitly blocking spiders that don't support Allow from crawling your site. Sure, Google is 70% of the search-engine space, but why ignore the rest?

Here is what I would do: remove the "Allow: /sitemap.xml" line and change "Allow: /" to "Disallow:", so your robots.txt would look like:

User-Agent: *
Disallow:

Here is the post I left in Sensaay's thread: Added sitemaps now show ERROR

Hope it helps,
PD
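To see why an empty Disallow is safe, here's a minimal sketch using Python's standard-library robots.txt parser. The example.com URLs are just placeholders; an empty "Disallow:" value means "block nothing", so every path stays crawlable:

```python
import urllib.robotparser

# PhilD's suggested robots.txt: allow everything for every spider.
ROBOTS_TXT = """User-Agent: *
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# An empty Disallow value blocks nothing, so any path is fetchable.
print(rp.can_fetch("*", "http://example.com/"))             # True
print(rp.can_fetch("*", "http://example.com/sitemap.xml"))  # True
```

This uses only the original 1994 robots.txt vocabulary (User-Agent, Disallow), so it behaves the same for spiders that never implemented the Allow extension.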
I see that you never took the time to validate your robots.txt, nor to correct the existing errors. Hence it appears that your robots.txt is of zero importance to you, and the topic may well be considered closed.
I had the exact same problem: no robots.txt file, and I was getting the same error. I created the file and tried PhilD's code, but it didn't work:

User-Agent: *
Disallow:

So I used the OP's code instead, and it worked:

User-Agent: *
Allow: /
Allow: /sitemap.xml

I'm unsure whether this is blocking crawlers as Phil stated... I'll have to look into this in more depth.
The documentation at http://www.robotstxt.org/ is actually so simple that it's hard to believe some people still haven't read it carefully. Just a few minutes to read and apply it, and all these problems are gone forever!
What if there is no robots.txt file on your site at all? I'm having this very same problem, but I have no robots.txt file and never have. Does this mean that in order for Google to be able to read a sitemap file, there MUST be a robots.txt file?
Apparently there must be a robots.txt file. I just added one to my site and re-submitted my sitemap; the errors are gone and the status is OK. I have no files or directories that I don't want Google to see, so my robots.txt file looks like this:

User-Agent: *
Allow: /

Hope this helps someone.
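For anyone unsure whether the "Allow: /" form blocks anything, here's a small sketch with Python's standard-library parser (which does understand the Allow extension). The example.com URLs are placeholders; it also shows a "Sitemap:" line, which is one common way to point crawlers at your sitemap from robots.txt:

```python
import urllib.robotparser

# The "allow everything" robots.txt from this thread, plus a
# Sitemap directive (placeholder URL, substitute your own).
ROBOTS_TXT = """User-Agent: *
Allow: /

Sitemap: http://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Pages are crawlable for parsers that support Allow...
print(rp.can_fetch("Googlebot", "http://example.com/page.html"))  # True

# ...and the declared sitemap URL can be read back (Python 3.8+).
print(rp.site_maps())  # ['http://example.com/sitemap.xml']
```

Note this only tells you how Allow-aware parsers behave; as PhilD pointed out earlier in the thread, crawlers that never implemented Allow may treat the file differently.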