Hi Google experts, what do the Crawl errors in Google's Webmaster Tools mean? I got errors including: Not Found, Restricted by robots.txt, Unreachable, and another one in Sitemaps with a 502 error. How can I solve these kinds of errors? Thanks in advance~~~
Not Found - the page referenced in your sitemap does not exist.
Restricted by robots.txt - your robots.txt does not allow Googlebot to access a page mentioned in your sitemap.
Unreachable - the server timed out while Google tried to access a page mentioned in your sitemap.
502 - no idea.
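Just to illustrate the robots.txt case: a file like this would block Googlebot from a whole directory (the /private/ path is made up, look at what your own file actually blocks):

    User-agent: Googlebot
    Disallow: /private/

Any URL under /private/ that you also list in your sitemap will then show up as "Restricted by robots.txt".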
Thank you atxsurf. You are so kind. Do you know how to solve these problems? Or will these problems disappear some time later, just like JDmag said? Btw, thank you JDmag.
Not Found - check why the non-existent page is referenced somewhere on your site (you should be able to see where it is linked from in the error details), then fix that broken link or whatever is pointing to it. If you'd rather check yourself, see the sketch below.
Restricted by robots.txt - often not a problem, if you have disabled Googlebot's access to those pages on purpose.
Unreachable - usually intermittent hosting problems; they disappear by themselves.
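Here is a rough Python sketch that fetches a sitemap and reports which URLs don't return 200, so you can find the dead pages yourself. The sitemap URL is a placeholder, and it assumes a standard XML sitemap:

    import urllib.request
    import urllib.error
    import xml.etree.ElementTree as ET

    SITEMAP = "http://www.example.com/sitemap.xml"  # made-up URL, use your own sitemap
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}  # standard sitemap namespace

    # Download and parse the sitemap XML.
    with urllib.request.urlopen(SITEMAP, timeout=10) as resp:
        tree = ET.parse(resp)

    # Every <loc> element in the sitemap holds one page URL; try each one.
    for loc in tree.findall(".//sm:loc", NS):
        url = loc.text.strip()
        try:
            with urllib.request.urlopen(url, timeout=10) as page:
                status = page.getcode()
        except urllib.error.HTTPError as err:
            status = err.code          # 404 Not Found, 502, etc. raise HTTPError
        except urllib.error.URLError:
            status = "unreachable"     # DNS failure, timeout, connection refused
        if status != 200:
            print(url, "->", status)

Anything it prints is a URL you should either fix or remove from the sitemap.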
I found the unreachable and 502 errors were gone, but the Not Found pages have increased. How can I check the error details and fix the broken links? (Please forgive my stupid questions.)
Hi,
Not Found - the file no longer exists on your server, but it may have existed before, when the search engine crawled that URL.
Restricted by robots.txt - your administrator did not allow this URL to be crawled.
Unreachable - the server took longer than the maximum time allowed to serve this URL, so it could not be reached.
For the Not Found errors you can use a 301 redirect; it is a useful way to avoid 404 Not Found errors.
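For example, on an Apache server you can add a line like this to the .htaccess file in your site root (both paths here are made up, use your real old and new URLs):

    Redirect 301 /old-page.html http://www.example.com/new-page.html

Googlebot will follow the redirect and, over time, replace the old URL with the new one in the index instead of reporting Not Found.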
It means, as others have said, that you've got a robots.txt file in your root folder which is blocking access to search engine crawlers. Simply delete it, and the next time Google crawls your website, it will index it.
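Or, instead of deleting the file, you can edit robots.txt so it allows everything:

    User-agent: *
    Disallow:

An empty Disallow line means nothing is blocked, which keeps the file around in case you want to restrict something later.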