When I first created my sitemap, I submitted it to Google, and it reported many crawl errors. So I revised the sitemap and resubmitted it. However, there are still many errors: "Sitemaps 3; Not found 94; Restricted by robots.txt 3; Unreachable 3..." I can't believe it, because I submitted only 70 URLs and it said 67 of them were indexed. So why are there still so many errors? Weird. I use a third-party tool, the Joomla AceSEF component, to create the sitemap, but I don't think it's a software problem. By the way, the keyword analysis in Google Webmaster Tools doesn't seem correct either: it lists many irrelevant keywords that don't exist anywhere on my pages, such as "syz" and "fixed"... Have you ever run into problems like these? Thanks for your advice.
All those "Not found" errors are not necessarily from URLs in your XML sitemap. They can come from other sites linking to you.
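One way to check this is to compare the URLs Google reports as "Not found" against the URLs actually listed in your sitemap file; anything not in the sitemap most likely came from an external link. Below is a minimal sketch in Python, assuming a standard sitemaps.org-format XML file (the sample URLs are just placeholders):

```python
import xml.etree.ElementTree as ET

# Sitemap files use the sitemaps.org namespace on every element.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_sitemap_urls(xml_text):
    """Return the list of <loc> URLs from a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def not_in_sitemap(error_urls, xml_text):
    """Return the reported error URLs that do NOT appear in the sitemap."""
    sitemap_urls = set(extract_sitemap_urls(xml_text))
    return [url for url in error_urls if url not in sitemap_urls]

if __name__ == "__main__":
    # Hypothetical sitemap content and error list for illustration.
    sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
    errors = ["https://example.com/about", "https://example.com/old-page"]
    print(not_in_sitemap(errors, sample))
```

Any URL this prints was reported as an error but never submitted by you, which points to an external (or stale) link rather than a sitemap problem.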
Thanks ThomasSchulz. After Google Webmaster Tools updated, the crawl errors decreased considerably; the current "Not Found" count is 16. However, those "Not Found" errors were caused by my previous incorrect URLs, which shouldn't exist anymore. Weird. Maybe the Google Webmaster Tools update schedule isn't synchronized with the search engine's update schedule. Generally speaking, Google Webmaster Tools seems to update every 3 days, while the Google search index seems to update every 3 months. Is that right?
I agree with Thomas Schulz. Sometimes the errors come from external links: the link could be broken, or the page it points to may no longer exist.