I set up a 404 page and redirected some error pages a long time ago, but Google Webmaster Tools still shows a lot of crawl errors. Can anyone tell me some possible reasons? I would be very grateful!
First, check the number of pages that link to the page that returns a 404, and see if you can fix the links there.

Does your site use URL parameters, e.g. ?page=123? If it does, I have noticed Google randomly checking for other parameters (that might not exist) and randomly adding numbers. For example, if you have 1000 pages, Google will try to find the maximum by trying ?page=5000; when that 404s, it tries another number. It will also try ?product=123, etc. There is nothing you can do about these. If there are 0 referring pages, they will eventually disappear. You could also try to remove them from the Google index.
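One clean way to handle those out-of-range parameter guesses (like ?page=5000) is to validate the parameter yourself and serve a proper 404 rather than an error or an empty page. A minimal sketch; the function name `page_status` and the `MAX_PAGE` value are made up for illustration:

```python
MAX_PAGE = 1000  # hypothetical total number of pages on the site

def page_status(page_param: str) -> int:
    """Return the HTTP status code to serve for a ?page=N request."""
    try:
        page = int(page_param)
    except ValueError:
        return 404  # non-numeric guess, e.g. ?page=abc
    if 1 <= page <= MAX_PAGE:
        return 200  # a real page
    return 404      # out-of-range guess like ?page=5000

print(page_status("123"))   # → 200
print(page_status("5000"))  # → 404
```

Serving a real 404 status (not a "soft 404" page with a 200 status) is what lets Google drop those made-up URLs from its crawl queue.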
1) It takes a long time for Google Webmaster Central to clear crawl errors, particularly if you have a small site that isn't crawled often.

2) You should consider redirecting those URLs instead of just letting visitors hit a 404; look up 301 (permanent) redirects in Google. A 301 tells Google the page has moved and passes its link value along, which is much better SEO practice than letting Google think you're deleting/removing pages. Good luck
I don't know how to define whether a site is small or large; mine has 4000 pages. But I find the spiders don't crawl my site very often. I will submit the sitemap to Google Webmaster Tools immediately.
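For reference, a sitemap is just an XML file listing the URLs you want crawled, following the sitemaps.org protocol. A minimal sketch with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/products?page=123</loc>
  </url>
</urlset>
```

With 4000 pages a single sitemap file is fine (the protocol allows up to 50,000 URLs per file); upload it to your site root and submit its URL in Webmaster Tools.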