Dear members...this is my first query. My website has 216 crawl errors when checked with Google Webmaster Tools, but the problem is there are no such files on the server, yet the tool reports them as errors. How do I get rid of this, and how does it affect my website? Please clarify.
Yes, if the search engine robots cannot crawl your site accurately, it will hurt the amount of traffic you get and ultimately how well your site does. Crawlers check links, meta tags, and a few other things, but if they cannot access them, your site will not do well.
I have run into the same problem as you, but I didn't find an effective way to correct the errors. Have you validated your website with the W3C validator? You can also find the error documents via FTP.
lonking..I never tried the W3C validator. I will check and come back. But most of the errors weren't on the server. Why this happens is what drives me mad.
Since 216 is a big number, I think you'd better check it out. Or you can try the software Xenu; it will list your website's links and tell you which ones are broken.
It depends. Are you using SEOmoz at all? The free membership can diagnose crawl errors and recommend many ways to fix them.
Thanks lonking. Carter...I never tried SEOmoz. I will run its diagnostics and come back if the problem persists.
If they're just page-not-found 404 errors, Google says not to worry about them; they won't affect your site. If the missing pages have PageRank, you should 301-redirect them to other relevant pages.
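If your site runs on Apache, one common way to set up those redirects is in an `.htaccess` file. A minimal sketch (the paths below are made-up examples, not from the poster's site):

```apache
# Hypothetical example: permanently redirect a removed page to its replacement.
Redirect 301 /old-page.html /new-page.html

# Or, with mod_rewrite enabled, redirect a whole retired directory:
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

A 301 (permanent) redirect tells search engines to pass the old page's link value on to the new URL, which is why it's preferred over a 302 here.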
I guess you can just leave the 404 errors; when search engines recrawl your site, they simply won't find the pages that returned errors, and the entries will eventually drop out.
Check with a tool like Xenu for broken links and then fix them accordingly. The number seems big, but if you work through them, those errors will be reduced.
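If you'd rather not install anything, the core of what Xenu does can be sketched in a few lines of Python: request each URL and record the HTTP status, so anything returning 404 stands out. This is a minimal sketch (the example URL is just a placeholder for your own page list):

```python
# Minimal broken-link checker sketch: fetch each URL and record its HTTP
# status, similar in spirit to what Xenu does at scale.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_links(urls):
    """Return a dict mapping each URL to its status code (or an error string)."""
    results = {}
    for url in urls:
        try:
            # A HEAD request learns the status without downloading the body.
            req = Request(url, method="HEAD")
            with urlopen(req, timeout=10) as resp:
                results[url] = resp.status
        except HTTPError as e:
            results[url] = e.code          # e.g. 404 for a missing page
        except URLError as e:
            results[url] = str(e.reason)   # DNS failure, refused connection, etc.
    return results

if __name__ == "__main__":
    # Replace with the URLs reported in Webmaster Tools.
    for url, status in check_links(["https://example.com/"]).items():
        print(url, status)
```

Feed it the URLs from the Webmaster Tools error report: anything that comes back 404 genuinely doesn't exist on the server, and anything that comes back 200 was likely a transient error you can ignore.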
Thanks a lot stubsy..I tried many ways and couldn't find the right answer, and your reply made my weekend. Thanks a lot.