http://www.site-reference.com/articles/Search-Engines/Valid-HTML-Does-Google-Care.html Found this via ThreadWatch. Your thoughts?
No, Googlebot is very advanced and doesn't have any trouble reading HTML even if it's not W3C compliant. Valid HTML should be used for user experience and browser compatibility.
Code quality is not related to content quality. Your visitors don't care whether your code validates or not (provided that the page doesn't break!); they are interested in the content. Google is interested in serving the best content for a user's query.
Hmm... did you guys read the link at all? It describes an experiment whose results hint that INVALID HTML might in fact be better for ranking.
I think it is not simply syntax that affects Google rankings, but page structure. For instance, if you use bold, italic, or font size to fake a heading instead of using an actual heading element, that will affect your ranking. If you use tables for layout and your content order makes little or no sense, that will affect your ranking. If you convey information through images and do not provide a textual alternative, that will affect your ranking (see the sketch at the end of this post).

The errors on the invalid-HTML sites are relatively trivial syntax errors. *All* four sites are well structured. *All* four sites use CSS, rather than tables, for layout. *All* four sites provide a textual alternative for every image.

A correct page is more than just markup that passes a validator. There are many recommendations and requirements that a mere syntax checker cannot verify, and the invalid-HTML sites in the Google study followed all of the recommendations and requirements that the validator does not check for.
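To make the structure point concrete, here is a minimal sketch of what I mean. The heading text and image filename are hypothetical examples of mine, not markup taken from the sites in the study:

    <!-- Fake heading: looks like a heading to humans, but carries
         no structural meaning a crawler can use -->
    <p><b><font size="5">Our Products</font></b></p>

    <!-- Real heading: conveys the document outline to crawlers
         and assistive tools -->
    <h2>Our Products</h2>

    <!-- Image with a textual alternative, so the information
         survives even when the graphic is not rendered -->
    <img src="products.png" alt="Overview of our product line">

Both versions look roughly the same in a browser, but only the second tells a machine what the content actually is. Note that the fake-heading version can even be syntactically valid; structure and validity are separate questions.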
That is a very interesting article. The experiment does seem to indicate that the Google algorithm favours bad/invalid HTML; however, I think there would need to be a much bigger sample for conclusive results. Regardless of the implications for SEO, I intend to continue with well-formed code. T
Yes, I had an intense argument about this topic recently. The end result: no, the robots will still come even if you have hundreds of W3C validation errors. Finally I can sleep in peace.