Something I have been looking at is cleaning up my site, and at the same time it got me wondering whether being W3C compliant made any difference to my SEO efforts. It seems the answer is a big NO! And here, for reference, is where Google confirms it (Adam Lasnik is a Google SEO Strategist):

Eric Enge: Let's talk about some other things that I think people are confused about. Let's talk about a scenario with a webpage that is built in some dynamic fashion, where maybe there are two thousand lines of code for what is ultimately a relatively simple page, and the unique content of that page, the text and links and things that are really specific to that page, is buried seventy or eighty percent of the way down in the file. Can this kind of thing hurt your rankings?

Adam Lasnik: I have both bad news and good news in this area. The bad news is that every time you create a page that is this crufty, someone up there kills a kitten. There are so many great reasons to decruft your web pages. They will load faster, and in some cases a lot faster. It will likely improve your users' experience. And, as you can imagine, when the user's experience improves, it is not inconceivable that you will get more links to your site. With that said, we do not take that into account in our own indexing and ranking, unless it is so incredibly challenging for the Googlebot to follow. And, by that I actually mean really bad HTML with lots of accidentally open tags. But, I don't think that's what you are getting at. As I understand, you are talking about huge patches of JavaScript at the top, with the actual content of the page being pushed really far down in the file, and that is something that we really cannot look down upon, for the very same reason that we do not penalize or treat differently those pages whose HTML won't validate very nicely. With page validation, it's the same argument: there are many great reasons to have your site validate, and to do validation checking.
It can help your site, and it could be more accessible to a lot of different people and browsers. But here is the core problem with why we cannot use this in our scoring algorithms currently: there are a ton of very high quality pages and sites, from universities, from research institutions, from very well respected ecommerce stores (of which I won't name any), that are really crufty and that won't validate. On some of these you can view the source and cry. And, because this is quality content, we really can't use that as an effective signal in search quality. So, you can quote me as saying: I would be thrilled, it would make my day, if people would decruft their sites, but it's not going to directly affect their Google ranking.

So, there is a reason to do it and try to get your site as clean as you can, but it will NOT impact your SEO efforts.

Andy
I agree, having valid markup is not a huge factor when it comes to SEO and getting a good SERP position at Google. Nevertheless, I always make an attempt to ensure my websites are W3C compliant with error-free markup. In my opinion, webmasters should always do whatever it takes to make websites easy for search engine bots and spiders to crawl and interpret, which includes having valid markup code.
W3C validation does not make much of an impact on ranking, but proper validation does make a website easier to crawl. It is good practice to make your website W3C valid.
It is good practice, and I agree you should try to keep your site as clean as possible, but the point here is that it has no impact on any SEO efforts at all, other than that it should be done as best practice.
Agreed - W3C validation currently does not have any impact at all on SERPs. As long as your site can be crawled by the search engines you'll be fine. There has been talk recently (following the Las Vegas PubCon) that how quickly a webpage loads may be introduced as part of the ranking algorithm, although this factor is not expected to carry too much weight.
Search engine spiders skip the HTML, CSS, and JavaScript code, so W3C validation is rather irrelevant to SEO. Having said that, it is good practice to be W3C validated, because it tends to keep your code cleaner and reduce the download time. A lot of purists are crazy about W3C validation, and that is not necessary.
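To illustrate the point about spiders skipping markup, here is a minimal sketch of how a crawler might discard tags and keep only the visible text it will index, using Python's standard html.parser. The class name and the sample page are my own inventions, not taken from any real crawler.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text content, skipping <script> and <style> bodies."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0  # >0 while inside a <script>/<style> element

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        # Keep only non-blank text that is outside script/style blocks
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

    def text(self):
        return " ".join(self.chunks)

page = ("<html><head><style>p{color:red}</style></head>"
        "<body><h1>Hello</h1><p>World</p></body></html>")
parser = TextExtractor()
parser.feed(page)
print(parser.text())  # Hello World
```

Note that the stylesheet text never reaches the output: to this kind of user agent, only the content matters, which is exactly why validation alone neither helps nor hurts the text a spider sees.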
People go on and on and on about on-site optimization. Simply following W3C recommendations will in fact cover 99% of on-site optimization - things like <hx> tags, <title> elements, etc. Understanding this, I don't see how people can argue against standards-compliant markup. If you want the SE's to easily digest your pages, what other approach are you gonna use?
Google itself is not compliant with W3C recommendations, so having good content outweighs good markup.
Sorry, but that is very wrong - 99%? What about the most important part of on-page, which is content? Google have already said that being W3C compliant doesn't matter!
Get a clue, man. Do you use markup around your content? Do you suppose that compliant markup is easier for spiders to digest and for computers to derive meaning from? Or do you think endless nested tables for layout, open tags, etc. work better?

<h1>The Importance of Proper Markup</h1>
<p>Amongst other things, <a href="http://www.w3.org">proper markup</a> helps with:</p>
<ol>
  <li>Cross-browser rendering</li>
  <li>Accessibility</li>
  <li>Etc.</li>
</ol>
<p>Numbers one and two are extremely important when one pauses to consider that SE bots are really nothing more than an alternate form of user agent - one used by an <strong>extremely stupid and concrete</strong> user (a computer).</p>
<p><em>The Moral of the Story:</em> Proper (i.e. compliant) markup can only help the SE's in determining what your page is all about.</p>

So, feel free to write crappy code if you so desire. I will see you at the top of the SERP's.
If you read my other posts, you will see I said that I write clean sites, and I agree that all should be - but to say that 99% of on-page optimisation can be covered by W3C standards sounds like something gleaned out of an e-book! So you are happy leaving 1% to your page content? What the hell is a search engine going to gain from visiting your site with no content? Yup - you are right, you will see me at the top of the SERP's!
This is an inane question. The real question is, "What are you going to wrap your content in?" Or are you one of those quacks that aims for specific keyword densities, HTML-to-text ratios and other relics that hark all the way back to AltaVista (if you even remember that far back)? Here's the clue:
1. Write for your visitor, not the SE's. Remember, on the web users don't read, they scan. Make the important stuff stand out with headings, lists and bolded statements.
2. Use markup that meets recommended standards.
3. Separate content from presentation.
4. Give up the glitter. Remember, form follows function. If you don't know the function of your site, start over from scratch.
Follow those 4 easy guidelines and the SE's will know exactly what the page is about. Ignore them at your own peril... To assert that "it will NOT impact your SEO efforts" is, in a word, clueless. All things being equal (they never are), the site that is standards compliant will rank better than the one that isn't.
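The "make the important stuff stand out with headings" advice above has a checkable side: well-structured pages descend one heading level at a time (h1, then h2, then h3). Here is a minimal sketch of such a check with Python's stdlib html.parser; the class name and the sample snippet are my own assumptions, not a real validator's API.

```python
import re
from html.parser import HTMLParser

class HeadingChecker(HTMLParser):
    """Flags headings that skip a level (e.g. an <h3> right after an <h1>)."""

    def __init__(self):
        super().__init__()
        self.problems = []
        self._last = 0  # level of the previous heading seen, 0 = none yet

    def handle_starttag(self, tag, attrs):
        m = re.fullmatch(r"h([1-6])", tag)
        if not m:
            return
        level = int(m.group(1))
        # Jumping more than one level deeper breaks the outline
        if self._last and level > self._last + 1:
            self.problems.append(f"h{level} follows h{self._last}")
        self._last = level

checker = HeadingChecker()
checker.feed("<h1>Title</h1><h3>Oops, skipped h2</h3><h3>Fine</h3>")
print(checker.problems)  # ['h3 follows h1']
```

A real validator checks far more than this, of course, but the point stands: document structure is mechanical enough that both you and a bot can verify it.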
Hehe, it is clear you don't really know what you are on about when it comes to search engine marketing, which I have been doing for more than 9 years now, and as such I refuse to be baited by someone who is clearly here just to try and pick arguments - very school playground! For myself, I have more than 9,000 1st-page Google pages under my belt, so I do know what I am talking about. I did try to look at your site, but I notice it isn't working.
And can I just point out that Google have also confirmed this is not the case - the internet is littered with sites that are badly written but get that coveted 1st place, above those sites that are well written but with poor / no real content. If you have not seen these, then I really suggest you do a little searching and see for yourself.
IMO, W3C compliance may have some effect, because good site structure and navigation help crawlers index the information on your website more easily.
Perhaps do a test: take a page that has been holding tight on Google and make it fail validation. We'd all have a stroke if it went to #1.
As long as the links can be crawled, Google will pay little or no attention to a bad design - they have confirmed this. If the content can be read, it could get indexed (never any guarantee here, though). The W3C aspect comes into play because you should have a well designed and laid out site, and with this bit I agree 100% - this is how I build my sites! I shall give you an example: one of my customers has a directory that throws up more than 250 W3C errors on the front page alone, yet they rank 1st page, 1st place for hundreds of well-contested phrases. We all know (well, most will) that Google has more than 200 metrics that it uses when looking at a site and deciding how to rank it. We don't know what most of these are, so perhaps a well laid out site that is W3C compliant makes a difference, but if it does, it will be a very small portion of the overall score.
It's unnecessary to comply with W3C standards to achieve good rankings - Matt Cutts says so himself. Hope this helps.
The fact that one can rank well with lousy markup does not mean that one cannot rank better with standards-compliant markup. Further, if you go back to my first post on this you will see what I stated. I have to wonder how many here have ever read the W3C recommendations or even fully understand what I said:
- Title element
- H tags
- Document structure
Following W3C recommendations means a whole lot more than running a page through a validator. It suggests a deeper understanding of semantics and structure than some posters in this thread seem to have.
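Those three items in the list above (title element, H tags, document structure) are exactly what a bot can lift from a semantically marked-up page. As a minimal sketch, assuming a made-up class name and sample page of my own, Python's stdlib html.parser can pull out that outline:

```python
from html.parser import HTMLParser

class OutlineParser(HTMLParser):
    """Extracts the <title> text and an outline of (level, text) headings."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.headings = []    # list of (level, text) pairs
        self._capture = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        if tag == "title" or (len(tag) == 2 and tag[0] == "h" and tag[1].isdigit()):
            self._capture = tag

    def handle_data(self, data):
        if self._capture == "title":
            self.title += data
        elif self._capture:
            self.headings.append((int(self._capture[1]), data.strip()))

    def handle_endtag(self, tag):
        if tag == self._capture:
            self._capture = None

p = OutlineParser()
p.feed("<title>My Page</title><h1>Intro</h1><h2>Details</h2>")
print(p.title, p.headings)  # My Page [(1, 'Intro'), (2, 'Details')]
```

If the same words were buried in nested layout tables or styled <div>s instead, nothing in the markup would tell this (or any) parser which text is the title, which is a heading, and which is body copy - which is the whole argument for semantic, standards-minded markup.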