I was reading an article on Squidoo and came across this: "10. Have a high content-to-code ratio. Your pages should have a high content-to-code ratio, also known as a high signal-to-noise ratio." The author goes on to say that search engines are not fond of pages that have more code than content, and that they've implemented a specific ratio in their algos to look for such things. Never heard of this before... sounds like another urban legend. Does anybody have any confirmation and/or proof?
It's classic SEO - if the bot has to crawl through loads of code to get to the content, it underestimates the importance and relevance of that content. There's no urban legend about it - clean code should be near the top of anyone's SEO checklist.
As said above, NO urban legend - simple reality. Fewer (or no) gadgets plus quality, unique content is the winner, along with fast-loading pages, a fast server, etc.
It IS a legend, imho. Google doesn't care even if your site fails W3C validation and the HTML is a mess. Why would it care about the percentage of code in the source? Have you tested this?
This theory - or fact, in this case - has been around for a long time now. It is true, and it is kind of common sense if you think about it.
I think it relates to how fast a bot/spider can crawl your site... if it takes a long time just to index a page, that's not something they'll be happy with. It's a major thing to consider.
"It is true because..." (put your arguments in). Have you done any experiments on this? My common sense says Google is smart enough not to mark a site down just because it uses (for example) a rubbish table layout instead of modern CSS. If they both provide relevant content, they will both rank well, no matter what percentage of the source the code takes up. Open the Yahoo homepage: it contains about 150k of code and only 3k of pure text, i.e. roughly 2% text and 98% code. Quite a bad ratio, isn't it?
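In case anyone wants to check numbers like that for themselves, here is a rough sketch of how you could estimate a page's content-to-code ratio with plain Python (standard library only). The URL is just a placeholder and the text extraction is deliberately crude, so treat the percentage as a ballpark figure, not anything the search engines are confirmed to compute.

from html.parser import HTMLParser
from urllib.request import urlopen

class TextExtractor(HTMLParser):
    # Collects visible text, skipping <script> and <style> blocks.
    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.chunks.append(data.strip())

def content_to_code_ratio(url):
    # Ratio of visible text characters to total HTML source characters.
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(chunk for chunk in parser.chunks if chunk)
    return len(text) / len(html) if html else 0.0

if __name__ == "__main__":
    ratio = content_to_code_ratio("https://www.example.com/")  # placeholder URL
    print("Visible text is about {:.1%} of the page source".format(ratio))

Run that against a few of your own pages and you'll see the same kind of 2%-vs-98% split described above; whether that number actually influences rankings is exactly what's in question here.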