KEYWORD DENSITY When optimizing a page's content, do you care more about the keyword density or about the raw number of times the keyword is repeated? Which counts more?! Like... a website can have for KEYWORD A1 a density of 0.1% but the document is very big... so the keyword is repeated 500 times... in another document KEYWORD A1 has a density of 5%, but is repeated only 5 times... Also, will Google pay more attention if the keyword occurrences are clustered close to each other, like "A1 was first created in 1985. A1's inventor was John. The A1 project was sold in 2007 to George, who created the A1 Company.", or if the keyword is spread all over the document?!
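To make the trade-off in the question concrete, here is a minimal sketch (the numbers are just the hypothetical examples from the post above) showing how density and raw count pull in different directions depending on document length:

```python
# Keyword density is just: occurrences / total words, as a percentage.
# The same raw count can mean wildly different densities.

def keyword_density(occurrences: int, total_words: int) -> float:
    """Return keyword density as a percentage of total words."""
    return 100.0 * occurrences / total_words

# A huge document: 500 repetitions in 500,000 words -> 0.1% density
big_doc = keyword_density(500, 500_000)

# A tiny document: 5 repetitions in only 100 words -> 5% density
small_doc = keyword_density(5, 100)

print(big_doc)    # 0.1
print(small_doc)  # 5.0
```

So the first document repeats the keyword 100x more often in absolute terms, yet has 50x lower density — which is exactly why "density or count?" is an ambiguous question.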
Hey dOOd - no one uses KW density any longer... start looking into Latent Semantic Indexing (Analysis) and phrase-based optimization. This is the way of things these days - KW density has been phased out since around 2002-03. Just thought I'd mention that 4 ya - Peace
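For readers who haven't met LSI/LSA before: the core idea is to factor a term-document count matrix (typically with a truncated SVD) so terms that appear in similar contexts land near each other in a low-rank "concept" space, rather than being scored by raw density. A toy sketch, with entirely made-up counts for illustration:

```python
# Toy illustration of the idea behind LSI/LSA: a truncated SVD of a
# term-document matrix places contextually related terms close together,
# even if they never co-occur in the same document.
# All counts below are invented for the example.
import numpy as np

terms = ["car", "auto", "engine", "recipe", "flour"]
# Rows = terms, columns = 4 documents (counts per document).
counts = np.array([
    [2, 0, 1, 0],   # car
    [0, 3, 1, 0],   # auto
    [1, 1, 2, 0],   # engine
    [0, 0, 0, 4],   # recipe
    [0, 0, 0, 2],   # flour
], dtype=float)

# Truncated SVD: keep k=2 latent "topics".
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]   # each term as a 2-d concept vector

def cos(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "car" and "auto" never appear in the same document above, yet they end
# up close in concept space because both co-occur with "engine", while
# "car" and "recipe" stay far apart.
print(cos(term_vecs[0], term_vecs[1]))  # high (near 1)
print(cos(term_vecs[0], term_vecs[3]))  # low (near 0)
```

This is only a sketch of the retrieval-side idea, not a claim about what any particular search engine ships in production.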
Well... backlinks without content? Look, just do this SEO experiment: remove all the content from your website, and see how well you rank then.
What R U talking about M8 - U lost me there.... Did you READ the links? It is ALL about content... just not constructing simplified spammy KW density content. ...am I missing something?
I used to worry a lot about keyword density, but no longer. I find it produces text that is often not of the quality it would have if written naturally.
Consider keyword density to avoid getting filtered for over-optimization. Place your most important keywords as high on the page as possible.
Oh come on... based on what technical documents? Where are ya getting this? Google and the other SEs are using phrase-based IR these days, and I can give ya data on that (already posted 2 links). So where is this KW density stuff? None of the current indexing and retrieval technologies I've read up on have ANYTHING to do with KW density - so I have no idea why folks prattle on about it. All the evidence shows that SEs have moved away from such simplistic processes... Oh well... happy KW stuffing
Man, for big websites... for portal sites... content is still king, OK?! I don't know how it is for small sites, but in my experience, if you get a lot of links to your main page, it will pass enough "rank power" to all your subpages... and then what counts more?! The content, no?! Take a look at Wikipedia: they rank very high for lots of keywords. Why?! Come on, Google is smart enough to rank "websites" and not just pages, as most of you think. It's true that inner pages optimized with backlinks can compete... but... in general... the sub-category pages of very big websites rank very high. What's more, Google/Yahoo will usually prefer to rank your deep pages over your front page, even if the front page is full of backlinks and the deep pages have no backlinks pointing to them... A website is like a book: the main page (index.html) is the cover... then it has categories, sub-categories, and content... Google/Yahoo will hunt for the content, not the cover...
Keywords and key phrases are probably the same thing; the difference is just the number of separate words, right? What will happen to your page if you use your keyword or key phrase too many times, even with variations? And besides, the phrase-based optimization described in the second article does not rule out filters that penalize for over-optimization, does it? Explain your opinion, please. I think I'm missing the point, or we are misunderstanding each other.
I think thegypsy is saying that LSA/LSI and phrase-based optimization are used to "count" the keywords/phrases, not a simple percentage. S/he is not disagreeing about content, just saying that search engines "count" keywords/phrases in a more complex way than just a percentage. (Yes, I know there is much more to LSA and phrase-based optimization than counting...)
Wow - thanks... U rock. Precisely - I am talking about how we approach issues dealing with content and targeting. We do not think in the simple terms of KW density from the days of yore. Search technologies have evolved, as have search users' behavior (and how they query). Think more in terms of phrasing, relevance, and occurrences than simply KW density. As far as this goes: it sounds like something U read somewhere, not a technical aspect of the algos. To begin with, penalties and filters are 2 different beasts. Second, there are no 'over-optimization' calculations or inferences in any technical search engineering documentation I have come across. If you can refer me to some, I shall consider the concept. U will find I do not subscribe to much so-called 'common' SEO terminology and theory (can U say 'sandbox') - if there is scarce documentation or other evidence, I don't consider it. I live in a world of technical search engineering aspects and hands-on experience... trendy terms and baseless theories mean little to me. I ain't busting anyone's chops, just letting U know I am not necessarily going along with every illogical statement I run into.
Thanks for taking the time to make things clear for me. I appreciate your opinion but still disagree with some statements. Just tell me why you dislike the term "over-optimization", and what word you would use instead to describe the reason behind all those penalties?
I just prefer definitive terms... OOP is like the 'sandbox': there are way too many 'opinions' on what it means. So I take it out of my lexicon and move on. I just hate SEO terms that are moving targets - it's hard to work that way. I have no problem agreeing with the concept of what folks call OOP, I just don't subscribe to the 'terminology', that's all... In truth, there are not that many penalties out there... far more filters affecting folks than actual penalties. So which OOPs do you believe there are? (OOP = Over Optimization Penalty, for those wondering)
Write original content, don't look at keyword density while writing, and don't insert keywords that don't fit the content.