In simple terms: Google won't rank a site for a word just by analyzing how many times you've used a specific keyword. It also looks at how many other words on that page relate to that word, plus how incoming and outgoing links relate to it. So if I want to rank for "car" and I make a page about cars, Google will take an LSI-style approach and analyze related words like "automobiles" and "vehicles" on that page. In the same fashion, Google will analyze where links are coming to that page from, and what links are going out from it.
Latent Semantic Indexing, a vector-space approach to conceptual information retrieval, is quite useful in situations where traditional lexical information retrieval approaches fail. LSI estimates the semantic content of the documents in a collection and uses that estimate to rank the documents in order of decreasing relevance to a user's query. But I'm not really sure that Google actually uses an LSI approach in its algorithm...
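For anyone curious what the vector-space approach actually looks like: here's a minimal toy sketch of LSI using NumPy. The term list, term-document counts, and query are made-up illustration data, not anything Google uses; the technique itself (truncated SVD of a term-document matrix, then cosine ranking in the reduced "concept" space) is the standard textbook LSI recipe.

```python
import numpy as np

# Toy vocabulary and term-document count matrix (rows = terms, cols = docs).
# Docs 0-2 are about cars; doc 3 is about fruit. All numbers are invented.
terms = ["car", "automobile", "vehicle", "banana", "fruit"]
A = np.array([
    [2, 0, 1, 0],  # car
    [1, 1, 0, 0],  # automobile
    [0, 2, 1, 0],  # vehicle
    [0, 0, 0, 2],  # banana
    [0, 0, 0, 1],  # fruit
], dtype=float)

# Truncated SVD: keep only the top-k latent "concepts".
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

# Each document becomes a k-dimensional concept vector.
doc_vecs = (np.diag(sk) @ Vtk).T  # shape (n_docs, k)

def rank(words):
    # Fold the query into the same concept space (q_k = q^T U_k),
    # then sort documents by cosine similarity, most relevant first.
    q = np.array([1.0 if t in words else 0.0 for t in terms]) @ Uk
    sims = doc_vecs @ q / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q) + 1e-12
    )
    return np.argsort(-sims)

# A query for "car" ranks the automotive docs above the fruit doc,
# even though doc 1 never contains the literal word "car".
order = rank({"car"})
```

The point of the exercise: because "car", "automobile", and "vehicle" co-occur across documents, they collapse onto the same latent concept, so a document can rank for "car" without containing the word. That's the co-occurrence effect people in this thread are describing.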
Oh good lord - the last time someone mentioned LSI it practically started a war around here! Basically, there is no evidence that Google is using LSI. What is likely is that they are using a much more basic and crude factor in their algorithm that analyzes some semantic indicators, such as synonyms.
Well, as long as you're not in the inner circle of Google, all you can do is guess. My experience after hundreds of split-testing projects is that the LSI strategy works best, so that's the road I'm taking. And besides, the semantic web is just around the corner, so we'd better be prepared.
I've had the same experience as fxdust. My testing has shown that sites built using LSI rank better than non-LSI sites. Until someone can prove otherwise, I'll stick with the LSI plan.