It's a technique Google uses that lets a search engine know what a page is about beyond specifically matching the search query text. This link will help you understand more about LSI: seobook.com/archives/000657.shtml
LSI is also an application of correspondence analysis, a multivariate statistical technique developed by Jean-Paul Benzécri[2] in the early 1970s, to a contingency table built from word counts in documents.
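If you want to see the mechanics, here is a minimal sketch of that idea: a tiny, made-up contingency table of word counts decomposed with a truncated SVD, which is the core operation behind LSI. The terms, the counts, and the choice of two components are all illustrative assumptions, not real data.

```python
import numpy as np

# Toy contingency table: rows = terms, columns = documents,
# entries = raw word counts (hypothetical numbers for illustration).
terms = ["muscle", "building", "gaining", "recipe", "baking"]
counts = np.array([
    [4, 3, 0],   # "muscle" appears in docs 1 and 2
    [2, 0, 0],   # "building" appears only in doc 1
    [0, 3, 0],   # "gaining" appears only in doc 2
    [0, 0, 5],   # "recipe" appears only in doc 3
    [0, 0, 2],   # "baking" appears only in doc 3
], dtype=float)

# LSI boils down to a truncated singular value decomposition
# of the term-document matrix.
U, s, Vt = np.linalg.svd(counts, full_matrices=False)

k = 2  # keep only the two strongest latent "concepts"
term_vectors = U[:, :k] * s[:k]  # each term as a point in concept space

for term, vec in zip(terms, term_vectors):
    print(f"{term:>10}: {np.round(vec, 2)}")
# "building" and "gaining" land close together in concept space because
# both co-occur with "muscle", even though they never share a document.
```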
LSI is a system used by Google and other major search engines. The contents of a webpage are crawled by a search engine, and the most common words and phrases are collated and identified as the keywords for the page. LSI then goes a step further and looks for synonyms of those keywords as well.
LSI is used for indexing text. In SEO, LSI refers to the analysis of text as part of ranking algorithms. Search engines use LSI to measure quality and relevancy; with its help, a search engine can determine whether text was generated naturally or artificially.
LSI means extracting relationships between the terms and concepts contained in an unstructured collection of text, through a combination of software and mathematical processing. I suspect you are asking about latent semantic analysis (LSA), which Google is said to use: an algorithm for making search more user-friendly by understanding small variations in the words people search for. There's a short blog post on this at copperbridgemedia.com/blog/
No search engine uses LSI. They don't have the capacity to algorithmically map the relationship of every word to every other word. What they do use are cruder semantic matching capabilities, which evaluate whether words are being used in an expected, relevant context.
LSI popped up a few years ago and has been used incorrectly ever since. Google determines relationships between words by their frequency and placement on a web page. It is more correct to say that Google "themes" websites based on relationships and relevancy, but actual latent semantic indexing of billions of pages would take more computer power than currently exists on earth.
*clasps to bosom* - somebody who talks sense! But it's the people who post incoherent, incorrect rubbish that get the thanks.
Latent semantic analysis (LSA) is a technique in natural language processing, in particular in vectorial semantics, of analyzing relationships between a set of documents and the terms they contain by producing a set of concepts related to the documents and terms.
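For anyone who wants to poke at that definition, here is a minimal sketch using scikit-learn; the four-document corpus and the two-component setting are made-up assumptions purely for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "building muscle with weight training",
    "gaining muscle through strength workouts",
    "baking a chocolate cake recipe",
    "an easy recipe for baking bread",
]

# Step 1: build the term-document matrix (tf-idf weighted counts).
tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)

# Step 2: truncated SVD collapses the matrix onto a handful of
# latent "concepts"; each document becomes a point in concept space.
lsa = TruncatedSVD(n_components=2, random_state=0)
doc_vectors = lsa.fit_transform(X)

print(doc_vectors.round(2))
# The two fitness documents land near each other on one concept and the
# two baking documents on the other, despite limited word overlap.
```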
Interesting. I never thought of it this way, but it makes sense if one has to analyse billions of webpages. Hugo Guzman published a very interesting post on this topic, claiming that Google does not use LSI. I recommend you read it.
Pretty much that in a nutshell. It would be like if I had a page about "Building Muscle" and yet I still ranked for "Gaining Muscle". The two phrases are pretty much identical to me or you, but back in the day search engines did not have a "thesaurus", per se, to check the meaning behind each word.
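To put a number on that synonym effect, here is a toy sketch (the corpus, queries, and component count are all assumptions for illustration): the two phrases overlap on only one word in raw keyword space, but in LSA concept space they come out nearly identical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "building muscle with weight training",
    "gaining muscle through strength workouts",
    "tips for building muscle and gaining strength",
    "baking a chocolate cake recipe",
    "an easy recipe for baking bread",
]

tfidf = TfidfVectorizer()
X = tfidf.fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit(X)

# Fold the two phrases into keyword space and into concept space.
queries = tfidf.transform(["building muscle", "gaining muscle"])
concepts = lsa.transform(queries)

raw_sim = cosine_similarity(queries)[0, 1]   # phrases share only "muscle"
lsa_sim = cosine_similarity(concepts)[0, 1]  # nearly the same concept

print(f"raw keyword similarity: {raw_sim:.2f}")  # well below 1.0
print(f"LSA concept similarity: {lsa_sim:.2f}")  # close to 1.0 here
```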
LSI: Latent Semantic Indexing. To cut it short, Google will 'understand' content by finding relationships between words and phrases. Search online and you'll find some detailed info on this and how it's done, including many PDF files on the subject from some good universities.