LSI - Latent Semantic Indexing

Before LSI Was Implemented

When a user types a keyword into a search engine, the engine searches through its collection of documents for that keyword. It checks for the exact keyword in all the documents, returns only those that contain it, and ranks them with some ranking algorithm. However, there are two fundamental problems in natural language processing: synonymy and polysemy. With synonymy, different writers use different words to describe the same idea, so a person issuing a query may use a different word than the one that appears in a document and fail to retrieve that document. With polysemy, the same word can have multiple meanings, so a searcher can get unwanted documents with the alternate meanings. To address this problem of presenting only relevant search results, Latent Semantic Indexing (LSI) was proposed as an improved retrieval method. Knowledge of LSI, or what we normally call the search result algorithm, is extremely important for webmasters.

Why Knowledge Of LSI Is Important

Latent semantic indexing helps search engines find out what a web page is all about. What it means for you:

- You shouldn't focus on a single keyword when optimizing your web pages and when getting links
- You should focus on content optimization with your relevant keywords and keyword phrases
- Since LSI incorporates both synonymy and polysemy, you should bear both in mind when targeting your keywords and rebuilding your content
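For the curious, LSI is a concrete linear-algebra technique, not just an SEO buzzword: it factors a term-document matrix with a singular value decomposition (SVD), keeps only the top few singular values, and matches queries to documents in that reduced "latent" space. Here is a minimal sketch with a toy, made-up term-document matrix (all terms and counts are hypothetical), showing how a query for "car" can match a document that only says "automobile":

```python
import numpy as np

# Rows = terms, columns = documents (toy raw counts, hypothetical data).
# Doc 1 mentions "car" and "engine"; doc 2 mentions "automobile" and
# "engine"; doc 3 is mostly about "bank".
terms = ["car", "automobile", "engine", "bank"]
A = np.array([
    [2, 0, 1],   # "car"
    [0, 2, 1],   # "automobile"
    [1, 1, 0],   # "engine"
    [0, 0, 3],   # "bank"
], dtype=float)

# Full SVD, then truncate to rank k (the "latent semantic" dimensions).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
Uk, sk, Vtk = U[:, :k], s[:k], Vt[:k, :]

# Fold the one-word query "car" into the latent space: q_k = q^T U_k S_k^-1
q = np.array([1.0, 0.0, 0.0, 0.0])
q_k = (q @ Uk) / sk

# Document coordinates in the latent space are the rows of V_k S_k.
docs_k = Vtk.T * sk

# Cosine similarity between the query and each document.
sims = docs_k @ q_k / (np.linalg.norm(docs_k, axis=1) * np.linalg.norm(q_k))
print(dict(zip(["doc1", "doc2", "doc3"], sims)))
```

Because "car" and "automobile" co-occur with "engine", doc 2 gets a high similarity score even though it never contains the literal word "car", while the "bank" document scores lower. That is the synonymy problem being addressed; whether any major search engine runs this exact computation at web scale is, as the replies below note, debatable.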
Despite the obvious attempt at self-promotion, I will say that most of what you're trying to say here is common knowledge to anyone with even a rudimentary grasp of semantic markup, Web copywriting and on-site search engine optimization. And as an interesting aside, before keywords were used, search engines actually used to search on the name of the file. But that's pretty much pre-1995 anyway.
I think freddie is fishing for some work. In speaking with people who know much more about SEO than I do, LSI is regarded more as a myth by real SEO pros. If Google were advanced enough to implement true LSI, there would be no need for webmasters to report each other for paid linking or anything else; Google would know who was doing what on their sites. There is still keyword spam and hidden text on websites ranking all across the web. This tells me that there is no real LSI in place.
Yup, it's not a myth. I rank very well and outrank hundreds of sites fast, thanks to LSI. By the way, kwbrowse.com is a tool you should use.
The Google (PR) toolbar uses a logarithmic scale with an unknown base to assign each page a number from 0 to 10. Various SEO experts have proposed this base to be anywhere from the natural logarithm base e up to 10, with 3, 6, and 8 among the suggested values. Knowing the accurate value of this base would be very helpful in Search Engine Optimization. This experiment is designed to find out the base of this logarithm. You can check at your own risk!
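To make the base question concrete, here is a small sketch of what such a logarithmic mapping would look like. The function, its `scale` parameter, and the raw score values are all hypothetical illustrations, not Google's actual formula; the point is only that under a log scale each extra toolbar point requires roughly `base` times more raw PageRank:

```python
import math

def toolbar_pr(raw_pr, base=8.0, scale=1.0):
    """Hypothetical mapping from a raw PageRank score to the 0-10
    toolbar value, assuming a log scale of the given base. 'scale' is
    the raw score corresponding to toolbar PR 0 (an assumption)."""
    if raw_pr <= scale:
        return 0
    # Small epsilon guards against floating-point rounding right at
    # exact powers of the base; cap the result at toolbar PR 10.
    return min(10, int(math.log(raw_pr / scale, base) + 1e-9))

# With base 8, each toolbar step needs roughly 8x more raw PageRank:
for raw in (1, 10, 100, 1000):
    print(raw, toolbar_pr(raw))
```

Under this model, testing candidate bases would mean checking which base best explains observed toolbar jumps between pages whose relative raw scores you can estimate; with an unknown base anywhere from e to 10, the raw-score gap between, say, toolbar PR 4 and PR 5 differs enormously.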
Hi Dan Schulz, I posted the content on LSI because I think it's important, as it can help you get better SERPs. You may want to look at my thread in Search Engine Optimization about "Thematic Optimization", where I briefly explain the usage of LSI for SERPs. Regards.