Hello fellow Digital Point members, I'm new around here and haven't introduced myself yet, but my name is Rob and I'm from the UK. The reason I'm asking here is that the smaller forums are not very interested in this and can hardly share any information with me about it. What I'm referring to is the semantic future of the search engines, in this case Google as the major and most famous one. Recently I'm reading more and more people talking about this here and there, but there are no major threads about it.

The semantic web will be about making the search engines think more like a human and understand the content easily, without the help of titles, meta tags, attributes etc., the way a human being would understand it. I'm actually a bit scared of this; it seems like the old SEO ways of optimizing your content will be gone, just like the ones that were used in the 90s, for those who remember (blank spaces filled with white text, keyword spamming, the time when meta tags really mattered, etc.). Right now we are in some maniacal state of building backlinks, and maybe slightly improving our code, without thinking much about the content.

I would ask anyone who has heard anything about this semantic future, and about the optimization of websites via the Latent Semantic Optimization model that I read about, to tell me and explain more, because I'm quite interested. General content optimization tips are welcome too, thank you. Rob.
LSI is just a way for search engines to check how good your content is; if you write good content you should have no problems. It's a way for search engines to rank websites based on actual content quality rather than quality assumed from backlinks.
But this is where Latent Semantic Optimization comes in. Many people, like live-cms, say you won't have problems if you write good content, but what really is good content? That's what the Latent Semantic Optimization model works on: optimizing your content in a way that is both friendly for your visitors and doubly friendly for the semantic search engines of the new generation. However, with LSI of the Second Generation, as it is called, the search engines don't end their ranking there; other things still matter: internal PageRank, backlinks, code, domains and almost everything else you know from "general" (excuse me) SEO. Still, the content carries 40-60% of the importance for your page ranking, and I bet this number will rise with time. It's time for Google to stop this mess where manipulating the search engines has become a whole science!
This is a very common misconception. The Semantic Web does not deduce meaning from the top down. From the World Wide Web Consortium: "What the Semantic Web technologies do is to define the 'language' with well understood rules and internal semantics, i.e., RDF Schemas, various dialects of OWL, or SKOS. Which of those formalisms are used (if any) and what is 'expressed' in those languages is entirely up to the applications. Ontologies may be developed by small communities, from 'below', so to say, and shared with other communities." Here you go: What is Latent Semantic Indexing (LSI)? The LSI Myth - Michael
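To make that less abstract, here is a tiny sketch of what "defining the language from below" can look like in practice. I'm using Python's rdflib and made-up example.org names purely for illustration; the W3C quote doesn't prescribe any particular tool:

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

g = Graph()
EX = Namespace("http://example.org/vocab/")  # hypothetical community vocabulary
g.bind("ex", EX)

# The community defines its own small RDF Schema "from below":
g.add((EX.Article, RDF.type, RDFS.Class))
g.add((EX.BlogPost, RDF.type, RDFS.Class))
g.add((EX.BlogPost, RDFS.subClassOf, EX.Article))

# A page then describes itself using that shared vocabulary:
g.add((EX.myPost, RDF.type, EX.BlogPost))
g.add((EX.myPost, RDFS.label, Literal("The semantic future of Google")))

print(g.serialize(format="turtle"))

The meaning comes from the explicitly published vocabulary and data, not from a search engine guessing at your prose.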
Well, I would say you're wrong, and you just googled two articles about LSI. Why don't you spend 10 minutes reading this article instead? It's quite a bit more advanced than the LSI Myth one and explains the LSI/LSA methods in a fully user-friendly way, plus an additional advanced one. The method the search engines are using is a modified version of LSI; it's not what you know from the 90s. LSI of the Second Generation, as they call it. And what about those conferences that have been held about Latent Semantic Optimization? In Paris? And last month in Germany? The second-generation LSI model is based on Latent Semantic Indexing extended with double and triple word terms, selected on the basis of the Content Syntax Index, and it also includes global and local term weighting.
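Just so we're talking about the same thing: nobody outside Google knows their exact pipeline, and I can't reproduce the Content Syntax Index part, but the classic LSI/LSA core is easy to sketch. Here's a rough illustration in Python with scikit-learn (my own choice of library and toy documents, nothing official): single, double and triple word terms are weighted with TF-IDF, which stands in for the global and local term weighting, and then reduced by SVD to a handful of latent "concepts", so documents get compared by concept rather than by exact keyword.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "how to optimise website content for search engines",
    "latent semantic indexing maps words and documents onto concepts",
    "build backlinks to raise your page rank",
]

# ngram_range=(1, 3) keeps single, double and triple word terms;
# TF-IDF plays the role of the local and global term weighting.
tfidf = TfidfVectorizer(ngram_range=(1, 3), stop_words="english")
X = tfidf.fit_transform(docs)

# Truncated SVD collapses the term-document matrix into latent concepts.
svd = TruncatedSVD(n_components=2, random_state=0)
concepts = svd.fit_transform(X)

# Documents are then scored by similarity in concept space,
# not by exact keyword matches.
print(cosine_similarity(concepts))

Whatever the "second generation" adds on top, the point is that the ranking signal comes from how the content itself clusters, not from meta tags or keyword counts.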
Actually, people can adapt to different environments... We'll manage with it. As to content, I'd like to say that a big number of sites don't care about the content, operating just with those tools that let them get a higher PR and optimize the site as quickly as possible. On the one hand it seems to be the fastest method to promote a site; on the other hand, many sites end up filled with rubbish. Implementing Latent Semantic Optimization would mean cleansing the Internet...
Yes, semantics could be the future of Google, as it may become other search engines' future as well. However, there are too many bugs and errors in current semantic technology, so no wonder Google still uses its backlink policy to rank websites. Five years ago (if not more) many people already believed the Google algorithm had been manipulated heavily by spam websites, so they created semantic search engines like hakia, but even today none of these search engines has made any promising moves to grab people's attention. I believe semantics could become the core of search engines, but it still needs improvement here and there. It may take a long time for people to start using semantic search engines (or for Google to dump its backlink policy and fully trust semantic technology to rank websites).
Glad we are back to discussing it: I wrote an article about it long ago - you may read my ideas on Semantic Future of Google
Websites are always going to be optimized for visitors, not for the search engines, whether it's the conventional way or semantic technology. Visitors and traffic are the aim, and I'm sure every search engine, and then the visitors, will accept a new technology if it's better.