Google's future as a semantic search engine.

Discussion in 'Google' started by Rabb, Apr 10, 2008.

  1. #1
    Hello fellow Digital Point members. I'm new around here and haven't introduced myself yet, but my name is Rob and I'm from the UK. The reason I'm asking here is that the smaller forums aren't very interested in this topic and can share almost no information with me about it.

    What I'm referring to is the semantic future of search engines, in this case Google as the major and most famous one. Recently I've been reading more and more people talking about this here and there, but I've seen no major threads about it. The semantic web is about making search engines think more like a human, understanding content easily without the help of titles, meta tags, attributes, etc., the way a human being would understand it.

    I'm actually a bit scared of this; it seems like the old SEO ways of optimizing your content will be gone, just like the ones used in the 90s, for those who remember (blank spaces filled with white text, keyword spamming, the time when meta tags really mattered, etc.). Now we are in some maniacal state of building backlinks, and maybe slightly improving our code, without thinking much about the content.

    I would ask anyone who has heard anything about this semantic future, and about optimizing websites via the Latent Semantic Optimization model that I read about, to tell me and explain more, because I'm quite interested. General content-optimization tips are welcome too. Thank you.

    Rob.
     
    Rabb, Apr 10, 2008 IP
  2. live-cms_com

    live-cms_com Notable Member

    #2
    LSI is just a way for search engines to check how good your content is; if you write good content you should have no problems.

    It's a way for search engines to rank websites based on actual content quality rather than content quality assumed from backlinks.
     
    live-cms_com, Apr 10, 2008 IP
  3. icetrax

    icetrax Peon

    #3
    But that's where Latent Semantic Optimization comes in. Many people say, as live-cms does, that you won't have problems if you write good content, but what really is good content? That's what the Latent Semantic Optimization model works on: optimizing your content in a way that is friendly for your visitors, and doubly friendly for the semantic search engines of the new generation.

    However, with LSI of the Second Generation, as it is called, the search engines don't end their ranking there; more things still matter: internal PageRank, backlinks, code, domains, and almost everything else you know from "general" (excuse me) SEO. Still, content has 40-60% importance for your page's ranking, and I bet this number will rise over time. It's time for Google to stop this mess of people manipulating the search engines with a whole science!
     
    icetrax, Apr 10, 2008 IP
  4. Michael

    Michael Raider

    #4
    This is a very common misconception. The Semantic Web does not deduce meaning from the top down.

    From the World Wide Web Consortium: "What the Semantic Web technologies do is to define the 'language' with well-understood rules and internal semantics, i.e., RDF Schemas, various dialects of OWL, or SKOS. Which of those formalisms are used (if any) and what is 'expressed' in those languages is entirely up to the applications. Ontologies may be developed by small communities, from 'below,' so to say, and shared with other communities."
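    As a small illustration of the W3C's point that meaning is stated explicitly rather than guessed: here is a toy sketch in plain Python (not real RDF tooling, and with a made-up vocabulary invented for the example) of how facts expressed as subject-predicate-object triples can be stored and queried.

```python
# A toy triple store: each fact is an explicit (subject, predicate, object)
# statement, in the spirit of RDF. The vocabulary below is invented for
# illustration -- real ontologies (RDF Schema, OWL, SKOS) define such terms
# formally so that different applications share the same meaning.
triples = [
    ("Google", "isA", "SearchEngine"),
    ("SearchEngine", "ranks", "WebPage"),
    ("WebPage", "hasProperty", "Content"),
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the given pattern (None = wildcard)."""
    return [
        (s, p, o)
        for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Ask "what do we know about Google?" -- the answer comes from explicit
# statements, not from the engine inferring meaning out of raw text.
print(query(subject="Google"))  # -> [("Google", "isA", "SearchEngine")]
```

    The point of the sketch is the direction of the data flow: applications publish machine-readable statements "from below," and the semantics live in the shared vocabulary, not in any top-down guesswork by the engine.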

    Here you go:

    What is Latent Semantic Indexing (LSI)?

    The LSI Myth

    - Michael

     
    Michael, Apr 11, 2008 IP
  5. icetrax

    icetrax Peon

    #5
    Well, I would say you're wrong, and that you just googled two articles about LSI. Why don't you spend 10 minutes reading this article, which is quite a bit more advanced than the "LSI Myth" one and explains the LSI/LSA methods in a fully user-friendly way, plus an additional advanced treatment.

    The method search engines are using is a modified version of LSI; it's not what you know from the 90s. LSI of the Second Generation, as they call it. What about those conferences held about Latent Semantic Optimization? In Paris? And last month in Germany? The second-generation LSI model is based on Latent Semantic Indexing, including double and triple words after a selection based on the Content Syntax Index. The model also includes global and local term weighting.
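    For what it's worth, the "global and local term weighting" mentioned above is a standard idea from the classic LSI literature: a term's weight in a document combines a local score (how often it occurs in that document) with a global score (how rare it is across the whole collection). Here is a toy sketch in plain Python; the tiny corpus and the particular weighting choices (log term frequency, inverse document frequency) are just one common textbook variant, not whatever any search engine actually uses.

```python
import math

# Toy corpus of three short "pages" -- illustrative only; real engines
# index millions of documents and combine many more signals than this.
docs = [
    "semantic search engines understand content".split(),
    "search engines rank pages by backlinks".split(),
    "good content helps semantic ranking".split(),
]

def local_weight(term, doc):
    """Local weighting: log-scaled term frequency within one document."""
    tf = doc.count(term)
    return 1.0 + math.log(tf) if tf > 0 else 0.0

def global_weight(term, corpus):
    """Global weighting: inverse document frequency across the corpus."""
    df = sum(1 for d in corpus if term in d)
    return math.log(len(corpus) / df) if df else 0.0

def weight(term, doc, corpus):
    """Combined weight: local score scaled by the term's global rarity."""
    return local_weight(term, doc) * global_weight(term, corpus)

# "content" appears in two of three documents, so its global weight is
# modest; "backlinks" appears in only one, so it scores higher there.
print(weight("content", docs[0], docs))    # ln(3/2) ~ 0.405
print(weight("backlinks", docs[1], docs))  # ln(3)   ~ 1.099
```

    In full LSI, a term-document matrix of such weights would then be factored with a truncated singular value decomposition so that documents using related vocabulary end up close together; the weighting step above is only the input to that factorization.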
     
    icetrax, Apr 12, 2008 IP
  6. jv17

    jv17 Guest

    #6
    I agree with that...
     
    jv17, Apr 14, 2008 IP
  7. Sam 735

    Sam 735 Well-Known Member

    #7
    Actually, people can adapt to different environments... We'll manage with it. As for content, I'd say a large number of sites don't care about their content at all, operating with just the tools that get them a higher PR and optimize the site as quickly as possible. On the one hand, it seems the fastest way to promote a site; on the other, many sites end up filled with rubbish. Implementing Latent Semantic Optimization would mean cleansing the Internet...
     
    Sam 735, Apr 14, 2008 IP
  8. wokaka

    wokaka Peon

    #8
    Yes, semantics could be the future of Google, and it may become other search engines' future as well. However, there are too many bugs and errors in current semantic technology, so it's no wonder Google still uses its backlink policy to rank websites. Five years ago (if not more), many people already believed Google's algorithm was being heavily manipulated by spam websites, so semantic search engines like Hakia were created, but even today none of these engines has made any promising move to grab people's attention. I believe semantics could become the core of search engines, but it still needs improvement here and there. It may take a long time for people to start using semantic search engines (or for Google to dump its backlink policy and fully trust semantic technology to rank websites).
     
    wokaka, Apr 14, 2008 IP
  9. arvind15290

    arvind15290 Banned

    #9
    I too agree with that.
     
    arvind15290, Apr 15, 2008 IP
  10. SEO_WatchDog

    SEO_WatchDog Well-Known Member

    #10
    Glad we are back to discussing it: I wrote an article about this a long time ago; you may read my ideas on the Semantic Future of Google.
     
    SEO_WatchDog, Apr 15, 2008 IP
  11. chaitanya.seo

    chaitanya.seo Banned

    #11
    Websites should always be optimized for visitors, not for the search engines, whether through the conventional way or semantic technology.
    Visitors and traffic are the aim,
    and I am sure every search engine, and then its visitors, will accept the new technology if it's better.
     
    chaitanya.seo, Apr 15, 2008 IP