Who (really) understands the Google Algorithm?

Discussion in 'Google' started by Lyn, Aug 28, 2008.

  1. #1
    According to Google:

    "The software behind our search technology conducts a series of simultaneous calculations requiring only a fraction of a second. Traditional search engines rely heavily on how often a word appears on a web page. We use more than 200 signals, including our patented PageRankâ„¢ algorithm, to examine the entire link structure of the web and determine which pages are most important.

    QUESTION: We have discussed links to relevant, reputable sites and a few other SEO issues. Which are these 200 signals? Can we compile a good list, so we have the best possible understanding?


    According to Google:

    Hypertext-Matching Analysis: Our search engine also analyzes page content.
    However, instead of simply scanning for page-based text (which can be manipulated by site publishers through meta-tags), our technology analyzes the full content of a page and factors in fonts, subdivisions and the precise location of each word.
    We also analyze the content of neighboring web pages to ensure the results returned are the most relevant to a user's query.

    QUESTION: Do you understand the issue with the "neighboring" pages? What about one-page web sites? And do you understand what they mean by the "precise (!) location of each word"?

    Thank you for your time.
     
    Lyn, Aug 28, 2008 IP
  2. Camay123

    Camay123 Well-Known Member

    Messages:
    3,423
    Likes Received:
    86
    Best Answers:
    0
    Trophy Points:
    160
    #2
    Meaning that if you talk about cat food in the first paragraph of your content, there is a chance your site is about cat food. More than if "cat food" only appears in the last words of your content.
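    A toy illustration of that idea (definitely not Google's actual code, just a Python sketch; the weighting scheme and sample pages are made up):

        # Illustrative only: a toy scorer that rewards a term for appearing
        # early in the page text, the way "cat food" in the first paragraph
        # would count for more than "cat food" in the last sentence.
        def position_weighted_score(text, term):
            words = text.lower().split()
            term_words = term.lower().split()
            n = len(words)
            score = 0.0
            for i in range(n - len(term_words) + 1):
                if words[i:i + len(term_words)] == term_words:
                    # earlier occurrences get a weight close to 1.0,
                    # later ones decay towards 0
                    score += 1.0 - (i / n)
            return score

        page_a = "Cat food reviews and buying guides. " + "Filler text. " * 50
        page_b = "Filler text. " * 50 + "We also mention cat food once."

        print(position_weighted_score(page_a, "cat food"))  # higher
        print(position_weighted_score(page_b, "cat food"))  # lower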
     
    Camay123, Aug 28, 2008 IP
    microbrain likes this.
  3. magda

    magda Notable Member

    Messages:
    5,197
    Likes Received:
    315
    Best Answers:
    0
    Trophy Points:
    280
    #3
    I think the neighbouring pages issue is a question of evaluating whether this page on your site is a better match for the search query than that page, bearing in mind that a website covering a particular subject is likely to have keywords and phrases repeated throughout the site, particularly in text links etc.
     
    magda, Aug 28, 2008 IP
    microbrain likes this.
  4. lycos

    lycos Well-Known Member

    Messages:
    3,769
    Likes Received:
    176
    Best Answers:
    0
    Trophy Points:
    185
    #4
    I would say it means other pages from your site pointing to that page, kind of providing a reference in case visitors want to find out more info. However, I can't say much about the precise location. Maybe it analyzes how the keywords are spread and distributed around the article.
     
    lycos, Aug 28, 2008 IP
  5. googlefrog

    googlefrog Peon

    Messages:
    145
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #5
    200? Thought it was 100
     
    googlefrog, Aug 28, 2008 IP
  6. freelistfool

    freelistfool Peon

    Messages:
    1,801
    Likes Received:
    101
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Neighboring pages is about "themes". Google tries to determine the theme of your website by the co-occurrence of words on each page and the neighboring pages. For example, if your site uses the term "white house" google expects to also see the term "president" or "united states" either on the page or on a neighboring page. Having the related words clustered together in nearby pages strengthens the theme of your site and helps you rank for the term "white house".
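    A toy illustration of that (not Google's code; the related-word list and example pages are invented for the sketch):

        # Illustrative sketch: score how strongly a site is "themed" around a
        # query by counting related words that co-occur on the page or on
        # neighboring (same-site) pages. The related-word list is made up.
        RELATED = {"white house": {"president", "united states", "washington"}}

        def theme_score(pages, query):
            related = RELATED.get(query, set())
            site_text = " ".join(pages).lower()
            hits = sum(1 for term in related if term in site_text)
            return hits / len(related) if related else 0.0

        themed_site = [
            "The White House is the residence of the President.",
            "Contact the United States government in Washington.",
        ]
        off_topic_site = ["White house paint for sale.", "We ship nationwide."]

        print(theme_score(themed_site, "white house"))     # 1.0 - strong theme
        print(theme_score(off_topic_site, "white house"))  # 0.0 - no supporting terms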
     
    freelistfool, Aug 28, 2008 IP
    microbrain likes this.
  7. angilina

    angilina Notable Member

    Messages:
    7,824
    Likes Received:
    186
    Best Answers:
    0
    Trophy Points:
    260
    #7
    "What about one-page web sites?"

    I think one-page sites are not liked by Google. Google loves content, and one-page sites don't have much of it; they are usually just sales pages.
     
    angilina, Aug 28, 2008 IP
  8. lightlysalted

    lightlysalted Active Member

    Messages:
    2,067
    Likes Received:
    32
    Best Answers:
    0
    Trophy Points:
    90
    #8
    It's been known that Google has ignored keyword stuffing for years, but their formula seems to change regularly. The approach I adopt is to create webpages for visitors, not for search engines.
     
    lightlysalted, Aug 28, 2008 IP
  9. SEOibiza

    SEOibiza Peon

    Messages:
    1,197
    Likes Received:
    43
    Best Answers:
    0
    Trophy Points:
    0
    #9
    Yeah, neighbouring pages is about building themed groups and the internal linking from each page to the others.

    Internal linking is one of the really key factors in ranking these days IMO, more than ever; you can really make or break it with this.
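    A minimal sketch of why that matters: a simplified PageRank-style iteration over a made-up internal link graph (the page names, link structure and damping value are just for illustration, not anything Google has published):

        # Simplified PageRank-style iteration over a site's internal links.
        # Page names and link structure are invented for the example.
        links = {
            "home": ["cat-food", "dog-food", "about"],
            "cat-food": ["home", "cat-food-reviews"],
            "cat-food-reviews": ["cat-food"],
            "dog-food": ["home"],
            "about": ["home"],
        }

        def internal_pagerank(links, damping=0.85, iterations=50):
            pages = list(links)
            rank = {p: 1.0 / len(pages) for p in pages}
            for _ in range(iterations):
                new_rank = {p: (1 - damping) / len(pages) for p in pages}
                for page, outlinks in links.items():
                    share = rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += damping * share
                rank = new_rank
            return rank

        for page, score in sorted(internal_pagerank(links).items(), key=lambda kv: -kv[1]):
            print(page, round(score, 3))

    Pages the rest of the site points to most (here "home") end up with the largest share, which is why restructuring internal links can make or break a page.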
     
    SEOibiza, Aug 28, 2008 IP
  10. AngelaE8654

    AngelaE8654 Active Member

    Messages:
    935
    Likes Received:
    27
    Best Answers:
    0
    Trophy Points:
    85
    #10
    Exactly. When I read the thread title, the first thing that came to mind was "no one". Believe me, it's not only us trying to uncover the secrets of Google's algorithm; all of their competitors are, too.

    We will never really know what "Google likes" and what "Google doesn't like"; we can only look at what's happening with other sites and use that information the best way we know how. It's so much better to design our webpages for our visitors; word-of-mouth is one of the best advertising techniques "out there".
     
    AngelaE8654, Aug 28, 2008 IP
  11. simstar

    simstar Peon

    Messages:
    467
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #11
    I think it's best to design primarily for the search engines but make sure visitors are satisfied. The search engines will eventually rank pages accurately according to visitor satisfaction, but at the moment the algorithm is not entirely based on that. It's getting there, but I believe a site made for Google will still rank higher than a site made for visitors. Over time, adjust it to match Google's algorithm as it improves, until we reach the point where all sites on the internet are made for users/visitors.
     
    simstar, Aug 28, 2008 IP
  12. MRniceGuy007

    MRniceGuy007 Peon

    Messages:
    446
    Likes Received:
    6
    Best Answers:
    0
    Trophy Points:
    0
    #12
    200 is just an estimate; this press release they let out is already outdated.

    yes 300 is a movie
     
    MRniceGuy007, Aug 28, 2008 IP
  13. Lyn

    Lyn Peon

    Messages:
    80
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #13
    Thank you very much for all your answers. This is a really difficult topic.
     
    Lyn, Aug 28, 2008 IP
  14. yenerich

    yenerich Active Member

    Messages:
    697
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    75
    #14
    Neighbouring pages is about detecting whether a site is dedicated to the thing the user is searching for.

    Maybe the user searches for "mp3 players" and one site has this phrase in its text, but the site is not dedicated to software or players, etc. Maybe another site with the phrase on its first page has music players as its main subject, so this second site is probably more relevant.
     
    yenerich, Aug 28, 2008 IP
  15. AngelaE8654

    AngelaE8654 Active Member

    Messages:
    935
    Likes Received:
    27
    Best Answers:
    0
    Trophy Points:
    85
    #15

    Any time you get into the specifics of Google's algorithm, the answers you will get are educated guesses at best. No one really knows, except maybe Matt Cutts, and he's not talking.
     
    AngelaE8654, Aug 28, 2008 IP
  16. rekjl

    rekjl Peon

    Messages:
    142
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #16
    Yeah, try building your site specifically for the visitors, not the search engines. If the visitors like what they see and stay longer, Google and all the other search engines will notice that. It will help you a lot. Remember, they are in the search business. That means they want to give their visitors the best results possible. SEO ain't gonna do that! Your content will!
     
    rekjl, Aug 28, 2008 IP
  17. alemcherry

    alemcherry Guest

    Best Answers:
    0
    #17
    Getting a high ranking in the Google SERPs is not about understanding the complexities of the Google algorithm; it is all about doing the well-known SEO techniques right. If you can write high-quality content and get a lot of organic links pointing to your site, you will inevitably rank pretty well. Digging too deep into the minor things is not really helpful, IMHO.
     
    alemcherry, Aug 29, 2008 IP
  18. microbrain

    microbrain Banned

    Messages:
    1,079
    Likes Received:
    100
    Best Answers:
    0
    Trophy Points:
    0
    #18
    It's not a matter of like/dislike. Google likes all new and innovative things!
    If you have a site with hundreds of thousands of pages and the content is not that good, then it won't give any importance to that content.

    I've got a site with only one page, just 200 words of content on it, and it has PR4 with a position in the Google SERPs.
     
    microbrain, Aug 29, 2008 IP
  19. rehash

    rehash Well-Known Member

    Messages:
    1,502
    Likes Received:
    30
    Best Answers:
    0
    Trophy Points:
    150
    #19
    Google makes changes to the algo almost daily, so even if someone understands it now, a month later things can be different.
     
    rehash, Aug 29, 2008 IP
  20. simstar

    simstar Peon

    Messages:
    467
    Likes Received:
    4
    Best Answers:
    0
    Trophy Points:
    0
    #20
    That is true, although I believe the core structure would remain the same. The search results generally move about, but good sites stay at the top of the SERPs.
     
    simstar, Aug 29, 2008 IP