
Can you optimize for the behavioral search algo?

Discussion in 'Search Engine Optimization' started by Maximizationator, Feb 18, 2009.

  1. #1
    We know about standard on-page and off-page SEO.

    Google has started to personalize iGoogle search results - I'm not going to get into that right now. Instead, I'm going to explore "click-through ratio" & "bounce rate" as they may pertain to rankings.

    [Image: eye-tracking heat map of search engine results]

    This "heat map" is actually from an eye-tracking study, but it clearly shows that higher-ranking results will inherently receive more clicks.

    Assume this CTR distribution for example:

    Position 1 = 30% CTR (click through ratio)
    Pos 2 = 20% CTR
    Pos 3 = 7% CTR
    Pos 4 = 4% CTR
    Pos 5 = ... and so on.

    Google has an idea of what these CTR distributions should be for at least semi-popular search phrases. They've got loads of CTR data.

    Assume the following visual graph for the above-mentioned "normal" CTR distribution:

    Pos 1 = ----------------------
    Pos 2 = -------------
    Pos 3 = ------
    Pos 4 = ----
    Pos 5 = ---

    If, in reality, Google observes a CTR distribution that deviates from the norm like such...

    Pos 1 = ----------------------
    Pos 2 = -------------
    Pos 3 = ------
    Pos 4 = ----------
    Pos 5 = ---

    ... then the website in position 4 would be likely to move up to the 3rd spot, based on CTR behavior, as searchers have implied that its title & description are the most relevant (or enticing, or the best "call to action") for their search phrase.
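    To make the idea concrete, here is a minimal Python sketch of flagging that kind of CTR anomaly. It is purely hypothetical: the baseline CTRs, the 1.5x threshold, and the function names are my own illustration, not anything Google has confirmed.

```python
# Hypothetical baseline CTR by organic position (numbers from the example above).
BASELINE_CTR = {1: 0.30, 2: 0.20, 3: 0.07, 4: 0.04, 5: 0.03}

def ctr_anomalies(observed_ctr, baseline=BASELINE_CTR, threshold=1.5):
    """Return positions whose observed CTR exceeds the baseline by the
    given multiplicative threshold -- candidates for promotion."""
    flagged = []
    for pos, expected in baseline.items():
        actual = observed_ctr.get(pos, 0.0)
        if actual >= expected * threshold:
            flagged.append(pos)
    return flagged

# Position 4 pulls 10% CTR where ~4% is expected, as in the example above.
observed = {1: 0.30, 2: 0.20, 3: 0.07, 4: 0.10, 5: 0.03}
print(ctr_anomalies(observed))  # [4]
```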

    However, if position 4 gains those clicks using a misleading title or description (the website doesn't match the description, the website sucks, etc.), searchers are likely to hit the back button (contributing to site 4's "bounce rate"), and they may click on another result. Google can track that ultimate destination as well.

    Here is a hypothetical "normal" bounce rate distribution (searcher hits the back button):

    Pos 1 = --
    Pos 2 = ---
    Pos 3 = ----
    Pos 4 = -----
    Pos 5 = ------

    Let's say the actual bounce rate distribution for "generic keyword" looks like:

    Pos 1 = --
    Pos 2 = ---
    Pos 3 = ------------
    Pos 4 = ----
    Pos 5 = -----

    The website in position 3 must have gotten there through both on- and off-page SEO. But with an abnormally high bounce rate, it is likely to get demoted. It isn't "sticky".
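    The same idea works in reverse for bounce rate. Again a hypothetical sketch: the baseline bounce rates and the 2x threshold are invented for illustration.

```python
# Hypothetical "normal" bounce rate by position (mirrors the bars above).
BASELINE_BOUNCE = {1: 0.10, 2: 0.15, 3: 0.20, 4: 0.25, 5: 0.30}

def demotion_candidates(observed_bounce, baseline=BASELINE_BOUNCE, threshold=2.0):
    """Positions whose bounce rate is at least `threshold` times the norm
    for that position -- candidates for demotion (not "sticky")."""
    return [pos for pos, norm in baseline.items()
            if observed_bounce.get(pos, 0.0) >= norm * threshold]

# Position 3 bounces 60% of the time against a ~20% norm, as in the
# "generic keyword" example above.
print(demotion_candidates({1: 0.10, 2: 0.15, 3: 0.60, 4: 0.20, 5: 0.25}))  # [3]
```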

    Search behavior on pages 2 and beyond is likely to have different characteristics. By page 2, it is obvious that the searcher hasn't found what he or she is looking for and may be scanning results more thoroughly. Page 2 behavior may be weighted more heavily, as such searchers are no longer casually clicking. They're reviewing. Speculation on my part, of course...

    With behavioral data, or any data set, Google and other SEs must consider its reliability: Is there enough data? Can it be manipulated? Is the data noisy? Is it truly meaningful?

    Can you optimize for the behavioral search algo? The answer is, if it exists, yes.

    Even if behavior didn't affect rankings, you should strive for high CTR and low "bounce rate." The question is how?

    Please offer suggestions. I've got some in mind.
     
    Maximizationator, Feb 18, 2009 IP
    Valley likes this.
  2. vansterdam

    vansterdam Notable Member

    Messages:
    3,145
    Likes Received:
    120
    Best Answers:
    0
    Trophy Points:
    245
    #2
    Of course you can optimize for this. It just isn't the standard search engine optimization that most of us are used to.

    You need to write compelling page titles and meta descriptions that attract more clicks. If you are writing just to fit keywords in, your CTR could suffer. The same applies if your title and description are very similar to your competitors.

    You also need to improve your page content to entice users to click deeper into your website. Instead of putting everything on one page, you can spread content out over multiple pages. This way a user has to click to the next page to see more of what they are interested in. You can also lead users to related pages with good cross selling. An eye catching promotional graphic might drastically reduce bounce rates too.

    In an ideal scenario Google would be using this kind of info for ranking websites. I don't think they are quite there yet. Before they jump into something like that they will have to do a lot of testing to ensure that it is not easily manipulated by spammers.

    Really, Google may lose ad revenue if they push all of the most-clicked results to the top rankings. If people always find what they want in the first few results, they no longer need to click on the paid ads. Eventually people would just completely ignore the sponsored results.
     
    vansterdam, Feb 18, 2009 IP
  3. Valley

    Valley Peon

    Messages:
    1,820
    Likes Received:
    47
    Best Answers:
    0
    Trophy Points:
    0
    #3

    Fantastic post.
    The best I have seen all year.
    Very overlooked.
    Basically the F shape. I thought more would have clicked on #10, though, and I doubt this was a difficult keyword. If it was, there are normally about 6 sub-searches below position 10, so this pattern isn't quite accurate.
     
    Valley, Feb 18, 2009 IP
  4. vansterdam

    vansterdam Notable Member

    #4
    Don't give the OP credit for those heat map graphics. I saw them in an eye-tracking study a long time ago. Those graphics just represent where people's eyes are normally drawn - I think they just used a sample of 50 college students. They do not represent the distribution of clicks. Any website on page #1 for a high-volume keyword will get some clicks.
     
    vansterdam, Feb 18, 2009 IP
  5. Maximizationator

    Maximizationator Peon

    Messages:
    217
    Likes Received:
    10
    Best Answers:
    0
    Trophy Points:
    0
    #5
    If you've ever used Google AdWords, you would know about something Google calls "Quality Score." This is a number from 0 to 10 that characterizes the "quality" of your ad and landing page (which, for Google, ultimately translates to profitability). The lower your quality score, the higher your required CPC bid, and vice versa. When AdWords first came out, the only component of the Quality Score was CTR. Since then, Google has added a few more aspects like relevance, load time, bounce rate, and visit duration... but CTR is still the main component of "Quality Score." The simplified idea is to place ads with better overall CTR performance near the top so that Google can maximize their profit for every searched term, while giving advertisers an incentive to create quality ads & landing pages by lowering their CPC.

    Keyword Profit = Impressions * SUM[CTR(R) * CPC(R)], where R = ad rank
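    As a sanity check on that formula, here is the arithmetic in Python. The CTRs and CPCs below are invented for illustration; this just shows how the sum works, not AdWords' real figures.

```python
def keyword_profit(impressions, ctr_by_rank, cpc_by_rank):
    """Keyword Profit = Impressions * SUM[CTR(R) * CPC(R)] over ad ranks R."""
    return impressions * sum(ctr_by_rank[r] * cpc_by_rank[r]
                             for r in ctr_by_rank)

ctr = {1: 0.05, 2: 0.03, 3: 0.02}   # hypothetical ad CTR at each rank
cpc = {1: 1.50, 2: 1.20, 3: 1.00}   # hypothetical cost per click at each rank
profit = keyword_profit(10_000, ctr, cpc)   # roughly $1310 per 10k impressions
```

Since Google controls which ad sits at which rank, ordering ads by CTR(R) * CPC(R) rather than by bid alone is what maximizes this sum for a fixed number of impressions.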

    So I have strong reason to believe that Google applies a similar behavioral algorithm to their organic SERPs. If they are already providing personalized search results, of course they are utilizing behavioral data on a universal scale... and the argument that Google would return poor organic results so that more people click on ads is rather dubious. If their search quality were to suffer, people would migrate to a different search engine. They could... if an ad and an organic result were identical, but they won't. It's unethical.
     
    Maximizationator, Feb 18, 2009 IP
  6. vansterdam

    vansterdam Notable Member

    #6
    People would not migrate to another search engine because despite the imperfect results on Google, the results on other search engines are even worse.

    Maximizationator, you seem to have this idealistic view of Google. You seem to think that they do everything exactly how it would be done in a perfect world. Unfortunately, they simply are not that advanced. They are still constantly working on their algorithm to make it better. They are not using every piece of available data to rank websites. They also have to consider processing time and resources.
     
    vansterdam, Feb 18, 2009 IP
  7. Maximizationator

    Maximizationator Peon

    #7
    I can understand that. Google has limited resources, but based on the information we know they have, we can make strong guesses as to where their next advancements will be, and also where the others will follow or lead.

    Click tracking is an easy one though. Just look at Google Trends for websites. We KNOW they're tracking clicks.
     
    Maximizationator, Feb 18, 2009 IP