Strange stats from split testing ads

Discussion in 'Google AdWords' started by joebloggs, Feb 21, 2008.

  1. #1
    Hi,

    I've been testing ads in the following way.

    - I take a winning ad and test it against a new ad.

    - I make three copies of the old ad, just in case the new ad underperforms.

- I let enough impressions accumulate until I decide whether the new ad is better than the old one.

    - Below are the performance statistics for two IDENTICAL ads. Same title, same line 1, same line 2, same display URL, same landing page.

    I don't understand how the stats could be so different, considering the number of impressions and clicks.

    I find it a bit perplexing. Any ideas?

    Thanks


Ad 1 (Active): 0.2% | 72 clicks | 60,908 impr. | 0.11% CTR | £4.21 | 9.7% conv. rate | £0.61 | 7 conversions

    Ad 2 (Active): 0.2% | 71 clicks | 61,549 impr. | 0.11% CTR | £4.25 | 15.5% conv. rate | £0.39 | 11 conversions
     
    joebloggs, Feb 21, 2008 IP
  2. toast_the_most

    toast_the_most Peon

    Messages:
    29
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    If the ads are identical then these stats don't tell you anything. It's just like having one ad with 143 clicks and splitting the clicks/rest of the data in half any which way you like. The data is never going to match. But there's no reason for it to be different - just coincidence.

    It's like tossing a coin 100 times and getting 70 heads, 30 tails. It's mathematically odd but it happens.
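
    The coincidence argument can be checked with a quick simulation: pool the two ads' figures from the stats above (143 clicks, 18 conversions in total) and split them at random many times, then see how often a purely random split looks at least as lopsided as 9.7% vs 15.5%. This is only an illustrative sketch, not anything from the thread:

    ```python
    import random

    random.seed(42)

    # Combined results of the two "identical" ads: 143 clicks, 18 conversions.
    clicks, conversions = 143, 18
    outcomes = [1] * conversions + [0] * (clicks - conversions)

    gap = 11 / 71 - 7 / 72  # the observed conversion-rate gap (~5.8 points)

    lopsided = 0
    trials = 10_000
    for _ in range(trials):
        random.shuffle(outcomes)
        a = outcomes[: clicks // 2]   # plays the role of one ad copy (71 clicks)
        b = outcomes[clicks // 2 :]   # the other copy (72 clicks)
        rate_a = sum(a) / len(a)
        rate_b = sum(b) / len(b)
        if abs(rate_a - rate_b) >= gap:
            lopsided += 1

    frac = lopsided / trials
    print(f"random splits at least as lopsided: {frac:.0%}")
    ```

    Something like a third of random splits turn out at least that uneven, which is the "just coincidence" point in numbers.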
     
    toast_the_most, Feb 21, 2008 IP
  3. joebloggs

    joebloggs Peon

    Messages:
    265
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    0
    #3
In this case, the difference in their performance is very large.

    There comes a point in your coin tossing analogy where the heads and tails will start to even out. Maybe I have to wait longer and see when the ads start to even out in terms of their performance.

    I might be able to use this as a yardstick for determining how many clicks I need before choosing a winning ad.
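
    As one possible yardstick of that kind, a standard two-proportion sample-size approximation (not something from this thread) estimates the clicks needed per ad. The z-values below assume a one-tailed test at 95% significance with 80% power, and the 9.7%/15.5% rates are the ones from the stats above:

    ```python
    from math import ceil

    def clicks_needed(p1, p2, z_alpha=1.645, z_beta=0.8416):
        """Approximate clicks per ad to reliably detect a conversion-rate
        difference of p1 vs p2 (one-tailed alpha = 0.05, 80% power).
        A textbook two-proportion formula, used here as a rough sketch."""
        effect = (p1 - p2) ** 2
        variance = p1 * (1 - p1) + p2 * (1 - p2)
        return ceil((z_alpha + z_beta) ** 2 * variance / effect)

    print(clicks_needed(0.097, 0.155))
    ```

    By this estimate, telling 9.7% apart from 15.5% reliably takes roughly 400 clicks per ad, far more than the ~70 collected so far.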
     
    joebloggs, Feb 21, 2008 IP
  4. CustardMite

    CustardMite Peon

    Messages:
    1,138
    Likes Received:
    33
    Best Answers:
    0
    Trophy Points:
    0
    #4
    CustardMite, Feb 21, 2008 IP
  5. joebloggs

    joebloggs Peon

    Messages:
    265
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Hi Custard Mite,

    Would you be good enough to run me through (quickly) how you came to the conclusion that the data is thus far statistically insignificant?

I've read your article; I'm just struggling to apply your thinking to my problem.

    Thanks.

    Joe
     
    joebloggs, Feb 21, 2008 IP
  6. joebloggs

    joebloggs Peon

    Messages:
    265
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    0
    #6
There is a roughly 60% relative difference in the conversion rates (9.7% vs 15.5%) with about 70 clicks on each ad. I'm just not getting it...
     
    joebloggs, Feb 21, 2008 IP
  7. CustardMite

    CustardMite Peon

    Messages:
    1,138
    Likes Received:
    33
    Best Answers:
    0
    Trophy Points:
    0
    #7
    The easiest thing to use is this:

    http://www.splittester.com/

Since you're testing conversion rates, read 'conversions' instead of 'clicks' and 'conversion rate' instead of 'CTR'.

    Here, the significance of 7/72 vs 11/71 is approx 85%, which is lower than the 90% or 95% levels that most people use.

    But your test is pointless anyway - I can understand running 3 identical adverts to ensure that 75% of your traffic sees the same advert, to reduce the risk of a bad advert, but the results have to be added together to compare them to another advert.
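
    For anyone who wants the arithmetic behind the ~85% figure rather than the calculator: a one-tailed two-proportion z-test (which, I assume, is what splittester.com implements; its exact method isn't published here) reproduces it from the thread's numbers:

    ```python
    from math import erf, sqrt

    def significance(conv_a, clicks_a, conv_b, clicks_b):
        """One-tailed two-proportion z-test on conversion rates.
        Returns the confidence that the two rates really differ."""
        p_a = conv_a / clicks_a
        p_b = conv_b / clicks_b
        pooled = (conv_a + conv_b) / (clicks_a + clicks_b)
        se = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
        z = abs(p_a - p_b) / se
        # Standard normal CDF evaluated at z, via the error function.
        return 0.5 * (1 + erf(z / sqrt(2)))

    # The two "identical" ads from the thread: 7/72 vs 11/71 conversions.
    conf = significance(7, 72, 11, 71)
    print(f"confidence: {conf:.1%}")  # roughly the 85% quoted above
    ```

    At ~85%, the result falls short of the usual 90% or 95% thresholds, which is why the difference isn't yet significant.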
     
    CustardMite, Feb 21, 2008 IP
  8. joebloggs

    joebloggs Peon

    Messages:
    265
    Likes Received:
    15
    Best Answers:
    0
    Trophy Points:
    0
    #8
    Right, thanks for the tips.

So I have to combine (pool) the figures for the duplicate ads before I draw any conclusions about their performance compared to another ad format.

    I'll use the tool you suggest.

    Thanks.
     
    joebloggs, Feb 21, 2008 IP