Hi, I've been testing ads in the following way:

- I take a winning ad and test it against a new ad.
- I make three copies of the old ad, just in case the new ad underperforms.
- I let enough impressions accumulate until I decide whether the new ad is better than the old one.

Below are the performance statistics for two IDENTICAL ads: same title, same line 1, same line 2, same display URL, same landing page. I don't understand how the stats could be so different, considering the number of impressions and clicks. I find it a bit perplexing. Any ideas? Thanks

Ad 1: Active, 0.2%, 72 clicks, 60,908 impressions, 0.11% CTR, £4.21 cost, 9.7% conv. rate, £0.61 cost/conv., 7 conversions
Ad 2: Active, 0.2%, 71 clicks, 61,549 impressions, 0.11% CTR, £4.25 cost, 15.5% conv. rate, £0.39 cost/conv., 11 conversions
If the ads are identical then these stats don't tell you anything. It's just like taking one ad with 143 clicks and splitting the clicks, and the rest of the data, in half any which way you like: the two halves are never going to match. But there's no reason for them to be different; it's just coincidence. It's like tossing a coin 100 times and getting 70 heads and 30 tails: mathematically odd, but it happens.
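To see how big the gap can get by chance alone, here's a minimal sketch (my own illustration, not from the thread): pool the two ads' stats above into 143 clicks and 18 conversions, then repeatedly re-split the clicks between two fictional ads at random.

```python
import random

# Pooled figures from the thread: 72 + 71 clicks, 7 + 11 conversions.
CLICKS, CONVERSIONS = 143, 18

# One entry per click; 1 means that click converted.
clicks = [1] * CONVERSIONS + [0] * (CLICKS - CONVERSIONS)

for trial in range(5):
    random.shuffle(clicks)
    a, b = clicks[: CLICKS // 2], clicks[CLICKS // 2:]
    print(f"trial {trial}: "
          f"ad A {sum(a) / len(a):.1%} vs ad B {sum(b) / len(b):.1%}")
```

Splits roughly as lopsided as the 9.7% vs 15.5% in the thread turn up in a fair share of runs, which is the point: identical ads can drift that far apart on this little data.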
In this case the difference between their performance is very large, though. There comes a point in your coin-tossing analogy where the heads and tails start to even out. Maybe I have to wait longer and see when the ads even out in terms of their performance. I might be able to use that as a yardstick for determining how many clicks I need before choosing a winning ad.
The clickthrough rate is virtually identical, and the difference in the conversion rate isn't statistically significant. You should probably use a significance tester to check these things... Of course, whether you should wait for a result to be statistically significant is another question entirely: http://www.epiphanysolutions.co.uk/...g-is-statistical-significance-over-rated.html
Hi Custard Mite, would you be good enough to run me through (quickly) how you came to the conclusion that the data is thus far statistically insignificant? I've read your article; I'm just struggling to apply your thinking to my problem. Thanks, Joe
The easiest thing to use is this: http://www.splittester.com/ Since you're testing conversion rates, read 'conversions' in place of 'clicks' and 'conversion rate' in place of 'CTR'. Here, the significance of 7/72 vs 11/71 is approximately 85%, which is below the 90% or 95% levels that most people use. But your test is pointless anyway: I can understand running three identical adverts to ensure that 75% of your traffic sees the same advert, to reduce the risk of a bad advert, but their results have to be added together to compare them with another advert.
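splittester.com doesn't publish its exact formula, so as a sanity check here's a plain two-proportion z-test (an assumption on my part, not necessarily what the site computes); it lands on roughly the same ~85% figure for 7/72 vs 11/71.

```python
from math import erf, sqrt

def confidence(conv_a, clicks_a, conv_b, clicks_b):
    """One-tailed confidence, from a two-proportion z-test, that the
    underlying conversion rates really differ."""
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    pooled = (conv_a + conv_b) / (clicks_a + clicks_b)
    se = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
    z = abs(p_a - p_b) / se
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

print(f"{confidence(7, 72, 11, 71):.0%}")  # roughly 85%: below the usual 90-95% bar
```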
Right, thanks for the tips. So I have to pool the stats for the duplicate ads (add up their clicks and conversions, rather than averaging the rates) before drawing any conclusions about their performance compared with another ad format. I'll use the tool you suggested. Thanks.
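For completeness, the pooling step looks something like this (a sketch; the third duplicate's numbers are made up, since only two rows appear above):

```python
# (clicks, conversions) per duplicate ad; the third pair is hypothetical.
duplicates = [(72, 7), (71, 11), (69, 9)]

pooled_clicks = sum(clicks for clicks, _ in duplicates)
pooled_convs = sum(convs for _, convs in duplicates)
print(f"control ads combined: {pooled_convs}/{pooled_clicks} "
      f"= {pooled_convs / pooled_clicks:.1%}")

# Compare (pooled_convs, pooled_clicks) against the challenger ad's stats
# with the significance test above before declaring a winner.
```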