I realize a lot of factors go into how search engines decide search results. Is there any chance they'd factor in which results actually receive a click? It would have to be reverse-weighted, otherwise it would be tough to unseat the top three spots:

1) weight value = 1
2) weight value = 10
3) weight value = 100

And so on… Or would this be too easy to scam?
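Just to make the reverse-weighting idea concrete, here's a toy sketch (the function names and the tenfold multiplier are made up for illustration; no engine is known to work this way):

```python
def click_weight(position: int) -> int:
    """Weight grows tenfold per SERP position: 1 -> 1, 2 -> 10, 3 -> 100, ..."""
    return 10 ** (position - 1)

def score_clicks(clicks_by_position: dict[int, int]) -> int:
    """Sum weighted clicks for one page across the positions it appeared at."""
    return sum(count * click_weight(pos) for pos, count in clicks_by_position.items())

# Under this scheme, a handful of clicks at position 4 outweighs
# hundreds of clicks at position 1:
low_ranked = score_clicks({4: 5})    # 5 * 1000 = 5000
top_ranked = score_clicks({1: 500})  # 500 * 1 = 500
```

Which also shows the scam problem: a bot clicking your low-ranked listing a few times is worth more than all the organic traffic the top spot gets.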
Is it possible? Sure. Is it all that likely? Not really. How would this increase relevancy? It really wouldn't: what motivates users to click is not the same as what makes them content with the results, so misleading titles would make it far too easy to snare extra clicks and extra traffic. The other reason it's not terribly likely is the one you pointed out: it would simply be too easy to spoof clicks and inflate your rankings. That said, you should still experiment with your pages to see what gets the best click-through rate (while remaining relevant; otherwise you'll just get traffic that clicks, gets frustrated, and leaves). It may not boost your rankings, but it will help you get the most out of them.
Google could measure clicks, and I'm sure on-page time as well:

Under 10 seconds = Google knows it's not worth much
10 to 30 seconds = could be good
30 seconds to 1 minute = getting better
1 to 5 minutes = hot
5+ minutes = red hot and on target, super relevant… or they went on a break
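The buckets above could be sketched as a simple classifier (the thresholds and labels come straight from the post; the function itself is purely hypothetical):

```python
def dwell_quality(seconds: float) -> str:
    """Map time-on-page to the rough quality buckets described above."""
    if seconds < 10:
        return "not worth much"
    if seconds < 30:
        return "could be good"
    if seconds < 60:
        return "getting better"
    if seconds < 300:
        return "hot"
    # Can't tell "super relevant" apart from "left the tab open and went on break"
    return "red hot (or they went on break)"
```

The last branch is the catch: long dwell time is ambiguous, which is one reason a real engine couldn't lean on it naively.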
Many search engines are considering this to some extent. The concept is known as click popularity (or hit popularity): the idea is that the SERPs are influenced by the number of clicks a given page gets for a query. If you are using Google Sitemaps, you will see that Google shows the number of times your site appeared in the SERPs for particular search terms, along with the number of times your link was actually clicked. This might be an indication that Google is giving some weight to click popularity. Your website title should be relevant enough to get people to click on it. Obviously, though, this technique can only matter for results on the first page (and maybe, to some extent, the second and third pages).
I think it would be good for engines to look at click-through rate and visit length. They could also take into account whether the user ran the same (or a very similar) search again or stopped looking. There would be a danger of abuse, as with any ranking system.
Actually, that's not true… there is Google Analytics. Another issue: time on site is not going to be a determining factor in rankings.
I agree with cashuser. I think it would be a great tool for someone to come up with a user search pattern. But don't you think this would run against the search engines' business model if it revealed a lot of failures to find information?