How can you find a post when there are no links to it? Just because Google doesn't show any links, that doesn't mean there are none! It's a well-known fact that Google only displays a subset of links. If you got there via a link, the page is likely getting PR from links.
They do, but that has nothing to do with PageRank. Their patent filings suggest they do use that, and probably traffic from SERP clickthroughs, for ranking. But despite tests where people tried to boost a forgotten page's ranking by traffic alone, I've seen no hard evidence suggesting traffic is a major part of the ranking algorithm.
I doubt traffic is a factor in rankings, otherwise the rich would get richer and the poor would get poorer.
That's why it can't be a major factor. But when all else is equal, it would make sense to assume the one with higher traffic levels is more relevant. If it gets more interest, it is likely to be a better resource.
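Just to illustrate the tiebreaker idea, here's a toy Python sketch with made-up numbers: traffic only matters when everything else is equal, so a high-traffic but less relevant site still can't jump ahead.

# Toy example: traffic only breaks ties between otherwise-equal results.
results = [
    {"url": "site-a.example", "relevance": 0.9, "traffic": 1200},
    {"url": "site-b.example", "relevance": 0.9, "traffic": 4800},
    {"url": "site-c.example", "relevance": 0.7, "traffic": 9000},
]
# Sort primarily by relevance; traffic is only consulted when relevance ties.
results.sort(key=lambda r: (r["relevance"], r["traffic"]), reverse=True)
print([r["url"] for r in results])
# -> site-b before site-a (tie broken by traffic), site-c last despite huge traffic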
This thread really has got me thinking. At first I was convinced Google might monitor clicks as part of their PR calculations, but the argument that this makes bigger sites bigger and smaller ones smaller has huge merit! Let's break it down: how can Google monitor how much traffic a site gets?

1. Clicks in their search engine.
2. Monitoring where people go with the Google Toolbar.
3. Google Analytics.

I can't really think of any others. Item 1 can certainly be abused, as people can write a good blurb for their site simply to get visitors. A visitor might then click and not find the information they're after. I don't see much downside to item 2. Merge this with the fact that the Beta Toolbar can now store your bookmarks and it becomes a very powerful metric. Item 3 needs no explanation.
Do realize PageRank != ranking of a page!!! They're not the same, so be sure it's clear which one you are talking about in these discussions.
IMO, there is no way traffic is part of the PR algo. I have a PR4 (went from 0 to 4 this update) with very low traffic.
Calculating PageRank as it is, with all its iterations, is complicated enough. Adding traffic stats to it would make it a computational nightmare.
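For anyone who hasn't seen it, here's roughly what those iterations look like as a toy Python sketch. The graph, the 0.85 damping factor and the stopping threshold are just the textbook defaults, not Google's actual setup. Now imagine folding per-page traffic stats into every pass of this over billions of pages.

# Rough sketch of the iterative PageRank computation (power iteration).
# Purely illustrative: the example graph, damping factor and convergence
# threshold are textbook defaults, not Google's real configuration.
def pagerank(links, damping=0.85, tol=1e-6, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(max_iter):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                share = damping * rank[page] / n
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        # stop once the ranks barely change between iterations
        if max(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            return new_rank
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(web))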
I haven't seen hard evidence either. But with an algorithm with "thousands" of variables it's almost impossible to single out each individual variable for testing. For example, any one of those variables could be a multiplier (e.g. known reputation of a domain owner multiplied by the number of years the domain has been owned by that person), and anything multiplied by 0 is 0. Across thousands of variables any number of 0s could be cancelling out other variables and oh God my head hurts...
Yeah, I have thought about that before... I figure they have to give a 0.0001 (or some equivalent measure) to various parts of the algo by default... screw up badly enough and something goes to 0, and that piece of the algo is out of the computations for a particular site.
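Something like this, and I stress it's pure speculation on my part: the signal names and the 0.0001 floor below are invented just to show how one zero would otherwise wipe out every other factor.

# Speculative sketch: if ranking signals were multiplied together, a single
# zero would nuke the whole score unless each signal gets a small default floor.
# Signal names and values are invented for illustration only.
EPSILON = 0.0001

signals = {
    "owner_reputation": 0.0,   # unknown owner: would zero out the whole score
    "domain_age_years": 4.0,
    "link_score": 2.5,
}

def multiplicative_score(signals, floor=EPSILON):
    score = 1.0
    for name, value in signals.items():
        # the floor keeps one bad signal from cancelling out all the others
        score *= max(value, floor)
    return score

print(multiplicative_score(signals))           # tiny, but not zero
print(multiplicative_score(signals, floor=0))  # exactly 0.0 -- the head-hurting case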
That, plus Google Analytics, AdSense, and the 'Google' image on sites with a Google Search box, will all provide traffic info to G.
Google will certainly be taking traffic levels into account in their algorithm, but probably only using search results as a source. If spamsite.com is 3rd in the results and legitimatesite.com is 5th but gets more clicks, then this is a clear indication to Google that the site in 5th should be placed higher in the results. Search results are supposed to show exactly what people want; this is Google's definition of a 'perfect search engine'. If people don't click on the number one result then it shouldn't be number 1. Clicks on the search results are an excellent spam-spotting tool. Humans can spot spam better than any search engine.
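A crude sketch of how that click feedback could work, completely hypothetical on my part: the expected click-through rates per position below are made up, and in reality this would be one factor among many rather than the whole ranking.

# Hypothetical sketch of using search-result clicks as a quality signal:
# a result that earns far more clicks than its position would predict gets
# nudged up, one that earns far fewer gets nudged down.
EXPECTED_CTR = {1: 0.30, 2: 0.18, 3: 0.12, 4: 0.08, 5: 0.06}  # invented figures

observed = [
    {"url": "spamsite.com",       "position": 3, "impressions": 1000, "clicks": 40},
    {"url": "legitimatesite.com", "position": 5, "impressions": 1000, "clicks": 110},
]

for result in observed:
    ctr = result["clicks"] / result["impressions"]
    expected = EXPECTED_CTR[result["position"]]
    # ratio > 1 means the result attracts more clicks than its slot predicts
    result["click_boost"] = ctr / expected

# Re-order by the click-based signal alone, just to show the effect.
observed.sort(key=lambda r: r["click_boost"], reverse=True)
for r in observed:
    print(r["url"], round(r["click_boost"], 2))
# -> legitimatesite.com ends up ahead of spamsite.com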
I think you are right, mad4. Maybe traffic will be used in a similar way to PR. PR was conceived to estimate a page's 'importance', and the traffic a page gets sounds like another good way to gauge a page's importance. If I were designing a search engine, I would perform the keyword matching first and then sequence the list by 'importance' (which might be a combination of PR and traffic).
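A toy version of that two-stage idea, just to show the shape of it: the 0.7/0.3 weights and the data are invented, not anything Google has published.

# Sketch of the two-stage idea: filter by keyword match first, then order
# the matches by an "importance" blend of PageRank and traffic.
pages = [
    {"url": "a.example", "text": "green widgets for sale", "pagerank": 0.8, "traffic": 0.2},
    {"url": "b.example", "text": "widgets and gadgets",    "pagerank": 0.3, "traffic": 0.9},
    {"url": "c.example", "text": "unrelated hobby blog",   "pagerank": 0.9, "traffic": 0.9},
]

def search(query, pages, pr_weight=0.7, traffic_weight=0.3):
    # Stage 1: keyword matching decides what is eligible at all
    matches = [p for p in pages if query.lower() in p["text"].lower()]
    # Stage 2: sequence the matches by combined "importance"
    matches.sort(key=lambda p: pr_weight * p["pagerank"] + traffic_weight * p["traffic"],
                 reverse=True)
    return [p["url"] for p in matches]

print(search("widgets", pages))  # c.example never appears: it failed stage 1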