Does anyone know if the number of times content is used on the web affects how Google sees it? For example, if you grab a Wikipedia article on Elvis, it might be on 40,000 other sites. But what if the content is only on two other sites, or eight other sites? Would this content do better?
It is not better. Since the Panda update, Google is always looking for unique content. Whether the article is on 40,000 sites or on 2, the effect on your site will be the same.
But if I understand it correctly, Google is not delisting these sites, just pushing them down. For example, Google Books and archive.org can both have the same obscure book. When you search for specific keywords from that book, they both show up on the first page, with Google Books first of course. But with 40,000 results, the archive.org copy can be many, many pages down.

I also see this on sites I have set up to help friends find information, where I don't care how they rank on Google. One site has 6,323 Christian public domain books and articles. Nothing at all about it is unique; everything is duplicate and there is basically zero unique content. Yet Panda did not affect it at all, and it still gets a few thousand unique visitors a month from people around the world. And from looking at the keywords, it seems most of the traffic goes to content that exists in fewer copies on the internet.