Does anyone know how Google judges content from site to site? I see quite a bit of talk about unique content, but how do Google and other search engines go about comparing articles across sites? If I publish an article on someone else's site and then use the same article on my own site, does that drop my ranking for that content?
Nobody knows! We can only guess, and if someone tells you otherwise, they're lying. Google is good at keeping its techniques secret. Does duplicate content hurt your ranking? In my experience, yes. What you need to do is publish the article on your own site first. Then, after Google has indexed it for a while, let others republish it. That way Google will recognize you as the legitimate owner of the article and give more weight to your copy (and less to the others). Some suggest that a 40% difference is enough for content to be considered unique, but nobody knows for sure. It's a secret Google keeps well.
It is definitely up for debate. I have heard the 40% difference figure as well, but I have also seen sites that take complete verbatim snippets of articles (with a link, of course) and still have success. As SolomonZhang noted, if you have an original article, post it on your site FIRST, let it get crawled, and then publish it (or a rehashed version of it) on other relevant sites. If you have unique content and update it regularly, an RSS feed would be a good idea too.
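For what it's worth, nobody outside Google knows what similarity metric (if any) they use, but to give that "40% difference" rule of thumb some shape, here is a rough sketch of how you could compare two versions of an article yourself with Python's standard library. This is purely illustrative and not Google's method:

    # Rough illustration only: nobody outside Google knows what similarity
    # metric (if any) they apply. This just shows one way to estimate how
    # "different" two pieces of text are, using Python's standard library.
    import difflib

    def similarity(text_a: str, text_b: str) -> float:
        """Return a 0.0-1.0 similarity score based on word sequences."""
        words_a = text_a.lower().split()
        words_b = text_b.lower().split()
        return difflib.SequenceMatcher(None, words_a, words_b).ratio()

    original = "Ten tips for writing unique articles that search engines love."
    rewrite = "Ten tips for producing original articles that readers and search engines love."

    score = similarity(original, rewrite)
    print(f"Similarity: {score:.0%}")  # -> Similarity: 73%

A light rewording like the one above still comes out roughly 73% similar (only about 27% different), so by that unverified 40% rule of thumb it would likely still be treated as duplicate content.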
Maybe we're talking about different things. With regard to duplicate content, you're right that who has it "first" doesn't matter; I'm not suggesting that the age of the document takes precedence in this context. What I'm saying is that links pointing to an original source will generally give that source more link strength, so johngodfrey's idea of publishing elsewhere first and then duplicating the article on his own site might not be the best way to go about it.
If age isn't a factor, how does the original source matter? I thought the original document was determined by the source where it was first spidered.