I know that duplicate content within your own website is bad for SEO, but what about content duplicated across other websites? Currently my website is about games, videos and images for entertainment purposes. Of course there are other websites with similar content. If I take something from another website, I place a source link back to them. You'll often see websites with the same articles. How does Google know which one is the original? Can this lower your PR? Thanks
Google tries to determine which article is more relevant and will show only that one in the SERPs. It shouldn't affect your PR but will definitely affect your SERPs.
Google knows which one it found first - that's got to be a factor! It also comes down to the overall authority of your site.
Duplicate content pulled from other sites is worse than duplicate content on your own site. Google does know which version was available first, and no matter how many backlinks you get, they still know. You will do much better by rewriting the information you are scraping, though I know that can be an overwhelming task.
Duplicate is duplicate. You may put some courtesy links on your page, but Google first works out which page was indexed first and then decides which one is the newer copy, and of course the newer copy will be penalized. So avoid duplicate content any way you can. If you need to use similar material, rewrite it a little.
@HomeComputerGames ...what?! If you have duplicate content on your own site you will be hit hard by Google and it will filter out your duplicate pages. Duplicate content that also exists on other sites will not be penalised at all... just look at the plethora of Web 2.0 sites with duplicate content ranking on page 1.
You need to write original content. Of course it's not possible to write articles in huge volumes without drawing on other sources, but you need to fully understand the concept and write it in your own words, otherwise Google can detect the duplicate content and penalize you for it.