I know that duplicate content is a negative ranking factor. I wanted to know how much duplication is acceptable, as one of my client's sites (both dynamic and static pages, when checked for duplicate content) has duplicated content. I am not aware whether their content writers copied it or others copied from them. So what percentage is acceptable/OK? Making the content original is a long-term process (because of the many pages involved), so I'm looking to work on the most duplicated pages at least before the coming PR update. Requesting experts to comment. Regards.
I would keep the site as original as possible. You want to avoid that duplicate content filter like the plague.
Take a look at Answers.com. They appear to be able to avoid the duplicate content penalty much better than I would believe possible.
In my opinion, even 2 paragraphs of copied content could trigger a duplicate content penalty on Google. I am not sure about the exact type of penalty, but from what I have seen, pages with duplicate content are downgraded and pushed down in the rankings. If there is too much copied content (80-100%), the page could also get a PR0 penalty. Just my thoughts, not sure.
It's hard to imagine that Google could even notice small amounts of duplication. Yes, everybody seems to warn against this, so there must be some truth to it, but consider the difficulty Google's computers would have finding the duplication. As to the "two paragraphs" claim, I doubt it - if it were true, nobody would ever quote from another person's pages for fear of being flagged. Certainly, if your whole site is just a mirror image of something else, that would be cause for concern. But websites like WebProNews routinely pick up copyleft content from other sites, republish it verbatim, and certainly don't seem to suffer any ill effects from Google. They used to pick up quite a few of my articles until I stopped allowing indiscriminate copying (I did that because some sites were taking hundreds and hundreds of articles - too bad I had to cut everyone off because of a few jerks abusing my generosity). So while everyone says dupes are bad, I'm not sure they actually are.
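On the question of whether finding duplication at scale is even feasible: a common textbook technique is to break each page into overlapping word "shingles" and compare the sets with Jaccard similarity. This is a minimal illustrative sketch of that idea, not a claim about what Google actually does - the shingle size and the sample texts here are my own assumptions.

```python
# Illustrative near-duplicate detection via word shingles + Jaccard similarity.
# Shingle size (n=4) and example pages are arbitrary choices for the sketch.

def shingles(text, n=4):
    """Return the set of overlapping n-word sequences in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Similarity of two texts: shared shingles / total distinct shingles."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "the quick brown fox jumps over the lazy dog near the river bank"
page_b = "the quick brown fox jumps over the lazy dog by the old mill"

# A higher score means more overlap, so more likely to be treated as a dupe.
print(f"similarity: {jaccard(page_a, page_b):.2f}")
```

In practice, large-scale systems don't compare every pair of pages directly; they hash the shingles (e.g. MinHash) so near-duplicates can be found efficiently, which is why detecting duplication across billions of pages is more tractable than it first appears.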
I am currently working with a site that has been hit by the duplicate content filter. I know duplicate content is bad, but do what you want. Create a dup site. Have fun.
I haven't seen duplicate content really hurt a site, except that G ignores the duplicated pages - which basically means the duplicate content was a waste of time.
The filter seems to be similar to the sandbox. The site just doesn't rank for any big traffic search terms.
You want to have as much original content as possible and really try not to duplicate things. Google really does penalize duplicate content, as it looks like you are attempting to optimize using a duplicate. One suggestion regarding duplicate content is to switch it up as much as possible; however, Google's search engine is pretty damn smart, so it will most likely catch this. Try to stay original and avoid duplicates like the plague, because they will catch you.