Hi guys. How exactly does the duplicate content penalty apply? I have thousands of pages on my site, most of which can have specific filters applied to them (e.g. red, blue or yellow widgets; round, square or oval widgets; Greek, Syrian or Swiss widgets, etc. - all fictitious examples, of course), and sometimes these pages look identical because the same products in the database match several different combinations of filtered attributes. This obviously looks like duplicate content, but isn't actually. Another example: a search for a product returns one result, but there are options to sort results by age, weight, height, colour, etc. With only one result, every sort order produces the "same" content. How does one get around this sort of issue? All thoughts appreciated...
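One idea I've been toying with is to collapse every equivalent filter combination down to a single canonical URL, so all the identical-looking variants resolve to one address. Here's a rough Python sketch of that idea; the parameter names ("colour", "shape", "sort") and the example domain are invented, and it assumes the filters arrive as query-string parameters:

```python
# Minimal sketch: collapse equivalent filter URLs to one canonical
# address. Parameter names ("colour", "shape", "sort") are invented
# for illustration; substitute whatever your site actually uses.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameters that never change which products are shown, only how
# they are presented (sort order etc.), so they can be dropped.
NO_OP_PARAMS = {"sort", "order"}

def canonical_url(url: str) -> str:
    """Return a stable canonical form of a filtered listing URL."""
    parts = urlsplit(url)
    # Drop presentation-only parameters and sort the rest, so that
    # ?shape=round&colour=red and ?colour=red&shape=round collapse
    # to the same address.
    params = sorted(
        (k, v) for k, v in parse_qsl(parts.query)
        if k not in NO_OP_PARAMS
    )
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(params), ""))

print(canonical_url("http://example.com/widgets?shape=round&colour=red&sort=age"))
print(canonical_url("http://example.com/widgets?colour=red&shape=round"))
# Both print: http://example.com/widgets?colour=red&shape=round
```

Each variant page could then point at that canonical address (for example in a `<link rel="canonical">` tag), so the variants get folded together instead of competing with each other.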
There is no such thing as a duplicate content penalty. Your pages will still be found through their unique terms. For terms common to multiple pages, one page, possibly two, will show up, but not all of them.
Duplicate content is a serious offense in Google's eyes. Basically, your site will stop showing AdSense ads and will be dropped from the SERPs.
Zamc270, I've been wondering the same thing. The idea of a duplicate content penalty is confusing to me as well. Also, I'm wondering: is there such a thing as duplicate content within your own site? I mean, I have the same 100 words at the top of each web page on my site, but different content below... will I be penalized for duplicate content? And what about sites that quote non-copyrighted literature or song lyrics? Are they really being penalized for duplicate content? If so, even Wikipedia should be in the penalty box. I see the need for filtering duplicate content, but how is it determined by Google... and what, exactly, is the penalty?
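From what I've read, nobody outside Google knows exactly how they detect it, but a classic technique from the near-duplicate-detection literature is "shingling": compare the sets of overlapping word sequences that two pages share. Here's a toy sketch of the general idea; to be clear, this is an illustration from the literature, not Google's actual (unpublished) algorithm:

```python
# Toy illustration of near-duplicate detection via "shingling"
# (comparing sets of overlapping word sequences). This is a classic
# technique from the literature, NOT Google's actual algorithm,
# which has never been published.

def shingles(text: str, k: int = 4) -> set[str]:
    """All overlapping k-word sequences ("shingles") in the text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' shingle sets (0.0-1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "red round widgets from Greece sorted by age"
page_b = "red round widgets from Greece sorted by weight"
print(f"{similarity(page_a, page_b):.2f}")  # 0.67 -> near-duplicates
```

Pages whose shingle sets overlap heavily would presumably be clustered together, with only one representative shown in the results, which would square with the "filtering, not penalty" view above.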
I understand your point, mhmdkhamis, but I have a site that is 100% original and is the only site of its type that allows such flexible views of the data. There's no way I'm going to remove it just because Google penalises me for it; I can try to work around Google's short-sightedness, which is what I'm trying to figure out how to do. The site happens to be 15,000+ pages large (60,000+ if you count the 3 other languages I have translated each individual item page into), because I have been collecting the data electronically as a hobby for 20 years!
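For the translated pages specifically, one thing I've come across is hreflang alternate links, which are meant to tell search engines that the language versions are translations of each other rather than duplicates. A rough sketch of generating them; the domain, URL pattern and language codes here are all made up:

```python
# Rough sketch: emit hreflang alternate links so search engines
# treat the four language versions of an item page as translations,
# not duplicates. The domain, URL pattern and language codes are
# invented; substitute your own.
LANGS = ["en", "fr", "de", "el"]

def hreflang_links(item_slug: str) -> str:
    """HTML <link> tags to include in the <head> of every version."""
    tags = []
    for lang in LANGS:
        url = f"http://example.com/{lang}/items/{item_slug}"
        tags.append(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
    return "\n".join(tags)

print(hreflang_links("red-round-widget"))
```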
I have never understood the duplicate content issue. I have the same home listings on my site as every other agent in my area, yet they are not all being devalued, are they?