I recently set up an article site with around 25,000 articles. I assume the articles have already been used by many other websites, and I've noticed that the total number of pages indexed by Google has dropped from 500+ to 300+. Is this because of the duplicate content filter?
Yeah, that is exactly it. Basically the filter is there to prevent the SERPs from being clogged up with the same article (or a slightly different version of it). There is some dispute about what Google classes as duplicate content, i.e. the percentage of similar text that will be allowed through the filter. My philosophy is not to rewrite articles to make them seem like new articles. Instead I keep lots of unique articles on my site and do the SEO to make it an authoritative site. Google is not perfect, and it is possible to outrank the original author of an article with the very article they wrote - not fair, but a good strategy if you can pull it off. I'd advise against these massive article-dump style jobs - better to use articles much more selectively, in my opinion. Good luck, Notting
Get links back to the articles, even if they aren't original. That should pull them out of the supplemental index and give you some ranking.
Yes, links to these pages will help, but it won't be easy. Try to have unique META tags and dynamic content on these pages as well.
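For what it's worth, here is a minimal sketch of what "unique META tags per page" could look like, assuming each article record has a title and body field (the field names and function are hypothetical, just for illustration):

```python
import html

def build_meta_tags(article):
    # Assumes a dict with "title" and "body" keys (hypothetical structure).
    title = html.escape(article["title"])
    # Take roughly the first 155 characters of the article text as the
    # description, so every page gets its own snippet instead of a
    # site-wide boilerplate description.
    description = html.escape(article["body"][:155].strip())
    return (
        f"<title>{title}</title>\n"
        f'<meta name="description" content="{description}">'
    )

print(build_meta_tags({
    "title": "Example Article",
    "body": "Unique opening paragraph of the article goes here...",
}))
```

The idea is simply that the title and description are generated from the article itself rather than shared across all 25,000 pages.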
Maybe it is not an issue of Google's duplicate filter. For some of the articles it may just be a matter of time before they get indexed. Do you keep them all linked from the first few pages of your website?
This filter is used to weed out duplicate content, so people cannot flood Google with the same information over and over again.