In promoting one of our sites and producing unique articles for it, a question that has been raised is that of duplicate content and triggering the Google duplicate content filter. For example, if you are producing unique articles, posting them to your website, and then submitting them to article directories and press release sites, are you going to get hit by the duplicate content filter for having the same content on your site as on a variety of other sites? We also produce these articles as XML feeds that we syndicate. Is this a problem, or is the alternative simply to add some unique articles to your site (to build content) whilst submitting other unique articles to the article directories (obviously with links back to your site) to avoid any problems? Anyone got any experience of this?
I think it depends on which site is more of an authority. If your site has strong link pop, and gets spidered first, I think you can avoid a dup content penalty.
Thanks for the response guys.....can we explore this a little further? If the articles are indexed on my site first, does that automatically mean the unique content will be seen as mine, or does it depend on which site is seen as the most authoritative?
I don't think anyone knows for certain. My logic tells me that if they are found on your site first, you get credit regardless of your "authority" - whatever "authority" means. But I could be wrong. Perhaps you can test this out for us and let us know what happens.
I suppose the sure-fire method would be to keep articles unique to the site and unique for publication.......but then if people syndicate their content by RSS, are they getting penalised for that, or do RSS feeds trigger an 'exclusion' filter? One question seems to lead to another with this, so I guess I'll have to do some testing.