I have a question about discussions claiming that Google will penalize sites when it finds duplicate content on them. Does this always happen? And does it only apply to content copied 1:1 (templates, text, links etc.), or also to partially duplicated content (for instance, just some articles)? In some cases the second option doesn't seem logical, and here is an example.

Let's say I am an article writer and I have my own blog where I post my articles. I also like visiting several forums (as an ordinary user, not as their owner or admin), where I post the same articles. So I write an article, publish it on my blog, and a minute later paste it on a few forums to share it with their users. After my submission, Googlebot visits my blog, finds the article there and indexes it, and later visits the forums where I also published it and finds exactly the same text. What happens then? Will Google penalize the forums, because it may think someone has stolen my article? And what if Googlebot happens to crawl the forums first and my blog later? Will my blog be penalized?

Also, in the case of 1:1 copied content, what happens with sites that mirror each other (e.g. sites hosting the official PHP documentation, Linux docs etc.)? All these questions make me doubt that duplicate content always means a site will be penalized. Any opinions on this?
I think if you post your article on forums with a link back to your site, and the same article is also on your blog, then thanks to that backlink Google understands that this is your content.
I think Google starts by penalizing the page, not the site. I have some pages that lost their indexing, and I think it is because their content looks like a lot of other content on the web. The overall site still has many indexed pages.
Post it on your site first and make sure Google indexes it. Then post only a snippet of the article on the forums. If the forum is more trusted by Google, the article will rank higher on the forum than on your site. This happened to me before with Wikipedia: one of my clients' bios had been on his own site for a year before it appeared on Wikipedia, and within a few days of being posted there it ranked higher.
Pages with duplicate articles just won't be indexed by Google. Google doesn't care much about the template.
Google will not ban or penalize sites that contain duplicated content. However, if you use unique content on your site, you'll have a great advantage over competitors who don't. My experience tells me so. I currently have two sites that get some traffic right now: one uses unique content, the other uses duplicated content. Both are keyword-targeted (niche) sites, both earn me money, and both are indexed and ranked by Google. The only differences are these:

1. My unique site has fewer than 10 backlinks, while my duplicate site has over 300 backlinks.
2. My unique site ranks in the top 5 for my targeted keywords, while my duplicate site ranks around 20~30.
3. Alexa rank is 1,000,000 for my unique site and 6,000,000 for my duplicate site.
4. My unique site has PR0 while my duplicate site has PR3 (for sure!).
5. One site contains 15 pages of very unique articles; the other contains 100 pages of duplicated PLR articles.

It's fine to use duplicated content on your site. You just need to do a little more work to get it indexed and ranked.