I'm sorry to bring up a subject that has been done to death in this forum, but reading through some recent posts there is evidence that Google has changed its algorithm for duplicate content. I'm working on a new site with a professional designer and SEO bod, which is going to be an authority site on hypnosis. A key part of the site will be a database of articles. I was going to source articles initially from sites like www.goarticles.com, as well as soliciting original ones. But I'm worried now that using articles published elsewhere, even legitimately, will cause my new site to be penalised. What do people here think?
Google has stated in the past that articles and news do not count as duplicate content. Beyond that (like how they actually determine something is an article), I have no clue.
I have the same concern, but I think it has more to do with overall page content combined with the originality of the site. If you dup another page totally, including the layout and style, that's a problem. A small percentage of your pages having dup content, with the rest being original, is no problem. Think about newspaper sites, for example: most papers carry syndicated content, so the same articles are reprinted all over the place, which is duplicate content. But the pages are all formatted differently, and a good percentage of each site's pages are original. That's just one example, but you get the idea. Google only cares about wholesale copying, not a modest amount of legitimately syndicated content on an otherwise original site.
Maybe you can break up the articles and put your own snippets between paragraphs to make it bloody hard for them to be considered a duplicate.
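Here's a rough Python sketch of what I mean (the snippet text and file name are just placeholders, obviously):

# Interleave your own commentary between the paragraphs of a sourced
# article so the finished page differs from the original version.
snippets = [
    "Editor's note: we cover this in our beginner's guide to hypnosis.",
    "Related: see our own article on self-hypnosis scripts.",
]

def interleave(article_text, snippets):
    paragraphs = article_text.split("\n\n")
    out = []
    for i, para in enumerate(paragraphs):
        out.append(para)
        # Drop one of your own snippets in after every second paragraph.
        if i % 2 == 1 and snippets:
            out.append(snippets[(i // 2) % len(snippets)])
    return "\n\n".join(out)

print(interleave(open("article.txt").read(), snippets))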
I was considering something like that once for a couple of sites, but my plan was to include some sort of newsfeed in the articles. That would kinda alter them, and then (hopefully) they wouldn't be counted as duplicate material. Never got to try it, though.
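For anyone who wants to try the newsfeed idea, a rough sketch (this uses the third-party feedparser package, and the feed URL is just an example):

# Pull a few recent headlines from an RSS feed to mix into each
# article page, so the page content keeps changing over time.
import feedparser

def latest_headlines(feed_url, count=3):
    feed = feedparser.parse(feed_url)
    return [entry.title for entry in feed.entries[:count]]

for title in latest_headlines("https://news.example.com/rss"):
    print("Related news: " + title)

No idea whether that's enough to dodge a duplicate-content filter, mind you.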