Hello, I'm just looking for some thoughts on alternate possibilities for optimizing a couple of sites with duplicate content that Google will not rank. I know there may be the possibility of affiliate linking or pay-per-click advertising. Are there any other ways of optimizing duplicate content sites? Thanks in advance, Brad.
Best thing to do is just take down the duplicate content and replace it with real content. No matter how much optimizing and advertising you do, at the end of the day it's still a cut-and-paste website that no one will really be interested in (most of the time).
You can try adding reviews in your own words at the start and end of the pages to give them some uniqueness. Changing page titles and headlines might also help to some extent. A freelance writer might make the job easier for you.
I think Google will index both of them, but the older one will get the good position. Try in future to make them different.
It's pretty unpredictable which duplicate page gets dropped. I put 2 identical pages on 2 of my sites to test this. Yahoo and MSN showed one page whereas Google showed the other (the younger site).
What's the point in having dupe content? It screws with branding, can confuse users and works against everything you are trying to achieve. If you just want a crappy site that gets clicks, run it through a markov script.
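For anyone wondering what a "markov script" means here: it's a toy text generator that learns which word tends to follow which, then walks those transitions at random to spit out filler text. A minimal sketch in Python (illustrative only, not something I'd actually recommend publishing):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length=10):
    """Walk the chain randomly from a start word to produce pseudo-text."""
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: no word ever followed this one
        out.append(random.choice(followers))
    return " ".join(out)

sample = "the cat sat on the mat and the cat ran"
chain = build_chain(sample)
print(generate(chain, "the"))
```

The output is grammatically plausible garbage, which is exactly why it only makes sense for a throwaway click site, not a real one.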
Thanks for all the replies. The duplicate sites aren't actually my sites. They're sites run by a client of mine who insists on running sites with RSS feeds and Google AdSense ads, and he wants to look at ways of optimization since some pages are duplicate. Personally, the sites I build myself usually have all-original, unique content. Since this is the first time I've worked with duplicate content sites, I just wanted to get some advice from the pros on it. From what I can tell, it's definitely something I wouldn't do myself. I'm all for the unique. Thanks for the feedback!