Let's say I have a page that advertises a product. The page has good placement for major keywords, a PR 4, and good product conversion. What if I were to clone the page, i.e. put up an identical copy of the page on the same URL? The only difference between the original page and the cloned page would be that a few of the links would direct customers to a different place. Will I take some kind of a hit for duplicate content?

Reason I might want to do this: I would let my organic searches keep going as they are on the main page, but I want the same page, with links pointing to a different checkout page, for AdWords. That way, I can tell which sales are coming from AdWords and which ones are organic. But I don't want to do it if the duplicate content would jeopardize my position in the searches. Thoughts...? Thanks.
The position of your page with the original content will not suffer, but the page with duplicate content will most likely end up in the supplemental results. One possible fix would be to change that page by at least 30%.
On the same URL or on the same domain?? And is the reason just for tracking? If so, there are better methods for that than creating copies of your pages....
Yes. So, for example, the original page might be www.mypage.com/index.html and the second page (with duplicate content) would be www.mypage.com/index2.html.

Seems like the easiest way to me: change two links and re-upload the page with a different name. It would take less than 3 minutes. What could be easier than that?

I'd be interested to know about other methods, though. Preferably something that doesn't involve MySQL, PHP, or CGI (three areas I am not familiar with); I only know how to design static HTML sites. I do use Google Analytics, and it does tell me what % of traffic comes from ads and what % is organic... but I want to know what percent of BUYERS are from AdWords and what percent is from organic. Thanks.
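One copy-free option for a static HTML site, sketched under the assumption that you can tag the AdWords destination URL with a query parameter (the parameter name `source` and the value `adwords` below are made-up examples): a few lines of JavaScript on the page can then tell the two kinds of visitors apart.

```javascript
// Hypothetical sketch: the AdWords ad points at
//   http://www.mypage.com/index.html?source=adwords
// while organic visitors arrive with no parameter.
// getSource() extracts that parameter from a query string.
function getSource(search) {
  var match = /[?&]source=([^&]*)/.exec(search);
  return match ? decodeURIComponent(match[1]) : "organic";
}

// On the live page you would call it with the real query string:
//   var trafficSource = getSource(window.location.search);
// and, for example, swap in a different checkout link for AdWords visitors.
```

The page itself stays a single file, so there is no duplicate for Google to index.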
Thanks for the reply, belinskiy.... but what do you mean? Do you mean I should change 30% of the content of the duplicate page? Will that be enough?
No, it won't be enough. I think you need to change at least 40%. By 40% I also mean rewording, using synonyms, etc. You can check for dupes in Copyscape, though I'm not sure how it handles two identical pages on the same domain.
It is easy to track conversions from AdWords: use a "thank you" page after the payment/conversion is done. That way you know the total number of sales and the sales from AdWords, and it's easy to count the sales from natural traffic this way too. I am sure there is tracking software that does this as well, but this one is the first method I could think of right now. No need for duplicate pages.
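A minimal sketch of that two-thank-you-page idea, assuming file names like /thanks.html for the organic checkout and /thanks-adwords.html for the AdWords checkout (both names are made up): each sale then registers as a pageview of one of the two URLs, and the split is simple arithmetic.

```javascript
// Given pageview counts of the two (hypothetical) thank-you pages,
// return what percentage of buyers came from AdWords.
function adwordsShare(organicSales, adwordsSales) {
  var total = organicSales + adwordsSales;
  return total === 0 ? 0 : (adwordsSales / total) * 100;
}

// Example: 30 views of /thanks.html and 10 views of /thanks-adwords.html
// means 25% of buyers came through AdWords.
```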
This will be considered a "duplicate page" and both will suffer.

Suggestion: I use the following method to track both organic and PPC. Pass the function urchinTracker a string that gets recorded as a different "page" in Analytics:

<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-413178-2";
// call urchinTracker("Organic") on the organic page,
// and urchinTracker("PPC") on the AdWords landing page
urchinTracker("Organic");
</script>
It's been my experience that you will suffer from doing this. I think the poster who said the content had to be 40% different is about right.
You have no reason to take the risk! It is not so difficult to track your AdWords conversions. And why don't you want to change that page a bit? Change the word order, use synonyms... just keep the keywords...
I don't agree with the 40% change rule. I know you're using it as an example, but in my experience, when I copy content, even as little as 20%, my pages get flagged and suffer.

I was working on a WordPress blog and had written an article about a small city for my employer. I used various plagiarism tools in the hope that I could keep the page out of the supplemental index. Once my site was reindexed, it disappeared off the SERPs; I couldn't find it even when I did an exact search on the domain. I later blocked that page in the robots.txt file, and a week later my site was ranking again. It could have been a flux in Google's algo, but my thinking is it was flagged for duplicate content.
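For reference, a robots.txt exclusion like the one mentioned above would look something like this (the path is a made-up example; strictly speaking, robots.txt uses Disallow, while nofollow/noindex directives go in a meta tag on the page itself):

```
# robots.txt at the site root; the path below is a hypothetical example
User-agent: *
Disallow: /small-city-article.html
```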
Hmm, if Google didn't take this seriously, then I could just take the content of your site and make a duplicate site that I wouldn't even need to optimize, because you've already optimized it for me.
Well, this isn't something I did a whole lot of testing on, just something I noticed on a WP blog I was toying with. What took me by surprise was that I didn't take the information from a single source: I had been reading material on .edu and .gov domains, summarized the gist of the information, and written it in my own words. I even used Copyscape and ArticleChecker, which said I was only copying 6-8%. Now it may be that Google took a deeper interest because the information was on trusted domains... or it could have flagged my site as being a scraper site... I really don't know. I can only say that I'm now a lot more careful about duplicate content issues.
You completely misunderstood the supplemental pages and duplicate content concepts. One page, duplicate or not, won't take your site out of Google. And supplemental pages are more related to the internal PR of the pages than to duplicate content...