I work for a company that markets and manages the booking for numerous vacation rental companies. We recently started our own site that we are also syndicating vacation properties to. Here is my question: each property on our site has the same description that we syndicate across 35 different travel sites. Will our own site be dinged by search engines for duplicate content, since it's the same content we send out to everyone? I am thinking of re-writing the property descriptions to make them unique; however, we manage 1300 properties and it'll be a huge undertaking. Do you think re-writes will benefit my site?
I presume it would, because if you send the same description to several travel sites, it will not look good in the long run. The search engines don't know which copy is the original, so their bots and spiders may determine that it's duplicate content and maybe even mark it as spam. So I think re-writing would be an advantage, even if it takes time. Just my 2 cents.
Well, if it is your own site's genuine content, and your site gets indexed with that content first while the rest of the 35 get indexed later, or your site is popping up at the top of the results, then it's OK. Otherwise it's better to change it ASAP.
No, our site will get indexed last; it's too new and only PR2. Thanks for confirming my suspicions, although I am dreading the re-writes!
I would link back to your site in all the syndications... see how that affects dup. content. Maybe if all the sites you syndicate to have a link back, Googlebot will know what's up. Also, I believe Matt Cutts has a video about this issue... you will have to check the Google Webmaster Central YouTube channel for it. This really makes you wish there was a domain-to-domain canonical tag, huh? LOL
My site is www.abetterstay.com. I am having an issue getting all the pages indexed and was thinking it may be duplicate content. The Google Webmaster forum folks said I need to 301-redirect all pages at abetterstay.com to their www.abetterstay.com equivalents, though, so I thought I'd determine if this was a factor too.
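For what it's worth, the www vs. non-www fix that forum suggested is just host canonicalization: every request to the bare domain gets a 301 pointing to the same path on the www host. A minimal sketch of that mapping in Python (the function name is mine, and the host is taken from the post above; the actual redirect would live in the server config, this just illustrates the URL rewrite):

```python
from urllib.parse import urlsplit, urlunsplit

# Preferred host, per the Webmaster forum advice quoted above.
CANONICAL_HOST = "www.abetterstay.com"

def canonical_url(url: str) -> str:
    """Return the www version of a URL, i.e. the Location a
    301 redirect for the bare-domain request should point to."""
    parts = urlsplit(url)
    if parts.netloc == "abetterstay.com":
        parts = parts._replace(netloc=CANONICAL_HOST)
    return urlunsplit(parts)

print(canonical_url("http://abetterstay.com/orlando/villa-123"))
```

The point of consolidating on one host is that Google then sees a single URL per page instead of two duplicates splitting the link credit.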
It's OK, really: your website will not be penalized just because other websites are using your content. However, the other 35 websites will lose a lot of potential traffic because of their duplicate content, and because they are your affiliates, you are also losing potential clients and conversions. So the best scenario is to rewrite the descriptions for all of them. Here is an interview with Adam Lasnik from Google; he talks about duplicate content at the end of the interview: http://video.google.com/videoplay?docid=1830519162896248638. Good luck.
There are a number of reasons why pages don't show up in search engine results. One area where this is particularly true is when the content at more than one web address, or URL, appears to be substantially similar at each of the locations where the search engines see it. Some duplication of content may mean that pages are filtered at the time results are served, and there is no guarantee as to which version of a page will show in results and which versions won't. Duplication of content may also mean that some sites and some pages aren't indexed by search engines at all, or that a search engine crawling program will stop indexing all of the pages of a site because it finds too many copies of the same pages under different URLs.

There are a few different reasons why search engines dislike duplicate content. One is that they don't want to show the same pages in their search results. Another is that they don't want to spend the resources indexing pages that are substantially similar.
If all of the descriptions you send out have hyperlinks back to the page on your site where those same descriptions appear, then as HighRankingSEO said, Google can get strong clues as to who the originator of the content was. But if the other sites won't let you embed links to your site within the descriptions they display, then it's very possible... actually probable... that Google will get it wrong most of the time.

There is no duplicate content penalty per se... it just means the pages whose content is flagged as duplicate have a little harder time ranking than those with the original version of the content. Pages with 99% duplicate content can still rank #1 at Google. They just have to compensate by doing other things better, like getting extra backlinks, since the content on their page (and therefore all ranking factors that are based on page content) will have been devalued.

When a URL has a "real" Google penalty, it is absolutely IMPOSSIBLE for the URL to show anywhere on page 1. They typically can't get past page 4, page 7, page 36, page 96, etc., depending on the severity of the penalty. Google basically buries your URL in the SERPs until you comply with their guidelines. That is NOT the case for duplicate content. It's quite possible for duplicate content pages to outrank the original version of the page and even show on page 1, or even at position 1 on page 1. Perhaps this video on duplicate content will help, at about 2 min 50 secs into it... but the whole thing is good.
I think you should write original content for your website. Rewritten content can have some negative effects on your website during optimization. These effects will not be major, but the preference is still to write original content. Thanks
We have 1300 properties, each with its own unique property description; however, that unique property description is syndicated across 30+ other travel sites... all of which have higher PR than our newly launched site.
Rewrites will always benefit your site, and you should have done that to begin with. 1,300 is a lot, but you can work through it. I assume the descriptions aren't more than 300 words each.
I've heard from sources I think are reliable that the real duplicate content penalty applies when you have duplicate content within the same site, i.e., if you had 150 pages on your site with the same content... or, more realistically, if each of those pages had identical content for, say, the first 500 words, then unique content for the second 500 words.

The only issue that I'm aware of for duplicate content across multiple sites is that only one page will get highly ranked for a particular search term. So if you had a property description, say, for "123 mickey mouse road, orlando florida", and that description was syndicated to 1500 sites and was on its own page on each site, a search for "123 mickey mouse road, orlando florida" would likely only return one of those pages in the top results. But this wouldn't prevent any other pages on any of those 1500 sites from being highly ranked for other search terms. This is just my understanding based on stuff I've read; I can't claim to be an expert.