Hey, I just want to know whether Google actually ranks your pages lower for duplicate content. I don't mean the actual page with the duplicate content, I mean the site in general, like the homepage. Will it get ranked lower because of duplicate content on other pages of that site or not? The reason I ask is that I have found a site with loads of content I need for my dog site, and I would really just like to make a few changes before posting it. Will I lose my place on a search term that leads to my homepage by doing this? Or do you think just that page will not rank as high? Please tell me all you guys know. Thanks.
Morally, I'd say it was wrong. I've no experience of this but have read that Google will favour the site with the most links. Other sites could get put in the supplemental index, i.e. fare very badly in the SERPs. With regard to duplicate content of pages WITHIN the site: be careful. Even if the words are not the same, and you change some paragraphs around, Google will often put similar pages in the supplemental index. I've had this happen even when I've changed words.
I know changing things will not make it 'unique', but I just want to know whether it will affect my homepage ranking. I don't really mind about that page's ranking, just the homepage, as I really need content but only want top ranking for my homepage.
Excessive duplicate content can take down the ranks for your entire site. While it may be time consuming, focus on unique content and add it slowly as your time permits. Besides the duplicate-content issue, increasing the number of pages on your site quickly will not appear natural. I've worked on projects where the charts used were very similar to those on other websites. If you are adding pages just for your existing visitors, and do not care whether the pages are indexed, then disallow them in your robots.txt file. Then you won't have to worry about duplicate content influencing the ranks of your entire site...
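If you go the robots.txt route, it's one Disallow rule per path. A minimal sketch (the /syndicated/ folder and file names here are just examples, use whatever paths you actually put the copied pages under):

```
User-agent: *
Disallow: /syndicated/
Disallow: /copied-article.html
```

One thing to keep in mind: robots.txt only asks crawlers not to fetch those URLs, so it works best if you add the rules before the pages ever go live.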
I agree with snowbird. If your duplicate-content pages are less than 5% of your whole website, then those pages will not get listed in the search engines. If more than 50% of your pages are duplicate content, then you might be up for a penalty. What happens in between 5% and 50%, I have no clue.
That's a great idea, I never really thought about that... Thanks, maybe I will do that: I will write some myself, take some from the other site, and just disallow it from being indexed. Thanks guys.
Google will not penalize you, your content simply won't be indexed as unique content. Content is duplicated all the time, in legitimate fashion. That's how news and press releases get syndicated. It's important that you specify the author and source URL of the original document. As far as getting credit for original content - Google doesn't go by the number of back links first, but by the age of the document. The first source it can find gets credit. If there are multiple sources for the same topic, and each one counts as unique and original, then back links and overall site ranking and relevance are taken into account.
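For what it's worth, specifying the author and source usually just means a visible attribution link on the syndicated copy. A minimal HTML sketch (the author name and URL below are hypothetical placeholders, not a real source):

```html
<!-- attribution at the foot of a syndicated article;
     author name and URL are placeholders, not real -->
<p class="attribution">
  Originally written by Jane Doe. Source:
  <a href="http://example.com/original-article">example.com/original-article</a>
</p>
```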
Why is it important that I specify the URL and the author of that original content? Does Google note that down or something? Thanks for the tip.
It sounds like you really know how G works and what influences their decision on which version is the original... May I ask where do you get your information from?
Write your own content, my friend. Google will notice your scraping and will discredit you. It may not necessarily lead to a ranking penalisation, but you may not get the duplicated content pages indexed.
I would have to disagree with that statement... I have a 5+ year old site which is pretty well known in its niche, with thousands of pages indexed and hundreds of them in the top 10 in G. The site has LOADS of text and images lifted off of it, yet 50% of the stolen material actually ranks better than the originals on the site... I just want to point out that it is not all that simple, and to this day G does not have a 100% working system to verify which is the original and which one is not. Hope that helps
I have heard this from other site owners as well. The other thing you should consider is that if the owner of the content you are lifting discovers your site, they could report you to Google. Since your pages are newer, it wouldn't be too hard for them to show who was in the wrong. Then you could risk getting banned. Bottom line: write your own content.
If you wanna report it to Google, you have to send a letter in the mail; who's really gonna do that?!
Are you kidding? Plenty of people. If I worked hard on a site, and it was my business - my income - I'd do whatever it took to get rid of a thief.
A dog site is especially easy to create unique content for. Simply do a different page for each breed. Each page should also rank highly, due to being tightly focused.
Duplicate content on a single domain is a negative, but across multiple domains, Google simply doesn't have the computing power to cross-reference all body content.