Duplicate Content

Discussion in 'Search Engine Optimization' started by Nindja, Aug 3, 2006.

  1. #1
    Google’s duplicate content filter is notorious. Many articles have been written about it, many forum posts and discussions address it, and speculation runs rampant. What seems to be lacking, however, is any discussion of whether other search engines apply duplicate content filters to their search results the way Google so famously does. Why do you suppose that is?
    Perhaps it’s because Google has it all wrong, at least in the eyes of those being penalized for duplicate content. Often it is the ones being penalized who created the original content, yet they are nowhere to be found in the SERPs. To those webmasters it feels as if they invented a new product, and because of that product’s popularity, duplicates popped up everywhere. Flattering at first, but in the end one of the competitors who copied the idea patents it as their own. To add insult to injury, the original creator gets fined for claiming the product was theirs. It makes them want to scream “Liar, liar, pants on fire…” OK, I’m being a bit facetious, but it certainly feels unfair. Is it cheating? Yes, in a way, it is. But this is how the game is played, at least for the moment, with Google.

    While it does seem that most search engines apply a filter of some sort, it appears that only Yahoo and MSN have the technology to analyze where the content originated. And because their filters are working the way they should, you aren’t hearing anyone complain about it. People generally don’t complain about something that works right; they complain about things that work incorrectly, and often a bit too loudly.
     
    Nindja, Aug 3, 2006 IP
  2. Robert Rich

    Robert Rich Peon

    Messages:
    4
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    Are you saying that the same material cannot appear elsewhere within a person's range of sites?
     
    Robert Rich, Aug 3, 2006 IP
  3. Nindja

    Nindja Peon

    Messages:
    235
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Yeah, the sites aren't showing up in Google...
     
    Nindja, Aug 3, 2006 IP
  4. marketjunction

    marketjunction Well-Known Member

    Messages:
    3,779
    Likes Received:
    187
    Best Answers:
    0
    Trophy Points:
    183
    #4
    Having some duplicate content won't kill you. For instance, let's say you have websites on Widget Repair and Widget Repair Tips, and you write an article on five must-know tips for repairing widgets. If your websites largely feature original and exclusive content, adding this piece to both sites won't kill either of them. If anything, the page itself could take a hit, but even that is unlikely since there are only two copies in the whole world.

    Duplicate content penalties are mainly going to hurt websites that are solely built on it or where duplicate content makes up a large part of the site. Think about it. If your website is 95% duplicate content and it gets penalized, you have 5% of a site.

    Therefore, having 1-5% duplicate content could be acceptable if it's great stuff for your visitors.
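
    As an aside, the "percentage of duplicate content" idea above can be made concrete. A minimal sketch, assuming a simple word-shingle comparison (an illustration only, not any search engine's actual algorithm; the shingle size k=5 is an arbitrary choice):

    ```python
    def shingles(text, k=5):
        """Set of overlapping k-word sequences ("shingles") in the text."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def duplicate_fraction(page, other):
        """Fraction of `page`'s shingles that also occur on `other`.

        1.0 means every shingle is duplicated; 0.0 means none are.
        """
        s = shingles(page)
        if not s:
            return 0.0
        return len(s & shingles(other)) / len(s)
    ```

    With something like this, a site owner could estimate how much of a page overlaps another page before publishing it in both places.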
     
    marketjunction, Aug 3, 2006 IP
  5. JordanLH

    JordanLH Peon

    Messages:
    22
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Yeah, definitely. You can have some, but not too much. It used to be as easy as adding an intro sentence at the beginning. Now I'm learning it's a lot harder than that.
     
    JordanLH, Aug 3, 2006 IP
  6. Robert Rich

    Robert Rich Peon

    Messages:
    4
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    OK, let's say I have a site without AdSense, and then I take portions of it to create an AdSense-coded site. Is that legal?

    Thank you very much!
     
    Robert Rich, Aug 4, 2006 IP
  8. glennhefley

    glennhefley Peon

    Messages:
    73
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #8
    I think people do complain, and their complaints normally focus on the bigger shock value. If you made the same complaint about MSN that is being directed at Google, the general response would be "...well, duh." The fact that it is Google is what makes the complaint... complainable, if that is a word.

    All search engines filter for duplicate content, and have done so for years. The reason it is newsworthy is that the use of duplicate content has grown to epidemic proportions over the last couple of years.

    It is not a matter of "legality" (at least if the duplicate content on your site is your own) but a matter of validity. If two sites have the same content, only one is listed. Why list the second one? It's the same stuff. And as the other poster pointed out, it is a matter of percentages.

    The trouble is that people forget the goals of a search engine. The first goal is to give a searcher relevant, unique website listings for a given query. The degree to which search engines can do this is what makes them successful or not.

    How often are you going to use a search engine whose first page of results is 10 websites, all with the same article on them? I would use it twice, just to make sure the first run wasn't a mistake.

    My question is: why would you duplicate content across several sites? The answer is usually that the owner wants two chances of being found in the search engine listings instead of one, and doesn't want to be bothered with writing new copy. To me, and to most search engines, this is just another form of market spamming, and justification for filtering those websites from the listings.

    Website owners see this as being penalized; the search engines see it as accuracy maintenance. They are not "policing" the web, they are ensuring that their status as a "good" search engine remains stable.
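
    The "list only one copy" behaviour described above can be sketched as follows: walk the results in rank order and drop any page whose text is too similar to a page already kept. This is purely illustrative; the shingle size and the 0.9 Jaccard threshold are arbitrary assumptions, and real engines use far more scalable techniques:

    ```python
    def shingles(text, k=3):
        """Set of overlapping k-word sequences in the text."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        """Set similarity: |intersection| / |union| (1.0 for two empty sets)."""
        return len(a & b) / len(a | b) if a | b else 1.0

    def filter_duplicates(pages, threshold=0.9):
        """Keep the first page of each near-duplicate group, in rank order."""
        kept, kept_shingles = [], []
        for page in pages:
            s = shingles(page)
            if all(jaccard(s, t) < threshold for t in kept_shingles):
                kept.append(page)
                kept_shingles.append(s)
        return kept
    ```

    Run on ten copies of the same article plus one original page, this returns two results instead of eleven, which is exactly the searcher-facing outcome the filter is after.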
     
    glennhefley, Aug 5, 2006 IP
  9. erkanbs

    erkanbs Active Member

    Messages:
    207
    Likes Received:
    11
    Best Answers:
    0
    Trophy Points:
    58
    #9
    If you're an SEO expert, you can use duplicate content as if it were your own content and still raise your rankings on the search engines.
     
    erkanbs, Aug 5, 2006 IP
  10. marketjunction

    marketjunction Well-Known Member

    Messages:
    3,779
    Likes Received:
    187
    Best Answers:
    0
    Trophy Points:
    183
    #10
    That might be true of some bad site owners. However, here's an example. You run a general sports site, like ESPN.com. You also run a sports site dedicated to pro baseball, like MLB.com. You write a story about Player X and Team X coming to an agreement on a deal and what it means for both of their futures. The article applies to both sites. There's nothing wrong with that.

    Personally, I would rather have one site print part of the article and reference the original copy on the other site, but either way is fine. If you are establishing a name for yourself in the industry (or are already established), rewriting the same article is not a good idea.
     
    marketjunction, Aug 5, 2006 IP