So... Google Penalizes You For Non-Unique Content?

Discussion in 'Search Engine Optimization' started by nameonthecake, Aug 22, 2006.

  1. #1
    I was wondering... how do lyrics websites get around this?
     
    nameonthecake, Aug 22, 2006 IP
  2. verbs

    verbs Peon

    #2
    According to the SEO tutorial I just read, that is the case, yes. My biggest problem is that my site is a repository of information, so 90% of what's on it is duplicate. I must be getting penalized for this somehow.
     
    verbs, Aug 22, 2006 IP
  3. AnaB

    AnaB Peon

    #3

    If you quote part of the info and then provide a link to the actual site, you should be fine. Adding a few lines about why you think the article is helpful or important, then linking to it, is the best way to do it. That's what I do with my sites when I need to put up info available from other sites (unless I have something exceptional to add, edit, or critique).
     
    AnaB, Aug 22, 2006 IP
  4. Mong

    Mong ↓↘→ horsePower

    #4
    I don't believe Google is very efficient at finding and penalising duplicate content.
    In my practical experience, it depends more on your standing in Google's algorithm and the quality of your backlinks.
    If you score well in terms of Google's confidence, your duplicate content will rank higher than the site of the original content holder.

    Technically, it's not very easy to check for duplicate content.
    Google always says dynamic pages are indexed as easily as static pages, but practical experience shows that's not true. :)

    Again, it's all about your value in Google's eyes and how much confidence Google has in your site.
     
    Mong, Aug 23, 2006 IP
  5. hextraordinary

    hextraordinary Well-Known Member

    #5
    This discussion has been going on and on and on.... The truth is that with so many aggregators around the net nowadays, it would be very difficult and resource-intensive to tell duplicate content apart from aggregated content.

    I think this has been made into a much bigger deal than it really is. I believe there is no direct penalty for rehashed material; however, basing a site on non-original content won't get you ranked as high anyway.
     
    hextraordinary, Aug 23, 2006 IP
  6. sachin410

    sachin410 Illustrious Member

    #6
    I don't think Google penalizes sites for common content.

    I have a bunch of article sites with hardly any original content on them. In fact, two of them have the same content and sit on the same server.

    Still, all of them get SE traffic.

    However, I feel it is harder to rank well with duplicate content than with original content.
     
    sachin410, Aug 23, 2006 IP
  7. Cryogenius

    Cryogenius Peon

    #7
    The easiest duplication for Google to spot is when two pages are byte-for-byte identical. If you have two URLs that generate exactly the same HTML, there's no mistaking that for anything but duplication. I used to have a test site that was a mirror of my main site, but I've since taken that down.

    Cryo.
     
    Cryogenius, Aug 23, 2006 IP
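A byte-for-byte check like the one described above is cheap in principle: hash each page's raw bytes and compare fingerprints. The sketch below is a minimal Python illustration of that idea; the SHA-256 choice, the function names, and the in-memory dict are assumptions for illustration, not a claim about Google's actual pipeline.

```python
import hashlib

# Fingerprint each crawled page; byte-for-byte identical pages
# produce the same digest, so exact duplicates reduce to a dict lookup.
def page_fingerprint(html: str) -> str:
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

seen = {}  # fingerprint -> first URL observed with that exact content

def is_exact_duplicate(url: str, html: str) -> bool:
    fp = page_fingerprint(html)
    if fp in seen:
        print(f"{url} duplicates {seen[fp]}")
        return True
    seen[fp] = url
    return False

# Two URLs serving identical HTML are flagged on the second visit.
print(is_exact_duplicate("http://example.com/a", "<html>same</html>"))  # False
print(is_exact_duplicate("http://example.com/b", "<html>same</html>"))  # True
```

Anything short of an exact copy (a changed footer, reordered markup) defeats a digest like this, which is why near-duplicate detection is the harder problem the rest of the thread is arguing about.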
  8. Voasi

    Voasi Active Member

    #8
    Lyrics sites don't get around it; it's just that some lyrics sites are more powerful than others. If a site is more powerful, the "original content" gets credited to them, not to the originator.

    Yes, but what if you had unique content?

    Of course you get traffic from Y!/G/MSN, but if your content were unique to the web... your traffic might double... or triple... because of the uniqueness.
     
    Voasi, Aug 23, 2006 IP
  9. Cryogenius

    Cryogenius Peon

    #9
    Where do "original" lyrics live, anyway? Not on the web, I bet.
     
    Cryogenius, Aug 23, 2006 IP
  10. MattUK

    MattUK Notable Member

    #10
    I think that to get penalised for dupe content, the entire page has to be very similar. There are too many situations where the 'majority' of the content on a page would legitimately be similar for that to happen.

    Imagine two sites where the paragraphs, titles, metas, menus and navigation are all the same. That, I'm sure, would get a duplicate penalty. Now imagine everything is different apart from the bulk of one paragraph; I don't believe that would be penalised.

    Apart from lyrics and article sites, consider RSS syndication. The whole concept would be flawed if every page delivering RSS were penalised.
     
    MattUK, Aug 23, 2006 IP
  11. stackman

    stackman Peon

    #11
    I agree that the dup content penalty is more myth than actuality. Here's why.

    When content is copied from one site to another, the original has usually been around for a while and has probably already been indexed and ranked. It's older and more established in the rankings than the newly copied content.

    If the search engine sees both pages as identical for a particular search on a particular keyword, the logical tiebreaker would be the content that has been indexed longer. That's not so much a penalty as a natural and logical choice.
     
    stackman, Aug 23, 2006 IP
  12. aletheides

    aletheides Banned

    #12
    I would think it would take Google a lot of resources to check copy against copy... that's why I would agree with the "dup content is a myth" theory.
     
    aletheides, Aug 23, 2006 IP
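Checking copy against copy pairwise across a whole index would indeed be prohibitive, but the classic workaround in the literature is shingling (Broder, 1997): reduce each document to a set of overlapping word k-grams and compare set resemblance, with the sets typically compressed further (e.g. via MinHash) so near-duplicates fall out of fingerprint lookups rather than pairwise scans. Below is a minimal Python sketch of the resemblance measure; the function names and parameters are illustrative, and whether Google used exactly this is an assumption.

```python
# Shingling: break a document into overlapping word k-grams, then
# measure how much two documents' shingle sets overlap.
def shingles(text: str, k: int = 5) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def resemblance(a: set, b: set) -> float:
    # Jaccard similarity |A & B| / |A | B|; 1.0 means identical shingle sets.
    return len(a & b) / len(a | b) if (a or b) else 1.0

original = "the quick brown fox jumps over the lazy dog near the river bank"
copied = "the quick brown fox jumps over the lazy dog by the river bank"

# ~0.38 here: a single changed word shifts every shingle that spans it.
print(resemblance(shingles(original), shingles(copied)))
```

Because even a one-word edit shifts several shingles, a system like this would flag near-duplicates with a threshold (say, resemblance above 0.9) rather than exact equality.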
  13. MattUK

    MattUK Notable Member

    #13
    It's easy; check Copyscape, it does exactly the same job.
     
    MattUK, Aug 23, 2006 IP
  14. Dekker

    Dekker Peon

    #14
    When has plagiarizing ever been positive?
     
    Dekker, Aug 23, 2006 IP
  15. idotcom

    idotcom Well-Known Member

    #15
    If duplicate content couldn't be discovered, then everything would count as unique.

    Considering the number of pages indexed, checking would be a huge load. But I'm sure there is at least a small effort to recognize duplicate content; what they do when something is discovered is what matters.

    Ever seen the "omitted/similar results" message on Google search results?

    I imagine it would have to be a dead-smack photocopy for one page or the other to be dropped (penalized); otherwise they might accidentally and automatically be penalizing press, news, feeds, and other general public information sites.

    My 2 cents
     
    idotcom, Aug 24, 2006 IP