Duplicate Content Is Not BAD for SEO!

Discussion in 'Search Engine Optimization' started by PlayPassGames, Aug 14, 2008.

  1. MRniceGuy007

    MRniceGuy007 Peon

    #21
    Duplicate content issues are for whining SEO copywriters; that's why webmaster SEO is still king and has profited from them since day one. The search engines are still far from having the right algo to filter the mess completely, so for the time being everyone is still reaping from dups. But who knows about tomorrow? Next month? Next year?

    I am also waiting for the day the real copycats start whining too; that'll be fun, because I won't. I've already made too much money to worry about this. I must admit I duplicated a lot of content back in my early days, but now I am wiser and moving to cleaner articles written through outsourcing. Clears my conscience a bit, you know, and keeps away the DMCA emails lolz
     
    MRniceGuy007, Aug 15, 2008 IP
  2. googlefrog

    googlefrog Peon

    #22
    Google is not AI. Google doesn't know who made what and when; to do that it would need real-time bots, and indexing everything that way would take huge resources. It only looks at your own website. People think Google is some higher entity.
     
    googlefrog, Aug 15, 2008 IP
  3. abercrombie

    abercrombie Peon

    #23
    I've seen entire sites made of duplicate content thrown out of the index, so it can't be good.
     
    abercrombie, Aug 15, 2008 IP
  4. izwanmad

    izwanmad Banned

    #24
    Well, please take a look here:

    http://googlewebmastercentral.blogspot.com/2008/06/duplicate-content-due-to-scrapers.html

    and

    http://www.vanessafoxnude.com/2008/05/14/ranking-as-the-original-source-for-content-you-syndicate/


    Google has already said that the duplicate site will not rank higher than the original. :)
     
    izwanmad, Aug 15, 2008 IP
  5. prodigenius

    prodigenius Peon

    #25
    By that argument, you should copy Google. Just make a search engine in your spare time. Bam. Instant pagerank 10.

    (sarcasm)
     
    prodigenius, Aug 15, 2008 IP
  6. FightRice

    FightRice Peon

    #26
    The amount of nonsense in this forum and thread lately is insane.
     
    FightRice, Aug 15, 2008 IP
  7. ninjacluster

    ninjacluster Peon

    #27
    Nobody really knows how it all works. I mean, if you were trying to lower someone else's SEO rank, you could create a bunch of sites and use their content. I wouldn't think Google would allow someone else to potentially control your rankings. This scenario isn't likely to happen, but it's possible.
     
    ninjacluster, Aug 15, 2008 IP
  8. hasan_889

    hasan_889 Banned

    #28
    Duplicate content is bad for SEO, but good for SEM
     
    hasan_889, Aug 15, 2008 IP
  9. MoneyMoose

    MoneyMoose Peon

    #29
    I agree with you completely there.

    :D

    I'm not a proponent of duplicate content being better than original by any means.
     
    MoneyMoose, Aug 17, 2008 IP
  10. pittfall

    pittfall Peon

    #30
    The biggest thing about duplicate content is that you control what is kept from search engine spiders. Use duplicate content for your users, but exclude bots from it by using your robots.txt effectively.

    That is what robots.txt is for. If you leave duplicate content for the engines to decide which copy takes priority, chances are they will pick a competitor's.
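    A minimal sketch of the approach described above. The directory paths here are hypothetical; the point is simply that a robots.txt at the site root can tell crawlers to skip the duplicated sections while leaving the canonical pages crawlable:

    ```
    # robots.txt at the site root (hypothetical paths for illustration)
    # Block all crawlers from the duplicated copies of the content...
    User-agent: *
    Disallow: /print/
    Disallow: /syndicated/
    # ...while everything not listed above stays crawlable by default.
    ```

    Note that Disallow rules are advisory: well-behaved bots honor them, but they are not an access control mechanism.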
     
    pittfall, Aug 17, 2008 IP