how are you beating the duplicate filters with your feeds?

Discussion in 'Affiliate Programs' started by skattabrain, May 16, 2005.

  1. chachi

    chachi The other Jason

    Messages:
    1,600
    Likes Received:
    57
    Best Answers:
    0
    Trophy Points:
    0
    #21
See www.copyscape.com. It is easier than you think :)
     
    chachi, May 18, 2005 IP
  2. skattabrain

    skattabrain Peon

    Messages:
    628
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    0
    #22
    i have a few of these, some are holding steady, others aren't.

i have pure aws content sites, sites with extra related content added to the aws pages, and still others where aws was added to an existing non-aws site in an effort to add more pages / generate some industry-specific book sales (if i had a 20 page site on cosmetics, i would add a couple thousand cosmetic/health/beauty books to the site).

now i've lost pages on some and not on others, but i wish it was as cut-and-dried as "the sites with extra content hold out and the others don't" ... i can't say that's what's happening.

matter of fact ... that cosmetic site i spoke of got the dup filters stirred up so badly that it removed the non-aws pages too!! and those pages were all clearly unique. after this experience, i can't recommend anyone do this ... add a few books, sure ... but don't go adding a couple thousand, no matter how "targeted" the books are. it's just so easy to do with aws.

ferret77 has mentioned to me in the past that it might be because, in google's eyes, 95% of the content was duplicate. ie - 95% aws content, 5% my own ... it's easy to dilute a 20 page site to nothing when you mix in so many books.
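to put rough numbers on ferret77's point, here's a quick sanity check using the 20-page cosmetics example above (the python is just my own back-of-the-envelope sketch, not anything google publishes):

```python
# rough ratio check: 20 unique pages swamped by ~2000 feed pages
original_pages = 20
feed_pages = 2000

duplicate_ratio = feed_pages / (original_pages + feed_pages)
print(f"duplicate share: {duplicate_ratio:.1%}")  # → duplicate share: 99.0%
```

so "95% duplicate" is actually generous ... at a couple thousand books the unique pages are down around 1% of the site.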
     
    skattabrain, May 18, 2005 IP
  3. skattabrain

    skattabrain Peon

    #23
1. you can't get much better with meta than what i already do ** i think
    2. static urls
    3. i even have EXTREMELY relevant RSS Feeds

    here is a sample from a book (reference - http://www.amazon.com/exec/obidos/tg/detail/-/1558740708/qid=1116463290/sr=8-1/ref=pd_csp_1/002-4942272-6069668?v=glance&s=books&n=507846 )

    <title>MRB: Lifeskills for Adult Children</title>
    <meta name="description" content="Lifeskills for Adult Children was written by Janet Woititz / Alan Garner and is published by HCI." />
    <meta name="keywords" content="Lifeskills for Adult Children, medical, Janet Woititz / Alan Garner, book reviews" />
    
    Code (markup):
    same page constructor with different book ...
    ref - http://www.amazon.com/exec/obidos/A...63495/sr=11-1/ref=sr_11_1/002-4942272-6069668

    
    <title>MRB: Jumpin' Johnny Get Back to Work! : A Child's Guide to ADHD/Hyperactivity</title>
     <meta name="description" content="Jumpin' Johnny Get Back to Work! : A Child's Guide to ADHD/Hyperactivity was written by Michael Gordon Ph.D. and is published by GSI Publications." />
     <meta name="keywords" content="Jumpin' Johnny Get Back to Work! : A Child's Guide to ADHD/Hyperactivity, medical, Michael Gordon Ph.D., book reviews" />
    
    Code (markup):
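the "page constructor" those two pages come from is basically a fill-in-the-blanks template. here's a rough python sketch of the idea (the build_meta function and its field names are my illustration, not the actual code):

```python
# hypothetical sketch of a template-driven page constructor:
# one function, many books, near-identical output pages --
# which is exactly why these pages look duplicated to a filter.
def build_meta(book):
    title = f"MRB: {book['title']}"
    description = (f"{book['title']} was written by {book['author']} "
                   f"and is published by {book['publisher']}.")
    keywords = f"{book['title']}, {book['category']}, {book['author']}, book reviews"
    return (f"<title>{title}</title>\n"
            f'<meta name="description" content="{description}" />\n'
            f'<meta name="keywords" content="{keywords}" />')

print(build_meta({
    "title": "Lifeskills for Adult Children",
    "author": "Janet Woititz / Alan Garner",
    "publisher": "HCI",
    "category": "medical",
}))
```

swap in a different book dict and only the filled-in slots change ... the surrounding boilerplate is identical on every page.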
     
    skattabrain, May 18, 2005 IP
  4. skattabrain

    skattabrain Peon

    #24
    easy ... i want the sales. this is about beating the game ... seo is a game.


    sorry about the triple post .... trying to address the comments.
     
    skattabrain, May 18, 2005 IP
  5. chachi

    chachi The other Jason

    #25
    I would agree with Ferret77 on this one. It appears that there is some threshold for a site. If there are 1,000 pages, then G would like to see x% of them be original. I am not sure exactly what that percentage is, but it certainly appears that way to me.
     
    chachi, May 18, 2005 IP
  6. iShopHQ

    iShopHQ Peon

    Messages:
    644
    Likes Received:
    33
    Best Answers:
    0
    Trophy Points:
    0
    #26
    detecting duplicate content isn't the issue - easy to compare two pages, or 200, or 2000. The issue is figuring out which stays and which goes. How do they know what's the dupe and what's the original?
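For a sense of how cheap the detection half is, here's a toy sketch using word shingles plus Jaccard overlap (a standard textbook approach; no claim that this is what G actually runs):

```python
# toy near-duplicate detector: n-word shingles + Jaccard similarity.
def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 0))}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

page_a = "Lifeskills for Adult Children was written by Janet Woititz and is published by HCI."
page_b = "Lifeskills for Adult Children was written by Janet Woititz and published by HCI."
print(f"similarity: {jaccard(page_a, page_b):.2f}")
```

Two templated pages that differ only in the filled-in fields score very high, so flagging the pair is the cheap part; deciding which copy is canonical is the judgment call.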
     
    iShopHQ, May 18, 2005 IP
  7. skattabrain

    skattabrain Peon

    #27
    :confused: you lost me ... no it's not a duplicate problem, yes it is a duplicate problem?

    can you elaborate? i think i misunderstand you.
     
    skattabrain, May 18, 2005 IP
  8. l234244

    l234244 Peon

    Messages:
    1,225
    Likes Received:
    50
    Best Answers:
    0
    Trophy Points:
    0
    #28
    Surely if you prevent the duplicate content from being detected there is no need to worry about what stays and what goes?
     
    l234244, May 19, 2005 IP
  9. iShopHQ

    iShopHQ Peon

    #29
Why worry about preventing something that in all likelihood is not being looked for?
     
    iShopHQ, May 19, 2005 IP
  10. davedx

    davedx Peon

    Messages:
    429
    Likes Received:
    21
    Best Answers:
    0
    Trophy Points:
    0
    #30
    They'd drop the duped content with the lower PR, I'd imagine... like with 302 redirects...
     
    davedx, May 19, 2005 IP
  11. Tuning

    Tuning Well-Known Member

    Messages:
    1,005
    Likes Received:
    51
    Best Answers:
    0
    Trophy Points:
    138
    #31
I just recovered from the dup filter in the following way.

This is just something I found on one of my sites. I redesigned the site (nothing major, simpler navigation only) and changed the title tag. There were 3 keyphrases separated by "-" before, and now I changed it to 2 keyphrases separated by "|". Astonishingly, within 3 days google picked up the page and now I'm ranking #30 for the term. The page had been at infinity for the last couple of months.

Does anyone have similar experiences?

    Regards,
    Tuning
     
    Tuning, May 22, 2005 IP
  12. skattabrain

    skattabrain Peon

    #32
yes ... i had G start to wipe out a site ... no, actually ... it did wipe it out. i changed up my title tags, h1 tags and meta tags ... within 7 days it had taken my entire amazon store out of the filter ... but it only lasted for a while before the filter kicked in again.

i wouldn't mind rearranging things every month, however no matter what i do now, google just isn't consuming all my pages anymore.

it used to ... but i can't seem to resuscitate it now. we'll see how the next few weeks go and i'll keep you posted.
     
    skattabrain, May 22, 2005 IP
  13. donnareed

    donnareed Peon

    Messages:
    340
    Likes Received:
    9
    Best Answers:
    0
    Trophy Points:
    0
    #33
True, if that were the case, then a competitor could use dup content to get rid of you in the serps. And what if the person who posted the article first was not the author, and then later on the original author put it up on his web site and couldn't get a decent ranking? How fair is that?

    I tend to think that while it may exist in some form, calling it a duplicate content filter is an oversimplification. That's not to say that trying to make your datafeed site stand out from the others is a waste of time. I even vary the content page to page within a site; some pages will have RSS feeds and AWS feed, some will have Searchfeed links plus Overstock feeds, etc. But that's more just my shotgun-style SEO approach of throw lots of stuff against the wall and see what sticks.
     
    donnareed, Jun 3, 2005 IP