Disappearing sites and why Google is all ... up

Discussion in 'SEO' started by MikeSwede, Apr 6, 2006.

  1. #1
    I have been posting about my site(s) disappearing from Google and about sites that illegally use scraped content from my sites yet are still there. I found this really interesting article about it, which tells me that I am not the only one who has seen this and that it is a HUGE problem for Google right now.
    From the article:
    "Worse yet, as Google tries to go after this duplicate content, they sometimes get the real company instead of the scraper. This is all part of the cat and mouse game that has become the Google algorithm. Once Google realized the manipulation that was happening, they decided to aggressively alter their algorithms to prevent it. After all, their goal is to find the most relevant results for their searchers. At the same time, they also faced huge growth with the internet explosion. This has led to a period of unstable updates, causing many top ranking websites to disappear while many sp@m and scraped websites remain. In spite of Google's efforts, every change seems to catch more quality websites. Many sp@m sites and websites that violate Google's guidelines are caught, but there is an endless tide of more sp@m websites taking their place." :mad:
    3. Duplicate Content Issues: These have become a major issue on the Internet. Because web pages drive search engine rankings, black hat SEOs (search engine optimizers) started duplicating entire sites' content under their own domain name, thereby instantly producing a ton of web pages (an example of this would be to download an Encyclopedia onto your website). As a result of this abuse, Google aggressively attacked duplicate content abusers with their algorithm updates. But in the process they knocked out many legitimate sites as collateral damage. One example occurs when someone scrapes your website. Google sees both sites and may determine the legitimate one to be the duplicate. About the only thing a Webmaster can do is track down these sites as they are scraped, and submit a sp@m report to Google. Another big issue with duplicate content is that there are a lot of legitimate uses of duplicate content. News feeds are the most obvious example. A news story is covered by many websites because it is content the viewers want. Any filter will inevitably catch some legitimate uses.
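
    To make the scraping problem concrete for anyone who hasn't thought about how a duplicate filter even works: below is a rough Python sketch of the classic "shingling" overlap check that near-duplicate detectors are often described as using. This is NOT Google's actual algorithm (nobody outside knows that) and every name in it is made up for illustration; the point is just that the math can tell two pages are near-copies, but it says nothing about which site wrote the text first.

    Code:
    # Rough illustration only -- not Google's real filter, just the textbook
    # "shingling" idea for spotting near-duplicate pages.
    import re

    def shingles(text, w=5):
        """Break a page's text into overlapping w-word chunks ("shingles")."""
        words = re.findall(r"[a-z0-9]+", text.lower())
        return {" ".join(words[i:i + w]) for i in range(max(len(words) - w + 1, 1))}

    def similarity(page_a, page_b, w=5):
        """Jaccard overlap of the shingle sets: 1.0 = identical, 0.0 = nothing shared."""
        a, b = shingles(page_a, w), shingles(page_b, w)
        return len(a & b) / len(a | b) if (a | b) else 0.0

    original = "My hand-written product review, paragraph after paragraph of it."
    scraped = "My hand-written product review, paragraph after paragraph of it."

    # A filter like this happily reports that the two pages are near-identical,
    # but nothing in the calculation says which site published the text first.
    print(similarity(original, scraped))  # -> 1.0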

    Now I know why my sites have disappeared and why they won't come back until Google fixes their buggy algo.... so I am not even asking to be re-included anymore, since that is just for show and not for real anyway!! Consider this thread a place to vent while you write your useless re-inclusion email to Google.... :mad:
     
    MikeSwede, Apr 6, 2006 IP
  2. andheresjohnny

    andheresjohnny Well-Known Member

    #2
    You know, I wondered how that whole so-called "duplicate content penalty" could ever work.

    How could any program be absolutely sure which site was the original source?

    It would seem to me that a program would have to flag suspicious sites for a later human review. I can't see that happening with such a large number of sites (basically the whole web).
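
    The only automated tie-breaker I can even picture is something crude like "whichever copy we crawled first is the original", and that guess breaks the moment the scraper's page gets crawled before yours. A purely hypothetical sketch of what I mean (the URLs and dates are made up, and nothing here is anything Google has confirmed they do):

    Code:
    # Hypothetical tie-breaker: "the copy crawled first must be the original."
    # Shows why that guess fails when the scraper's page is discovered first.
    from datetime import datetime

    # Made-up crawl log: when each URL of a duplicate pair was first seen.
    first_seen = {
        "http://real-site.example/article": datetime(2006, 3, 20),
        "http://scraper-site.example/article": datetime(2006, 3, 12),  # scraper crawled earlier!
    }

    def guess_original(urls):
        """Naive rule: the earliest first-crawl date wins."""
        return min(urls, key=lambda u: first_seen[u])

    print(guess_original(list(first_seen)))
    # -> http://scraper-site.example/article  (the legit site loses the tie-break)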

    That sucks that you got screwed.
     
    andheresjohnny, Apr 6, 2006 IP
  3. wibr

    wibr Peon

    #3
    The dup content filter doesn't work. At least not for any SE that I know of. Which is why content/site scrapers are so popular and effective.

    When we see spammers stop using content-scraping software, then we'll know the dup filter actually works. Or that the spammers have found something better.
     
    wibr, Apr 6, 2006 IP
  4. GeorgeB.

    GeorgeB. Notable Member

    #4
    I own a site. The content isn't "original" in the sense that I didn't write it, but the site displays it in a way that people think is cool and link to. Also, my site has more of this particular content than anywhere else.

    So, a question: if my site does it better, and users find it a more relevant resource and "vote" for it by linking to it, why shouldn't it outrank older, established sites?
     
    GeorgeB., Apr 6, 2006 IP