Google to ban aggregators and site scrapers

Discussion in 'Google' started by thefonz22, Jan 25, 2011.

  1. #1
    I've been reading around the web that Google is going to ban websites that link to other sites without having useful info themselves, e.g. aggregator sites... Is this the end of RSS? Does this mean that if you legally display another site's RSS feed you can be penalised?

    source
     
    thefonz22, Jan 25, 2011 IP
  2. Andreon

    Andreon Peon

    Messages:
    46
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    It's quite easy to distinguish an RSS feed from a regular site, though, since RSS feeds send the version along in the root element, like <rss version="2.0">.

    edit: So no, I don't think this means the end of RSS feeds.
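
    Something like this is enough to spot a feed (a rough Python sketch; the URL below is just a placeholder, and a real check would also look at the Content-Type header):

    ```python
    import urllib.request
    import xml.etree.ElementTree as ET

    def feed_version(url):
        """Return the feed type if the URL serves RSS/Atom, else None.
        RSS documents have an <rss> root carrying a version attribute
        (e.g. version="2.0"); Atom feeds use a <feed> root instead."""
        with urllib.request.urlopen(url) as resp:
            root = ET.fromstring(resp.read())
        tag = root.tag.rsplit('}', 1)[-1].lower()  # strip any XML namespace
        if tag == 'rss':
            return 'RSS ' + (root.get('version') or '?')
        if tag == 'feed':
            return 'Atom'
        return None

    # Hypothetical usage: feed_version('http://example.com/feed') -> 'RSS 2.0'
    ```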
     
    Andreon, Jan 25, 2011 IP
  3. thefonz22

    thefonz22 Peon

    Messages:
    22
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Yeah, I agree. Why would Google want to ban RSS aggregator sites? What were they talking about, then? Are they talking about sites that just rip other sites' content off?
     
    thefonz22, Jan 25, 2011 IP
  4. thefonz22

    thefonz22 Peon

    Messages:
    22
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    But Matt Cutts did pretty much describe RSS sites as the type of site Google is looking to blacklist: non-original, scraper, low quality, etc.
     
    thefonz22, Jan 25, 2011 IP
  5. no69_2007

    no69_2007 Well-Known Member

    Messages:
    1,145
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    105
    #5
    I think these sites need to be excluded; they just create hurdles for everyone else.
     
    no69_2007, Jan 25, 2011 IP
  6. MeBloggs

    MeBloggs Peon

    Messages:
    159
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #6
    It'll only affect sites that don't deserve to be seen.
     
    MeBloggs, Jan 25, 2011 IP
  7. FireStorM

    FireStorM Well-Known Member

    Messages:
    2,579
    Likes Received:
    88
    Best Answers:
    0
    Trophy Points:
    175
    #7
    Google should have done this a long time ago.
     
    FireStorM, Jan 25, 2011 IP
  8. DanAbbamont

    DanAbbamont Guest

    Messages:
    330
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    0
    #8
    Don't count on them going anywhere soon. If they have some major breakthrough they might be able to devalue them to some extent, but I'm pretty skeptical.
     
    DanAbbamont, Jan 25, 2011 IP
  9. rehash

    rehash Well-Known Member

    Messages:
    1,502
    Likes Received:
    30
    Best Answers:
    0
    Trophy Points:
    150
    #9
    Right now it is common for sites with duplicate content to rank better than those with the original content. This means Google can't always tell who copied from whom.
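
    Detecting the duplication itself is the easy part; the hard part is attribution. A toy shingle comparison (nothing to do with Google's real pipeline) shows why: the similarity score is symmetric, so the text alone never says which side is the original. You'd need outside signals like first-crawl date or inbound links:

    ```python
    def shingles(text, k=3):
        """Set of k-word shingles from the text (k=3 is a common toy choice)."""
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    original = "how to bake bread at home with three simple ingredients"
    scraped  = "how to bake bread at home with three simple ingredients"

    print(jaccard(original, scraped))                                # 1.0, exact copy
    print(jaccard(original, scraped) == jaccard(scraped, original))  # True, symmetric
    ```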
     
    rehash, Jan 26, 2011 IP
  10. Zachmo

    Zachmo Peon

    Messages:
    221
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #10
    Yes, and it is unfair to the original creators.
     
    Zachmo, Jan 26, 2011 IP
  11. JamesColin

    JamesColin Prominent Member

    Messages:
    7,874
    Likes Received:
    164
    Best Answers:
    1
    Trophy Points:
    395
    Digital Goods:
    1
    #11
    I saw my French Digg-like site get penalized on the 21st of October; it lost 80% of its traffic from Google overnight. Strangely enough, the English version of the site hasn't been penalized yet, quite the contrary in fact.
     
    JamesColin, Jan 26, 2011 IP
  12. MartinPrestovic

    MartinPrestovic Peon

    Messages:
    213
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    0
    #12
    Google, or more specifically Matt Cutts, didn't say in his blog post that Google was going to ban anyone or any specific type of site.

    He stated that they have already released an update to a document classifier which will make it harder for spam content to rank highly. Think of blog post pages filled with endless repeated junk comments. He also said that they have improved their ability to detect hacked sites (which, as a programmer, I find incredible has taken them this long, as it really is quite easy).

    What they are evaluating next is websites which copy other people's content and have low levels of original content. So purely spammy sites that are nothing but RSS feeds are likely to be hit in the future, but it won't affect those who have a good-quality website and use other people's RSS feeds.

    Just my interpretation...
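
    On the "quite easy" point: nobody outside Google knows their actual checks, but as a rough illustration (my own toy heuristic, not anything Google has described), hacked pages often carry keyword-stuffed text hidden in invisible markup, which is cheap to scan for:

    ```python
    import re

    # Toy heuristic only: injected spam usually hides keyword-stuffed text
    # in invisible markup, so hidden blocks containing classic spam terms
    # are a cheap signal that a page may have been compromised.
    SPAM_TERMS = re.compile(r"viagra|cialis|casino|payday", re.I)
    HIDDEN_BLOCK = re.compile(
        r'<(?:div|span)[^>]*style="[^"]*display\s*:\s*none[^"]*"[^>]*>(.*?)</(?:div|span)>',
        re.I | re.S,
    )

    def looks_hacked(html):
        """Flag pages whose hidden blocks contain classic spam keywords."""
        return any(SPAM_TERMS.search(block) for block in HIDDEN_BLOCK.findall(html))

    print(looks_hacked('<div style="display:none">cheap viagra pills</div>'))  # True
    print(looks_hacked('<p>A normal article about baking bread.</p>'))         # False
    ```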
     
    MartinPrestovic, Jan 26, 2011 IP
  13. serenarichard

    serenarichard Peon

    Messages:
    80
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #13
    This will not happen immediately, but sooner or later; for now it's still under observation. It would be a very hard decision for Google, since RSS is used everywhere and every subscriber and blog owner would be disturbed by it.
     
    serenarichard, Jan 26, 2011 IP
  14. Reliable-SeoServices

    Reliable-SeoServices Peon

    Messages:
    441
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #14
    Thank you for making it clear. I have also read Google's official blog, and nowhere do they say that they will ban sites which use RSS.
     
    Reliable-SeoServices, Jan 26, 2011 IP
  15. ReverseInternet.com

    ReverseInternet.com Peon

    Messages:
    17
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #15
    I guess spammers should post an article somewhere announcing the more advanced content-spamming algorithms they will produce in response to the Google update :)
     
    ReverseInternet.com, Jan 26, 2011 IP
  16. DanAbbamont

    DanAbbamont Guest

    Messages:
    330
    Likes Received:
    7
    Best Answers:
    0
    Trophy Points:
    0
    #16
    He said they're evaluating a way to reduce copied content, but that's really hard. There's no reliable way to tell which copy is the original.

    Word on the street is the new change has something to do with grammar, which makes sense to me. The easiest way to identify spam content right now is to evaluate the English.
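
    Pure speculation about what "evaluating the English" might mean, but even something as dumb as a stopword ratio separates natural prose from keyword-stuffed junk. A toy sketch (my own heuristic, obviously nothing like whatever Google actually runs):

    ```python
    import re

    # A handful of very common English function words. A real classifier
    # would use far richer features; this list only exists for the demo.
    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is",
                 "it", "that", "for", "on", "with", "as", "was", "are"}

    def stopword_ratio(text):
        """Fraction of tokens that are common function words; natural
        English prose usually scores well above machine-spun junk."""
        words = re.findall(r"[a-z']+", text.lower())
        return sum(w in STOPWORDS for w in words) / len(words) if words else 0.0

    print(stopword_ratio("The quick brown fox jumps over the lazy dog."))  # ~0.22
    print(stopword_ratio("buy cheap pills casino poker loans insurance"))  # 0.0
    ```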
     
    DanAbbamont, Jan 26, 2011 IP
  17. thefonz22

    thefonz22 Peon

    Messages:
    22
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #17
    But why would an RSS aggregator be punished when it is merely publishing articles from owners who want their content distributed? If you are a website owner and don't want your content being used on other sites, why broadcast the content via RSS?
     
    thefonz22, Jan 27, 2011 IP
  18. thefonz22

    thefonz22 Peon

    Messages:
    22
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #18
    Exactly. How will sites like Digg be affected?
     
    thefonz22, Jan 27, 2011 IP
  19. chandan123

    chandan123 Prominent Member

    Messages:
    11,586
    Likes Received:
    578
    Best Answers:
    0
    Trophy Points:
    360
    #19
    Because they mostly list links to other sites.
     
    chandan123, Jan 27, 2011 IP
  20. no69_2007

    no69_2007 Well-Known Member

    Messages:
    1,145
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    105
    #20
    But Digg is still doing fine :S
    DigitalPoint has thousands of links, and it's still doing fine.
     
    no69_2007, Jan 27, 2011 IP