Google Proxy Hacking: How A Third Party Can Remove Your Site From Google SERPs

Discussion in 'Google' started by skionxb, Nov 15, 2007.

  1. #1
    skionxb, Nov 15, 2007 IP
  2. newzone

    #2
I think the easy step was the one they mentioned — "we just blocked the proxies' IP addresses from accessing our server" — but that didn't last.
The second step, in my opinion (though I don't know how to do it), would be to block requests that claim the Googlebot user-agent but don't actually come from Google (i.e., identify the real Googlebot).
Maybe that's something stupid, but that's how I understand it :confused:
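For what it's worth, the second step is doable: Google's documented advice is to reverse-DNS the visiting IP, check the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP (a fake crawler can spoof the user-agent and even the PTR record, but not the forward lookup). Here's a minimal sketch in Python — the lookup functions are injectable only so the logic can be exercised without network access; in production you'd use the `socket` defaults:

```python
import socket

def is_real_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Verify a crawler IP the way Google recommends:
    1. reverse-DNS the IP to a hostname,
    2. require the hostname to be under googlebot.com or google.com,
    3. forward-resolve the hostname and confirm it maps back to the IP.
    Lookups are injectable so the logic is testable offline."""
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False  # spoofed user-agent: hostname is not Google's
    try:
        # Forward-confirm to defeat attacker-controlled PTR records.
        return forward_lookup(host) == ip
    except OSError:
        return False

# Offline demo with stubbed resolvers (the hostnames below are illustrative):
genuine = is_real_googlebot(
    "66.249.66.1",
    reverse_lookup=lambda ip: "crawl-66-249-66-1.googlebot.com",
    forward_lookup=lambda host: "66.249.66.1",
)
fake = is_real_googlebot(
    "203.0.113.9",
    reverse_lookup=lambda ip: "proxy.example.net",
    forward_lookup=lambda host: "203.0.113.9",
)
```

A request that fails this check can then be denied or served a 403, which blocks proxies that pretend to be Googlebot without risking the real crawler.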
     
    newzone, Nov 15, 2007 IP
  3. Waffleguy

    #3
    I bet people on this forum are gonna try this now.
     
    Waffleguy, Nov 15, 2007 IP
  4. itzCarlin

    #4
I will :p tomorrow — gotta finish studying AP Biology and AP US History first.
     
    itzCarlin, Nov 15, 2007 IP
  5. oseymour

    #5
I don't think it's as big a deal as they are making it out to be. I have seen scrapers and proxies indexing my entire site with no penalties or reduction in rankings.
     
    oseymour, Nov 15, 2007 IP
  6. Danny.Antonio

    #6
    Hi!
    With the introduction of “Big Daddy,” Google crawls from many different data centers; they also changed the algorithm substantially at the same time. It appears that the changes include moving some of the duplicate content detection down to the crawlers.

    1. The original page exists in at least some of the data centers.
    2. A copy (proxy) gets indexed in one data center, and that gets sync’d across to the others.
    3. A spider visits the original, checks to see if the content is duplicate, and erroneously decides that it is.
    4. The original is dropped or penalized.

So … the problem is that flooding Google with massive amounts of duplicate content exposes a vulnerability: eventually the algorithm makes a mistake, and your content is no longer treated as the authoritative copy. :rolleyes:
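Nobody outside Google knows the actual algorithm, but the failure mode in the four steps above can be sketched with a toy model: assume an index that keeps only the first URL seen for each piece of content, so whichever copy syncs first "wins" canonicalization. All URLs below are made up for illustration:

```python
import hashlib

class ToyIndex:
    """Toy model of the failure mode described above: the index keeps
    only the first URL seen for each content hash, so whichever copy
    a data center syncs first is treated as canonical."""
    def __init__(self):
        self.canonical = {}  # content hash -> first URL indexed

    def crawl(self, url, content):
        key = hashlib.sha256(content.encode()).hexdigest()
        # Duplicate detection at crawl time: a later copy of already
        # known content is dropped, even if it is the true original.
        if key not in self.canonical:
            self.canonical[key] = url
        return self.canonical[key]

index = ToyIndex()
page = "my original article text"
# Step 2: the proxy copy reaches a data center first and gets synced...
index.crawl("http://evil-proxy.example/yoursite.com/page", page)
# Steps 3-4: the real page is now judged a duplicate and loses out.
winner = index.crawl("http://yoursite.com/page", page)
```

The toy obviously oversimplifies (real dedup uses link signals, age, PageRank, etc.), but it shows why flooding the index with proxied copies creates chances for the wrong URL to be picked as canonical.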
     
    Danny.Antonio, Nov 16, 2007 IP
  7. Pahrump Mike

    #7
How can I tell if it has happened to one of my sites? One of my sites has tanked in the rankings recently, for the second time in three months. It now only ranks for my top keyword, and it has dropped from 2 or 3 to 19 for that. In addition, I have lost all of my secondary keyword rankings. If I search for a term that I used to rank highly for, will I find my content at another address, or will it just be gone?
     
    Pahrump Mike, Nov 20, 2007 IP
  8. sussane

    #8
One of my blogs is a victim of this shltty proxy hacking :(. How do I get it back?
     
    sussane, Nov 20, 2007 IP