Hey, please check this article out: http://www.seofaststart.com/blog/google-proxy-hacking. Now my question: who knows how to fight back? I know the article describes it, but I need some help, and I will pay whoever knows how to execute it! Thanks
I think the easy step was the first one, but it didn't last: "we just blocked the proxies' IP addresses from accessing our server". The second step, in my opinion, though I don't know how to do it, would be to block requests that claim the Googlebot user-agent but don't actually come from Google (i.e., identify the real Googlebot). Maybe this is something stupid, but that's how I understand it.
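For that second step, the usual way to tell the real Googlebot from a proxy faking its user-agent is a double DNS check: reverse-resolve the visitor's IP, confirm the hostname ends in googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. Here is a minimal Python sketch of that idea (the helper name and the example IP are my own placeholders, not from the article):

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Double DNS check: reverse lookup, domain check, forward confirm."""
    try:
        # Reverse DNS: IP -> hostname
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward DNS: the hostname must resolve back to the same IP,
        # otherwise the reverse (PTR) record could simply be spoofed.
        _, _, addresses = socket.gethostbyname_ex(hostname)
        return ip in addresses
    except (socket.herror, socket.gaierror):
        return False

# Example: a request arrives claiming "Googlebot" in its user-agent.
# Serve it only if the source IP passes the check; otherwise treat it
# as a scraper/proxy and block it or return a 403.
print(is_real_googlebot("66.249.66.1"))  # an IP in a known Googlebot range
```

In production you would cache the results per IP so you aren't doing two DNS lookups on every request.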
I don't think it's as big a deal as they are making it out to be. I have seen scrapers and proxies indexing my entire site with no penalties or reduction in rankings.
Hi! With the introduction of “Big Daddy,” Google crawls from many different data centers; they also changed the algorithm substantially at the same time. It appears that the changes include moving some of the duplicate content detection down to the crawlers.
1. The original page exists in at least some of the data centers.
2. A copy (proxy) gets indexed in one data center, and that gets synced across to the others.
3. A spider visits the original, checks whether the content is duplicate, and erroneously decides that it is.
4. The original is dropped or penalized.
So the problem is that if you flood Google with massive amounts of duplicate content, it exposes a vulnerability. Eventually the algorithm makes a mistake, and your content is no longer authoritative.
How can I tell if it has happened to one of my sites? One of my sites has tanked in the rankings recently, for the second time in three months. It now only ranks for my top keyword, and it has dropped from position 2 or 3 to 19 for that. In addition, I have lost all of my secondary keyword rankings. If I search for a term that I used to rank highly for, will I find my content at another address, or will it just be gone?
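One way to check, following the article's description of the attack, is to search Google for an exact, distinctive sentence from your page in quotes: if a proxy copy has displaced you, the snippet shows up under someone else's domain instead of (or above) yours. A rough sketch that pulls a page and prints the quoted-phrase search URL to inspect by hand (the URL is a placeholder, and grabbing ten words from the middle of the text is just a heuristic):

```python
import re
import urllib.parse
import urllib.request

# Hypothetical page to check; replace with one of your own URLs.
PAGE_URL = "http://www.example.com/my-article.html"

# Fetch the page and strip tags to get plain text.
html = urllib.request.urlopen(PAGE_URL, timeout=10).read().decode("utf-8", "replace")
text = re.sub(r"<[^>]+>", " ", html)
words = text.split()

# Pick a run of ~10 consecutive words from the middle of the page;
# distinctive body copy is more likely to expose proxy copies than
# boilerplate navigation text.
middle = len(words) // 2
phrase = " ".join(words[middle:middle + 10])

# Search Google for the exact phrase in quotes. If the top result
# shows your sentence under a domain that isn't yours, a proxy copy
# has been indexed in place of the original.
query = urllib.parse.quote_plus(f'"{phrase}"')
print(f"https://www.google.com/search?q={query}")
```

If the quoted phrase returns your content at a proxy's address, that is the signature of this attack; if it returns nothing at all, the page may simply have been dropped for other reasons.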