questions about proxy hijack

Discussion in 'Reviews' started by trichnosis, Nov 6, 2007.

  1. #1
    Hi,

    I have a web site that is nearly 18 months old.

    My site gets great SERP positions for 2-3 days each month, but as soon as it reaches those positions I see it being hijacked by more than one proxy site, and because of this it loses the rankings again.

    I understand Google does not care about search quality :mad:. Google does not care about the web being hijacked by proxy sites :mad:; they only focus on AdSense to make more and more money.

    But how can I protect myself from these cheaters without Google's help?

    Thanks for the comments
     
    trichnosis, Nov 6, 2007 IP
  2. webdev11

    You can send alternate content to the proxies.

    Get their IP addresses with the Firefox ShowIP extension. Then use some conditional code in your site that says "if accessed from the proxy, send alternate content".

    Send a page with some inbound links to your Web site. :D
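    A minimal sketch of that idea in PHP. The IP addresses and the alternate page's filename below are made-up placeholders; you would substitute the proxy IPs you collected yourself:

```php
<?php
// Serve alternate content to requests arriving from known proxy IPs.
// The addresses below are documentation examples, not real proxies.
$proxy_ips = array('203.0.113.10', '198.51.100.25');

function is_known_proxy($ip, $proxy_ips) {
    return in_array($ip, $proxy_ips, true);
}

if (isset($_SERVER['REMOTE_ADDR'])
        && is_known_proxy($_SERVER['REMOTE_ADDR'], $proxy_ips)) {
    // Hypothetical substitute page, e.g. one linking back to your own site.
    readfile('alternate-content.html');
    exit;
}
// Otherwise fall through and render the normal page.
```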
     
    webdev11, Nov 6, 2007 IP
  3. ForgottenCreature

    You can use code that redirects any framing page back to your main site.
     
    ForgottenCreature, Nov 6, 2007 IP
  4. trichnosis

    I have tried this with JavaScript, but the proxy sites all strip the JavaScript out :(
     
    trichnosis, Nov 6, 2007 IP
  5. zangief

    I don't think you can fight the proxies, because the same thing is happening on one of my sites.
    This has been a problem for more than a year, and as far as I can see Google has not found a solution to it.
    Maybe we should consider building a proxy ourselves :p
     
    zangief, Nov 6, 2007 IP
    trichnosis likes this.
  6. Slincon

    I'm working on something:
    http://www.seotest.uni.cc/googlebot-detect.php

    So far SEOEggHead has worked out a solution, but it's very resource-intensive (it requires cURL plus a SQL table) and a little clunky. My way is much the same, only without IP blocking: detect Googlebot in the user agent, and if the user agent is Googlebot, let it index the site; otherwise serve noindex/nofollow meta tags.

    We don't care whether the user agent is faked, because it makes no difference; we're just trying to let Googlebot access the page (we could also do more checks, but for now they're unnecessary). The problem with both my code and SEOEggHead's is that proxies can strip meta tags and JavaScript, which makes them ineffectual.
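    For reference, the "more checks" mentioned here would typically be forward-confirmed reverse DNS, which is the method Google itself documents for verifying Googlebot. A sketch in PHP; the hostname suffix check is split into its own function so the logic can be tested without network access:

```php
<?php
// Verify that a visitor claiming to be Googlebot really is one:
// reverse-resolve the IP, check the hostname's domain, then
// forward-resolve the hostname and confirm it matches the IP.

function has_google_suffix($host) {
    // Googlebot hostnames end in .googlebot.com or .google.com
    return (bool) preg_match('/\.(googlebot|google)\.com$/i', $host);
}

function is_real_googlebot($ip) {
    $host = gethostbyaddr($ip);  // reverse lookup (returns the IP unchanged on failure)
    if ($host === false || !has_google_suffix($host)) {
        return false;
    }
    return gethostbyname($host) === $ip;  // forward-confirm the hostname
}
```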

    However, I'm also working on an IP-banning script that I'll release for free: a quick, simple way of blocking these proxies and making sure they don't get indexed in the first place. It's working so far on a site I have. I don't want to give anything away just yet.
     
    Slincon, Nov 7, 2007 IP
  7. azuka

    Hello Slincon:

    I hope you can finish the script soon... I'm tired of these proxies... I tried the SEOEggHead solution (the first one), but when I installed the code I got an error. By the way, the code:

    if(!stristr($_SERVER['HTTP_USER_AGENT'],'googlebot')) { echo ''; }
    Code (markup):
    Where can I place it?

    Thank you :)
     
    azuka, Nov 8, 2007 IP
  8. azuka

    So, I don't understand... this code doesn't work (the proxy-hijack solution):

    <?php
    if (@fsockopen($_SERVER['REMOTE_ADDR'], 80, $errno, $errstr, 30)) {
        exit('Your message goes here when people visit your site through a proxy');
    }
    ?>
    Code (markup):
    :(
     
    azuka, Nov 8, 2007 IP
  9. trichnosis

    I'm currently using this code, but it makes Googlebot's activity slower; Googlebot is visiting fewer pages than before with the code in place. I really hate proxy sites.

    I think there is only one answer to these cheaters (= proxy sites). I don't think Google wants to solve this 1 YEAR OLD PROBLEM :mad:, and I believe the solution is DDoSing all of the proxy sites :D
     
    trichnosis, Nov 9, 2007 IP
  10. Slincon

    Sorry, I didn't catch this post until I checked my UserCP just now. That's not the full code; here's what's currently being used (place it between your head tags, where all your meta tags are):

    <html>
    <head>
    <?php
    // Emit a noindex/nofollow meta tag for anything that isn't Googlebot
    if (!stristr($_SERVER['HTTP_USER_AGENT'], 'googlebot')) {
        echo '<meta name="robots" content="noindex, nofollow">';
    }
    ?>
    </head>
    <body></body>
    </html>
    
    Code (markup):
    What it does is sniff the user agent: if it says it's Googlebot, nothing happens; otherwise it adds a noindex, nofollow meta tag to stop the page from being indexed or cached. The problem is that right now there are only two ways to stop Google from spidering a page: 1) robots.txt (which you won't have access to on the proxy's server), and 2) meta tags. Some proxies have the ability to remove meta tags, and if one does that to you, this probably won't work unless we obfuscate the meta tags; but that's risky, since we want Google to recognize and read them. I have another script in the works that's a little more advanced and can get around this meta problem; I'll put the code out once I know it's working for sure.
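    One way around proxies that strip meta tags, assuming your pages run through PHP: send the same directive as an HTTP response header instead. Google announced support for the X-Robots-Tag header in 2007, and a proxy script that rewrites the HTML but passes response headers through untouched would leave it intact. A sketch (the helper returns the header value so it can be tested):

```php
<?php
// Same user-agent check as the meta-tag version, but delivered as an
// X-Robots-Tag HTTP header rather than markup, so an HTML-rewriting
// proxy can't strip it. Must run before any output is sent.

function robots_header_for($user_agent) {
    if ($user_agent !== '' && stristr($user_agent, 'googlebot')) {
        return null;  // claimed Googlebot: no restriction
    }
    return 'X-Robots-Tag: noindex, nofollow';
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$h = robots_header_for($ua);
if ($h !== null && !headers_sent()) {
    header($h);
}
```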

    lol, yeah, I wish it were that simple. The code you posted is faulty, and I wouldn't use it. What it does is take the visitor's IP and then attempt to open a connection back to that IP on port 80 (the HTTP port, which open proxies commonly have open; web proxies don't necessarily have port 80 reachable). If it can connect, it assumes the visitor is a proxy; if it can't, it lets the user through. The problem is that this will work for real (open) proxies but not for web proxies, because web proxies are server-side scripts and don't listen on open ports. All you'd be detecting is whether a user is browsing through an open proxy.

    Also, every visit to your site means your server opens an extra outbound connection, which slows your site down and creates unnecessary requests.
     
    Slincon, Nov 12, 2007 IP
  11. ernestjev

    A proxy can't hijack your domain or site; do you mean hijacking search results? Many sites use meta tags like this: Winamp, imb.com, and so on, lol.
     
    ernestjev, Nov 12, 2007 IP
  12. Fisker

    What do you mean by hijacking? How can proxies hijack a site?
     
    Fisker, Nov 12, 2007 IP
  13. trichnosis

    I suggest you read http://www.seofaststart.com/blog/google-proxy-hacking .

    Proxy sites are the new way of cheating. :mad:
     
    trichnosis, Nov 12, 2007 IP
  14. Slincon

    They can cause your site, or pages of it, to be bumped as duplicate content. This includes pages that have already been indexed by Google: Google re-spiders pages every now and then, and when it crawls one of these proxy caches it can end up thinking the proxy copy is the original site; when it then gets to your site, it misinterprets your content as the duplicate and penalizes you for it (supplemental results).

    I don't want to create any unnecessary fear, but Matt Cutts has discussed it, and there's been no real word from Google on what they plan to do, so it is still a viable technique for black-hat SEOs.

    SEOEgghead has released something for it, but as I said before, it's long-winded and resource-intensive. I've tried to write something (which I've posted above), and I'm working on something a little more helpful, since the code above is very simple and will only stop the most basic proxy caching; but it's a start nonetheless. If you want to help me, you can link to this: http://www.seotest.uni.cc/googlebot-detect.php (Google is taking forever to index it). I'm testing whether it works; I have two scripts under test (the one I linked to is the basic one whose code I've posted here), and I'm testing the other elsewhere so as to keep them separate. I'll post it when I'm sure it actually works. We really need a simple solution for this, and since Google is ignoring the 800-pound gorilla in the room, we'll have to help each other.
     
    Slincon, Nov 12, 2007 IP
  15. stickycarrots

    The safe thing to do is close your site and go home.
     
    stickycarrots, Nov 12, 2007 IP