robots.txt to deny all spiders for proxy?

Discussion in 'robots.txt' started by Mysmasken, Nov 29, 2007.

  1. #1
    Hi, I'm new to robots.txt but I realize I need to block some things on my proxies, which are getting overloaded.
    Is it okay to deny all search engines from crawling my site? Is that stupid?
    Can I still use Google AdSense if I deny Googlebot access?

    Does anyone have a good robots.txt with the most common bots listed?

    Thanks
    Stephanie
     
    Mysmasken, Nov 29, 2007 IP
  2. hans (Well-Known Member)
    #2
    Of course you can NEVER deny access to the Google Mediabot and still expect to earn from Google!

    Denying anything in robots.txt only works on the truly "good" bots - only those "good" bots read and follow the robots.txt rules.
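
    For example, a minimal robots.txt along these lines (assuming the AdSense crawler still identifies itself as Mediapartners-Google) asks every compliant crawler to stay out while leaving the AdSense bot alone - and again, only well-behaved bots will honour it:

        # Let the AdSense crawler keep fetching pages so ads keep serving
        User-agent: Mediapartners-Google
        Disallow:

        # Ask every other compliant crawler to stay out of the whole site
        User-agent: *
        Disallow: /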

    For a full denial of access it is far more effective to use .htaccess and list all denied bots there. That is a rule that keeps bots out whether they like it or NOT.
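
    As a rough sketch, assuming Apache with mod_setenvif enabled, an .htaccess block like the one below rejects requests by User-Agent string no matter what robots.txt says (the bot names are placeholders, not a recommended block list):

        # Flag requests whose User-Agent matches a bot we want to keep out
        SetEnvIfNoCase User-Agent "MJ12bot"  bad_bot
        SetEnvIfNoCase User-Agent "HTTrack"  bad_bot
        SetEnvIfNoCase User-Agent "larbin"   bad_bot

        # Refuse (403 Forbidden) anything flagged above
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot

    The same thing can be done with mod_rewrite and a RewriteCond on %{HTTP_USER_AGENT}, but the SetEnvIf style stays easy to read as the list grows.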

    Denying the regular Googlebot access may work - or NOT - independently of the Google Mediabot, but that only makes sense if you don't want any new visitors from search at all.
     
    hans, Nov 29, 2007 IP
  3. whitelion (Peon)
    #3
    Good info.
     
    whitelion, Nov 30, 2007 IP
  4. Mysmasken (Active Member)
    #4
    Thanks for the info. The proxies are sent out via newsletters, so no one will be coming from search engines anyway.

    My server keeps crashing due to lack of memory; I have 512 MB of RAM right now and only 5 proxies running. I thought maybe denying some crawlers would help...
     
    Mysmasken, Nov 30, 2007 IP