Hi, I'm new to robots.txt, but I realize I need to block some crawlers from my proxies, which are getting overloaded. Is it okay to deny all search engines from crawling my site, or is that stupid? Can I still use Google AdSense if I deny Googlebot access? Does anyone have a good robots.txt with the most common bots listed? Thanks, Stephanie
Of course you can NEVER deny access to the Google AdSense bot (Mediapartners-Google) and still expect to earn from Google!! Denying anything in robots.txt makes little sense except for the truly "good" bots - only those "good" bots accept and follow robots.txt rules. For a full denial of access it is infinitely more efficient to use .htaccess and list all denied bots there. That is a rule that forces bots out whether they like it or NOT. Denying the regular Googlebot may eventually work - or NOT - independently of the Mediapartners bot, but that only makes sense if you want no new visitors at all.
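A minimal sketch of both approaches. The robots.txt part asks polite crawlers to stay out while still letting the AdSense bot in; the .htaccess part (assuming Apache with mod_setenvif, 2.2-style access directives) actually enforces the block. The bot names listed are common examples only - check your own access logs for the ones actually hammering your server:

```
# --- robots.txt (only honored by well-behaved bots) ---
User-agent: Mediapartners-Google
Allow: /

User-agent: *
Disallow: /

# --- .htaccess (Apache; enforces the block regardless) ---
# BrowserMatchNoCase flags requests whose User-Agent matches the pattern
BrowserMatchNoCase "AhrefsBot" bad_bot
BrowserMatchNoCase "SemrushBot" bad_bot
BrowserMatchNoCase "MJ12bot" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

Note that robots.txt is purely advisory: scrapers that ignore it will keep hitting you, which is why the .htaccess rules are the part that actually saves memory.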
Thanks for the info. The proxies are sent out via newsletters, so no one will be coming from search engines anyway. My server keeps crashing due to lack of memory - I have 512 MB of RAM right now and only 5 proxies running. I thought maybe denying some crawlers would help...