Hey, I have a web proxy network, and I'm wondering if there's anything you can put in your .htaccess file to automatically redirect crawlers to another page so they won't pick up your site as a web proxy. I mainly mean school filters: they'd crawl your site, get redirected to a page about ponies, and wouldn't block you. Thanks.
Haha. If you don't care about traffic from search engines, you can block them via robots.txt and add redirection rules based on their hostname/user agent. This might not be very effective, though, because large search engines will also recrawl your site without identifying their bots; they do that specifically to detect cloaking.
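For the redirect part, something along these lines would work (a rough sketch only, assuming Apache with mod_rewrite enabled; the bot names and /ponies.html are just placeholders, and as noted above it won't catch a crawler that hides its user agent):

    # .htaccess - send anything identifying itself as a known crawler elsewhere
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp|duckduckbot) [NC]
    # don't redirect the target page itself, or the bot loops forever
    RewriteCond %{REQUEST_URI} !^/ponies\.html$
    RewriteRule ^ /ponies.html [R=302,L]

    # robots.txt - separate file in your web root, asks well-behaved bots not to crawl at all
    User-agent: *
    Disallow: /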