Well, I asked a question here, and the solution one person gave was to add lots of blocking code to my .htaccess. There is already a lot of mod_rewrite code in my .htaccess, so if I add that much more, won't it slow down my site considerably? If yes, then what should I do? If no, then... great.
If you're concerned about load and aren't on shared hosting, you can put any .htaccess code (paths and such may need changing) in your httpd.conf, which should speed it up, since httpd.conf is read once at startup rather than on every request.

The code you are considering will be absolutely fine. It's just comparing the user agent against a list of known bots. There's no regex, no checking whether a file exists, or anything like that.

Also, I'd say it's not just the size that matters, it's the complexity. For example:

```
RewriteRule somepagename.html otherpage.html
```

ought to be relatively easy for the server to handle. I don't know enough for a definitive answer, but I'd imagine a few hundred lines of that would be fine. On the other hand, I'd be very wary about having that many versions of something like:

```
RewriteCond %{HTTP_HOST} ^(www\.)?([0-9a-z-]+).domain.([0-9a-z\.]+)$ [NC]
RewriteCond %{REQUEST_URI} !-s
RewriteCond %{REQUEST_URI} !-d
RewriteCond %{REMOTE_IDENT}@%{REMOTE_HOST} !^friend1@client1.quux-corp\.com$
RewriteRule ..
```

and so on.
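For reference, the kind of user-agent blocking being discussed typically looks something like the sketch below. The bot names are placeholders, not a vetted list, and this uses the classic Apache 2.2-style access directives (Apache 2.4 would use `Require not env bad_bot` inside a `<RequireAll>` block instead):

```apache
# Flag requests whose User-Agent contains a known-bad bot name.
# "BadBotOne" / "BadBotTwo" are placeholder names for illustration only.
SetEnvIfNoCase User-Agent "BadBotOne" bad_bot
SetEnvIfNoCase User-Agent "BadBotTwo" bad_bot

# Allow everyone except requests flagged above (Apache 2.2 syntax).
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

Each `SetEnvIfNoCase` line is a single case-insensitive match against one request header, so even a few hundred of these is a very cheap per-request check compared to chained RewriteCond tests that hit the filesystem.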