I heard once about ways to do this; maybe you could help me with it. I want to prevent my site from being downloaded (by offline browsers or bots that collect info from the pages), because it's a lot of HTML pages that could easily be downloaded and changed. The way to do it, if I remember correctly, is to prevent IPs from downloading more than X MB in a certain period of time. If an IP reaches that amount, which isn't plausible for a normal user and means the site is actually being downloaded, the IP gets blocked. So, is there any script that does this, or any other method? Thanks, E-A.
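For reference, here is a minimal Python sketch of that idea: a sliding-window byte counter per IP. The thresholds and the `record_and_check` helper are made up for illustration; a real deployment would persist the counters somewhere shared and expire blocks eventually.

```python
import time
from collections import defaultdict, deque

# Hypothetical thresholds: block an IP that pulls more than
# 20 MB within any 10-minute window.
BYTE_LIMIT = 20 * 1024 * 1024
WINDOW_SECONDS = 600

# Per-IP history of (timestamp, bytes_served) events.
_history = defaultdict(deque)
_blocked = set()

def record_and_check(ip, bytes_served):
    """Record a response's size for this IP; return True if the IP
    should now be blocked (window total exceeds the limit)."""
    if ip in _blocked:
        return True
    now = time.time()
    events = _history[ip]
    events.append((now, bytes_served))
    # Drop events that have fallen out of the window.
    while events and events[0][0] < now - WINDOW_SECONDS:
        events.popleft()
    total = sum(size for _, size in events)
    if total > BYTE_LIMIT:
        _blocked.add(ip)
        return True
    return False
```

You would call `record_and_check(client_ip, len(response_body))` from your request handler and return a 403 whenever it reports True.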
Is there really a need to prevent offline browsing? If someone really wants to scrape your content, they will find some other way to beat your protection. I'm not 100% sure, but won't you also be blocking search-engine spiders if you implement this?
You are playing with fire when using an automated system like this. Wouldn't a better solution be to get a notification and then decide whether to ban the IP? There are legitimate reasons both bots and visitors might use a lot of bandwidth. An automatic ban will create problems for your good users, while the ones scraping will get around it anyway.
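As a rough sketch of that notify-first approach (the threshold and the `maybe_alert` helper are illustrative, not a specific library's API), you would swap the automatic block for a one-time alert per IP and leave the ban decision to a human:

```python
import logging

ALERT_LIMIT = 20 * 1024 * 1024  # hypothetical review threshold
_alerted = set()

def maybe_alert(ip, total_bytes_in_window):
    """Log a warning (once per IP) instead of banning automatically,
    so an admin can review the traffic and decide what to do."""
    if total_bytes_in_window > ALERT_LIMIT and ip not in _alerted:
        _alerted.add(ip)
        logging.warning(
            "IP %s pulled %d bytes in the current window; review before banning",
            ip, total_bytes_in_window,
        )
```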