Please tell me how to stop or limit an IP that downloads .rm and .mp3 files more than 10,000 times. I have two cases:

1. Code 206, up to 20 requests/second, 5 GB transferred or more.
2. Code 206, up to 20 requests/second, but less than 1 GB transferred.

I used DDoS protection and mod_evasive20.so:

<IfModule mod_evasive20.c>
    DOSHashTableSize 3097
    DOSPageCount 2
    DOSSiteCount 4
    DOSPageInterval 10
    DOSSiteInterval 10
    DOSBlockingPeriod 300
</IfModule>
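One caveat worth knowing before tuning further: mod_evasive keeps its hash table per Apache child process, so the DOSPageCount/DOSSiteCount thresholds are evaluated per child, and a flood spread across children can slip under low limits. A hedged sketch of a tighter configuration (the interval and blocking values are assumptions to adapt, and the DOSEmailNotify address is a placeholder):

```apache
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    # Per-URI and per-site request counts within the intervals below
    DOSPageCount        2
    DOSPageInterval     1
    DOSSiteCount        50
    DOSSiteInterval     1
    # Block offenders for an hour instead of five minutes
    DOSBlockingPeriod   3600
    # Placeholder address; mod_evasive mails a notice when it blocks an IP
    DOSEmailNotify      admin@example.com
</IfModule>
```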
Store the visitor's IP in a database along with time(), the file format, and a download count (default 1). Then, every time a file is accessed, check whether the visitor's IP is already in the database (if it isn't, do the insert above). Otherwise compare the current time with the stored time, and if the count has not reached the maximum (10,000), serve the file and run a query to increment the count by 1.
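A minimal sketch of that table-driven check, in Python with SQLite for illustration (the thread's script is presumably PHP, so treat this as the pseudocode made runnable; the schema, the daily reset window, and the 10,000 cap are assumptions taken from the post above):

```python
import sqlite3
import time

MAX_DOWNLOADS = 10000   # cap suggested in the post
WINDOW = 24 * 3600      # assumed: counts reset after a day

def allow_download(db, ip, fmt, max_count=MAX_DOWNLOADS, window=WINDOW):
    """Return True if this IP may download another file of this format."""
    now = int(time.time())
    row = db.execute(
        "SELECT first_seen, count FROM downloads WHERE ip = ? AND fmt = ?",
        (ip, fmt)).fetchone()
    if row is None:
        # First visit: insert with the default count of 1
        db.execute("INSERT INTO downloads VALUES (?, ?, ?, 1)", (ip, fmt, now))
        db.commit()
        return True
    first_seen, count = row
    if now - first_seen > window:
        # Stored time is stale: start a fresh window
        db.execute("UPDATE downloads SET first_seen = ?, count = 1 "
                   "WHERE ip = ? AND fmt = ?", (now, ip, fmt))
        db.commit()
        return True
    if count >= max_count:
        return False    # over the limit: refuse the download
    db.execute("UPDATE downloads SET count = count + 1 "
               "WHERE ip = ? AND fmt = ?", (ip, fmt))
    db.commit()
    return True

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE downloads "
           "(ip TEXT, fmt TEXT, first_seen INTEGER, count INTEGER)")
```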
Easier than a DB ... pseudo code:

if download requested
    if iplist.txt is older than 5 minutes
        kill iplist.txt
        touch iplist.txt (creates an empty, freshly dated file)
    endif
    load iplist.txt
    if the downloader's IP is in the file
        sleep 10 and serve an apology / "please wait" page
    else
        append his IP to the file and serve his download
    endif
endif

This won't inconvenience your current users.
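The steps above can be sketched in Python (the thread's stack is PHP/Apache, so this is just the logic made concrete; the file path and the 5-minute window are the assumptions from the post):

```python
import os
import time

IPLIST = "/tmp/iplist.txt"   # assumed location
WINDOW = 5 * 60              # wipe the list every five minutes

def may_serve(ip, path=IPLIST, window=WINDOW):
    """True if this IP has not downloaded within the current window."""
    now = time.time()
    # If the list is missing or older than the window, start it fresh
    if not os.path.exists(path) or now - os.path.getmtime(path) > window:
        open(path, "w").close()   # like "kill iplist.txt; touch iplist.txt"
    with open(path) as f:
        seen = f.read().split()
    if ip in seen:
        return False              # serve the "please wait" page instead
    with open(path, "a") as f:    # first request this window: record and serve
        f.write(ip + "\n")
    return True
```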
Hmm, what would be more effective in this case: writing to a file or storing in a DB? I'll leave that to others to decide...
Whether you want temporary or permanent blocking would dictate which I'd use. Or are you talking about it as a speed issue? For ease of use, I'd go with the DB.
A timestamp is definitely more time-consuming in a text file, so I'd agree, shallowink. Though I guess a methodology other than a timestamp could solve that.
Thanks to all of you. Could you explain step by step where and how I should add the pseudo code? Note: I have Linux 2.6.24-26-server on x86_64.
I take it that the script that handles the download is in PHP - you can email it to webmaster at freeads0 dot com. If you would rather not send me the whole script, just give me the part that does the actual download. I will mail you back the modifications and you can implement them.
Store the IPs in a file along with the current time. Then you can check the current IP against that file, and if the same IP visits again within the same second, kill the page. You could also add users who make at least 5 requests in a second to a blacklist and never allow them on your site again. This is pretty much what matt suggested; I've just fleshed out the explanation a bit more.
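A sketch of that blacklist idea in Python (in-memory only, for illustration; the 5-requests-per-second trigger is the number from the post, and a real deployment would persist the blacklist between requests):

```python
import time
from collections import defaultdict

hits = defaultdict(list)   # ip -> timestamps of recent requests
blacklist = set()

def check(ip, now=None, per_second_limit=5):
    """Reject blacklisted IPs; blacklist any IP hitting the per-second limit."""
    now = time.time() if now is None else now
    if ip in blacklist:
        return False
    # Keep only the requests from the last second, then record this one
    recent = [t for t in hits[ip] if now - t < 1.0]
    recent.append(now)
    hits[ip] = recent
    if len(recent) >= per_second_limit:
        blacklist.add(ip)   # at least 5 hits in one second: ban permanently
        return False
    return True
```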
That is much more labour-intensive: you have to write code to limit the size of the file and compare times, you potentially have a BIG file to load every time the code runs, a long search to do, and more code to kick out blacklisted users.

What I do is simply list the IPs and delete the file contents every couple of minutes, so nobody can download faster than one file every couple of minutes or so. It throttles everybody back to that speed, which is no inconvenience to real users. The file size is automatically kept small: even if you have 10,000 visitors an hour, it will only hold around 330 IPs, which takes no time to search. If you really want to ban users, you can simply add persistent offenders to a blacklist in .htaccess, and they won't even know the site is there. On my gallery site, over 5 years, I had four IPs like this, and I think they were probably 'zombie' PCs.
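The .htaccess ban mentioned here would look like the following in Apache 2.2 syntax (which matches the kernel era in this thread); the addresses are placeholders:

```apache
# Persistent offenders, added by hand
Order Allow,Deny
Deny from 203.0.113.7
Deny from 198.51.100.22
Allow from all
```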
I used:

order allow,deny
deny from ... (more than 100 IPs)
deny from ...
deny from ...
allow from all

Case 1: the IP sends up to 20 requests per second for each file, downloads up to 3 MB per file, and the server returns code 206 for up to 3 hours. It takes more than 5 GB; in some cases it takes 230 GB, 40 GB, or 10 GB.

Case 2: the IP sends up to 20 requests per second for each file, downloads up to 2,000 bytes, and the server returns code 206 for up to 3 hours. It takes less than 1 GB, in many cases 0.1 GB: 2,000 hits for one file.

Dears, kindly help me.
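To see which IPs are responsible for that 206 traffic, the access log can be tallied per client. A sketch in Python, assuming the standard Apache "combined" log format (the sample lines below are made up; point it at your real access.log in practice):

```python
from collections import Counter

# Hypothetical sample lines in Apache combined log format
sample = [
    '1.2.3.4 - - [10/Oct/2010:13:55:36 -0700] "GET /song.mp3 HTTP/1.1" 206 3145728',
    '1.2.3.4 - - [10/Oct/2010:13:55:36 -0700] "GET /song.mp3 HTTP/1.1" 206 3145728',
    '5.6.7.8 - - [10/Oct/2010:13:55:37 -0700] "GET /clip.rm HTTP/1.1" 200 40960',
]

def count_206(lines):
    """Count 206 (Partial Content) responses per client IP."""
    hits = Counter()
    for line in lines:
        fields = line.split()
        # In the combined format, the status code is the 9th field
        if len(fields) > 8 and fields[8] == "206":
            hits[fields[0]] += 1   # field 1 is the client IP
    return hits

# In practice: count_206(open("/var/log/apache2/access.log"))
```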
Some IPs continue to hit a few files ... the server returns a 403 code more than 10,000 times. They move on to other files when I remove some files ... and so on. Yes, I have big files, up to 40 MB (.rm & .mp3). I cannot write any code ... I cannot add your code ... I must try ... and try ... to solve this problem. Many thanks, my dear mattinblack.
Ah right, that sounds like a much more efficient way to do it. Thanks for explaining. Looks like you may need to guide nonowa through your suggestion a little further?