You would think that Twitter and Facebook, with massive servers handling millions of people, would never face a DDoS problem, since they both have rate limiters to cope with excessive requests. But everyone's site is in danger, especially sites on shared web hosts. There are even unscrupulous web developers who will use a DDoS to beat you on Google SERPs. Twitter and Facebook DDoS Hacked ( Digg Here ) This must have been a very organized attack from a very large botnet. What can you do? Create scripts that track the number of visitors online over the last few minutes; if it climbs too high, say 900+ in 5 minutes, disable resource-intensive modules, code, or queries.
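The idea above can be sketched in a few lines. This is a minimal, hypothetical illustration, not production code: the 900-visitors-in-5-minutes threshold comes straight from the post, while the class and method names (`LoadShedder`, `shed_load`) are invented for the example.

```python
import time
from collections import deque

class LoadShedder:
    """Track recent visits and decide when heavy features should be
    switched off. Hypothetical sketch of the approach described above."""

    def __init__(self, limit=900, window_seconds=300):
        self.limit = limit          # e.g. 900 visitors...
        self.window = window_seconds  # ...per 5 minutes
        self.visits = deque()       # timestamps of recent requests

    def record_visit(self, now=None):
        now = time.time() if now is None else now
        self.visits.append(now)
        # Drop timestamps that have fallen out of the window.
        while self.visits and self.visits[0] < now - self.window:
            self.visits.popleft()

    def shed_load(self):
        """True when resource-intensive modules/queries should be disabled."""
        return len(self.visits) > self.limit
```

On each page view the app would call `record_visit()` and check `shed_load()` before running anything expensive (search, related-posts queries, and so on).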
And do what if you have over 900 visitors in 5 minutes? Shut down the website? Doesn't that count as denial of service itself? The whole point is to survive and keep the service online even when under attack. Keep in mind that some websites have experienced DDoS attacks for months. How do you beat them? Have loads of cash ready to buy extra bandwidth and protection from your datacenter, or move to a bigger datacenter if they can't handle it. The solution here is less technical and more financial. There's no such thing as the script fix you describe; I explained why above. rofl. A DDoS attack is far from anything that can be called a hack. Don't read too much into that article; the author doesn't seem to know much about the subject.
Excuse me, but I work at a security company; don't talk trash about things you don't understand. Where do you get the nerve to insult someone's work without showing any credentials? I didn't know you were the authority on what to call a hack... http://en.wikipedia.org/wiki/Hacktivism http://en.wikipedia.org/wiki/Hacking_(computers) http://en.wikipedia.org/wiki/Exploit_(computer_security) Please learn something before talking. And YES, there are rate limiters. Counting the number of requests made per minute from each source IP can be used to block someone running 64 threads of a flooding program, or an infected machine. Yes, LOTS of servers and LOTS of cash to buy servers can be a preventative method against DDoS attacks, but it's not the only way, and it's certainly not the cheapest way either (especially since the advice I was giving was for WEB DEVELOPERS who may not be able to afford dedicated servers) -- so you're wrong that it's not a technical subject. During a DDoS attack, unnecessary features should be disabled, or, where possible, cached versions should be served instead of requerying the database. Apparently you have no clue about throttling DDoS attackers. Obviously, against attacks that last for months, better defenses should be employed; shutting down certain services or features of your website is only a temporary measure to keep the whole site from crashing completely.
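The per-IP counting described above might look something like this. A minimal sketch under stated assumptions: the requests-per-minute threshold and the block duration are made up for illustration, and a real deployment would do this at the firewall or web-server layer rather than in application code.

```python
import time
from collections import defaultdict, deque

class IpRateLimiter:
    """Count requests per minute per source IP and temporarily block
    flooders. Illustrative sketch only; thresholds are invented."""

    def __init__(self, max_per_minute=120, block_seconds=600):
        self.max_per_minute = max_per_minute
        self.block_seconds = block_seconds
        self.hits = defaultdict(deque)   # ip -> recent request timestamps
        self.blocked_until = {}          # ip -> timestamp when block expires

    def allow(self, ip, now=None):
        now = time.time() if now is None else now
        # Refuse requests from IPs that are currently blocked.
        if self.blocked_until.get(ip, 0.0) > now:
            return False
        q = self.hits[ip]
        q.append(now)
        # Keep only the last 60 seconds of requests for this IP.
        while q and q[0] < now - 60:
            q.popleft()
        if len(q) > self.max_per_minute:
            self.blocked_until[ip] = now + self.block_seconds
            return False
        return True
```

Each incoming request calls `allow(client_ip)`; a `False` result means drop the request (or return a cheap 429-style error page) instead of doing any real work.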
I think pitagora hurt your feelings with his opinion. You know, you probably shouldn't post things if you hold them so dear. Maybe he was wrong to say it's not 'hacking', but 'hacking' usually involves gaining access to the files on the server, not just a flood.
I read the article as well. I have to admit that I wouldn't call rate limiting at the site level a "preventative method to defend against DDoS attacks" that I would recommend to my clients. While it is a step and something that needs to be in place, taking action at the rack level, with either a hardware firewall or a software-based one built into the router(s) or switch(es), would be a better method. Creating scripts isn't necessary when well-tested ones already exist. I do web hosting for a living. I learned a long time ago that having decent servers and not overloading them goes a long way toward keeping clients happy. We get slashdotted or promoted by some other form of "mass media" about 3-5 times a day, so we go through this constantly. 900 visitors in 5 minutes isn't that much, and with preparation you should be able to handle it. Having all the servers in the world isn't going to stop an extreme attack; even a well-known blogging site with its three datacenters and racks and racks of servers gets knocked offline every so often.
^ fail. His replies FAIL. Brian Albright is made of fail too. Maybe if you used that brain of yours, common sense would be brewing, but no. So much for that _security job_ of yours. All in all, it's just hype the media can eat up and feed the public, when in reality it's just some Russians being angry at someone from Georgia, plus the lovely spoofed SYN flood. It gets to a point where IDSes and filters are so swamped that they can't handle it anymore. Don't be a whitehat from Georgia, or pull a Mitnick and blame your host time after time, jajajaj. Need I say more? Twitter got DDoSed majorly, and the media found out for the first time because Twitter is so damn popular nowadays; they did their silly little trolling to feed you LOL "facts" about the story, and now the public is even more scared of botnets and DDoS and all this hyped-up bullshit. Long live Project Mayhem.
Hey, learn English first. You are an uneducated moron. Thank you for your detailed and professional experience with SYN floods, rofl!!! Did you drop out of high school? Are you a full-time script kiddie now? Go somewhere where someone actually cares what you think.
You're right: in the end, if someone really wants to DDoS your site, it's going to go down; you cannot fully prevent it. BUT THAT DOESN'T MEAN you shouldn't do anything. Many developers cannot afford firewalls or professional scripts, don't have access to their own servers, and don't have the funds to buy datacenters. These are the people I was giving ideas to. In fact, rate limiting and disabling modules on detection of a DDoS attack (or just the Digg effect) is standard practice among many developers, and some CMS programs even ship modules for it.
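The "serve a cached version instead of requerying" idea mentioned earlier in the thread can be sketched like this. Assumptions are labeled in the code: `render_expensive` is a stand-in for whatever database-heavy rendering a real CMS would do, and the TTL and the "serve stale under attack" rule are invented for the example.

```python
import time

class CachedFallback:
    """Serve a cached (possibly stale) page instead of rerunning an
    expensive query when the site is under heavy load.
    Hypothetical sketch; `render_expensive` is a placeholder callable."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self.cache = {}  # key -> (timestamp, rendered page)

    def get(self, key, render_expensive, under_attack=False, now=None):
        now = time.time() if now is None else now
        entry = self.cache.get(key)
        fresh = entry is not None and now - entry[0] < self.ttl
        # Under attack, any cached copy (even a stale one) beats a requery.
        if fresh or (under_attack and entry is not None):
            return entry[1]
        page = render_expensive()  # the expensive database work
        self.cache[key] = (now, page)
        return page
```

Combined with a load detector like the one earlier in the thread, `under_attack` would flip on automatically once traffic passes the threshold, so visitors keep getting pages while the database sits idle.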