Five out of six businesses struggle daily with low-profile DDoS attacks that consume their bandwidth and resources, resulting in poor service levels and a degraded customer experience.
You know how, when you get to a certain age, feeling ‘good’ is not good enough? It may be fine for everyday life – obviously you don’t need to extract the most out of your brain and muscles for the day-to-day to-dos – but there is no guarantee that nothing is quietly hurting your performance, or silently growing.
In the information age, businesses rely more than ever on the speed and accuracy of data. It needs to be secured, and it needs to be clear to whoever receives it (human or machine). While most of us who generally feel ‘good’ do not worry much about health checks – certainly not on a daily basis – as long as we manage to do what we need to do, IT teams invest most if not all of their time making sure the information system performs at its best. Why? Because time is money.
Following information-security media outlets for the past year, one might think there is an IoT botnet on every corner bringing enterprise networks to their knees, but the reality is different. Despite the record-breaking volumes we saw in 2016, non-volumetric DDoS is still prevalent. This denial-of-service technique remains highly effective at exhausting network and server resources. Moreover, a non-volumetric attack can evade detection mechanisms and consume bandwidth and resources without the target knowing, degrading service-level quality.
56% of internet traffic is generated by bots
Some bots are good and, for the most part, regulated – search engine crawlers, automated trading and instant media updates – yet a significant share of internet traffic is generated by bad bots, from spammers and click-fraudsters to vulnerability scanners and malware spreaders. Traffic used to form DDoS attacks falls into that category as well.
Attack Size: Does It Matter?
Not everyone can generate such astonishing amounts of traffic. In 2016, less than 10% of server attacks qualified as extra-large (10Gbps or higher). Seven in 10 server attacks were below 100Mbps, of which 50% were 10Mbps or less. Despite the notorious IoT botnet attacks, those ranging from 10Gbps to 50Gbps decreased from 8% in 2015 to 3% in 2016. Why?
DDoS attackers are becoming more sophisticated and more familiar with the security solutions on the market today. They know that most protections limit the rate of traffic (with or without being able to distinguish legitimate user traffic from bad traffic), so they choose other techniques – for instance, low-and-slow attacks or, alternatively, short bursts. Perpetrators – hackers, hacktivists, business competitors and others – do not always intend to take the network down completely; they understand they can cause a noticeable impact by launching lower-profile attacks that, in many cases, the targeted organization will absorb without even noticing.
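To see why a per-source rate limit misses this kind of traffic, consider a toy simulation. This is only a sketch – the limiter, the thresholds and the bot counts are all invented for illustration, not drawn from any specific product – but it shows how a distributed low-rate attack stays under every per-source threshold while still generating substantial aggregate load:

```python
from collections import defaultdict, deque

# Hypothetical per-source rate limiter: flag any source exceeding
# LIMIT requests within a sliding WINDOW. Numbers are assumptions.
LIMIT = 100        # requests allowed per source per window
WINDOW = 10.0      # window length in seconds

class RateLimiter:
    def __init__(self):
        self.history = defaultdict(deque)  # source -> recent timestamps

    def allow(self, source, now):
        q = self.history[source]
        while q and now - q[0] > WINDOW:   # drop timestamps outside the window
            q.popleft()
        q.append(now)
        return len(q) <= LIMIT

limiter = RateLimiter()

# A single flooding source: 500 requests in one second -> quickly blocked.
flood_blocked = sum(
    0 if limiter.allow("flooder", t * 0.002) else 1 for t in range(500)
)

# A "low and slow" botnet: 500 sources, each sending one request every
# 2 seconds. Each source's rate is tiny, so nothing is ever blocked...
slow_blocked = 0
total_requests = 0
for tick in range(20):                 # 40 seconds of simulated time
    now = tick * 2.0
    for bot in range(500):
        total_requests += 1
        if not limiter.allow(f"bot-{bot}", now):
            slow_blocked += 1

print(flood_blocked)    # the flood trips the limiter hundreds of times
print(slow_blocked)     # the distributed attack is never flagged: 0
print(total_requests)   # ...yet the server absorbed 10,000 requests
```

The asymmetry is the whole point: the defense is keyed on per-source volume, while the attacker's cost is spread across many sources, each individually indistinguishable from a quiet legitimate client.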
Three in five respondents report a cyber-attack that is 10 million packets-per-second (PPS) or less, and about one fifth indicated they suffered an attack between 10 million PPS and 100 million PPS. The number of attacks that were 100 million PPS or less increased from 76% in 2015 to 82% in 2016. Those with 10 million PPS or less increased from 50% in 2015 to 63% in 2016.
Under such conditions, unfortunately, the network is not performing at its best – and consequently, neither is the business. While it may run well enough for the most part, sometimes feeling good can be deceiving. Even if most of the information flows accurately and securely, the organization’s resources are still being consumed, whether as a share of the pipe serving dirty traffic or as man-hours spent on log analysis. Users today expect a prompt response from any application or web page. If the attack doesn’t trigger any control, it flies below the radar, and the company doesn’t always realize it is not at its best.
(Healthy) Food for Thought:
To avoid losses of time, revenue and reputation, businesses should be aware of the situation and try to assess the impact of rubbish traffic on their operations and on their customers’ experience. These attacks do not reach the thresholds of most rate-limiting DDoS protections, and therefore go undetected. Traffic purification can only be done by a DDoS solution that leverages a behavioral-analysis algorithm – one that learns the baselines and patterns of legitimate requests in peacetime, and maintains the peace by cleaning unwanted requests as they come in.
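The baselining idea can be sketched in a few lines. Real behavioral DDoS engines model many dimensions (headers, paths, timing patterns, protocol behavior); this toy version, with a single requests-per-second metric and a z-score threshold, is purely an assumption-laden illustration of the "learn in peacetime, flag deviations" principle:

```python
import statistics

def learn_baseline(peacetime_rates):
    """Learn normal behavior: mean and stdev of requests-per-second samples."""
    return statistics.mean(peacetime_rates), statistics.stdev(peacetime_rates)

def is_anomalous(rate, baseline, threshold=3.0):
    """Flag a sample more than `threshold` standard deviations above normal."""
    mean, stdev = baseline
    return (rate - mean) / stdev > threshold

# Peacetime: steady traffic hovering around 200 requests/second.
peacetime = [195, 203, 198, 207, 201, 199, 204, 196, 202, 200]
baseline = learn_baseline(peacetime)

print(is_anomalous(205, baseline))   # ordinary fluctuation -> False
print(is_anomalous(400, baseline))   # sustained doubling -> True
```

The key property is that the threshold is relative to learned behavior, not a fixed rate cap, so even a modest sustained deviation stands out against a tight baseline.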
Getting the network’s health back into shape will immediately translate into better performance – and a ‘good’ feeling.
Read the 2016–2017 Global Application & Network Security Report by Radware’s Emergency Response Team.
Ben Zilberman is a product marketing manager in Radware’s security team. In this role, Ben works closely with Radware’s Emergency Response Team to raise awareness of high-profile and impending attacks. Ben has diverse experience in network security, including firewalls, threat prevention, web security and DDoS technologies.
Prior to joining Radware, Ben served as a trusted advisor at Check Point Software Technologies, where he led partnerships, collaborations and campaigns with system integrators, service and cloud providers, and implemented best practices of security design and management. He also serves as a commercial instructor and public speaker. He holds a BA in Economics and an MBA from Tel Aviv University.
*** This is a Security Bloggers Network syndicated blog from Radware Blog authored by Ben Zilberman. Read the original post at: https://blog.radware.com/security/2017/06/network-high-cholesterol/