Pandemic Leads to Increase in Human-Like Attacks

A new study shows a marked increase in cyberattacks that mimic human behavior

The pandemic didn’t just send millions of workers from their corporate offices to their dining room tables; it also drove an increase in overall online transactions. Research from Adobe Analytics found that consumers spent an extra $107 billion online between March and August because of the lockdowns and concerns about going into brick-and-mortar stores.

This has required businesses to rethink and reshape their business models to improve the e-commerce experience. Not surprisingly, cybercriminals have followed the action, and according to a new study from NuData Security, they are turning to human-like cyberattacks as a more sophisticated way to commit fraud.

“Human-looking or sophisticated attacks, those that focus on quality instead of volume, continue to increase,” the report stated. “Over the last six months, NuData found that almost all attacks against financial institutions were sophisticated attacks.”

Understanding Human-Looking Attacks

“Human-looking attacks emulate human behavior during a web transaction, but originate from a computer program or script. They attempt to evade technical countermeasures that organizations deploy to frustrate or block attackers that use normal, high-volume scripted attacks to perform malicious actions on a website or using mobile applications,” explained Robert Capps, vice president of market innovation at NuData Security, in an email conversation.

Human characteristics that are often emulated include typing rate, the speed between page interactions, mouse movement, page scrolling and browser identifiers, he added. Human-looking attacks can also solve CAPTCHAs and get past security layers designed to tell a human from a bot.
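To make this concrete, here is a minimal, hypothetical sketch (in Python) of the kind of timing heuristic a site might apply to behavioral telemetry. The function name, thresholds and signals are illustrative assumptions, not NuData's detection logic; production systems weigh far more signals than the handful shown here.

```python
import statistics

def looks_scripted(keystroke_intervals_ms, mouse_moves, min_jitter_ms=15.0):
    """Flag a web interaction whose timing looks too uniform to be human.

    keystroke_intervals_ms: gaps between successive key presses, in milliseconds
    mouse_moves: number of recorded mouse-move events in the session
    This is a toy heuristic; real detection combines many more behavioral signals.
    """
    if not keystroke_intervals_ms:
        # A form submitted with no typing telemetry at all is itself suspicious.
        return True
    jitter = statistics.pstdev(keystroke_intervals_ms)
    too_uniform = jitter < min_jitter_ms                      # humans type unevenly
    too_fast = statistics.mean(keystroke_intervals_ms) < 30   # implausibly fast typing
    no_mouse = mouse_moves == 0                               # no pointer use on the page
    return sum([too_uniform, too_fast, no_mouse]) >= 2        # flag if two signals agree

# Example: perfectly even 50 ms keystrokes with zero mouse movement looks scripted.
print(looks_scripted([50, 50, 50, 50, 50], mouse_moves=0))    # True
```

A script built to defeat a check like this simply randomizes its keystroke timing and injects synthetic mouse events, which is exactly the human emulation the report describes.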

“For organizations that lack sophisticated controls for automation, human emulation can create havoc for fraud and security evaluation controls, allowing for high-risk interactions to occur uninterrupted,” said Capps.

The Impact of These Attacks

According to the study, 96% of attacks on financial institutions were human-looking. There was also a 55% increase in high-risk mobile traffic. In addition, there were clear increases in human-looking cyberattacks from 2019 to 2020 across the e-commerce, streaming and travel industries.

“The pandemic has opened up a number of opportunities for fraudsters and cybercriminals to blend into the increased volume of online consumer interactions,” said Capps. “COVID-19 has forced many users to transact online for banking and retail transactions, and has greatly increased the adoption of streaming media, gaming and collaboration services.”

The impact of human-emulating automation is wide-ranging, but it affects individual organizations and industries in vastly different ways. For instance, the report found that across all industries most attacks happened at login, as account takeover (ATO) attacks. Attacks in travel, however, were more evenly distributed, with nearly half happening at checkout, and almost a third of e-commerce attacks took place at account validation pages, where customers can access information such as booking details, reward point balances, order status or account profiles.

“Such attacks can be used for new account creation to submit false reviews for products, collect rewards and incentives for new account generation, to facilitate free trial abuse or as a first step toward fraudulent commerce transactions at a later date,” Capps said.

“Human emulation is also used to bypass many basic anti-automation controls in place to protect login pages at financial institutions, service providers, retailers, etc., in order to test and utilize stolen consumer account credentials from data breaches—eventually turning into account takeover attacks,” he added. “We’ve also witnessed such techniques being used to bypass automation controls in place around the purchase of limited-supply goods such as designer clothes, shoes or even event tickets.”

What Organizations Can Do

Capps advises organizations facing human-emulating automation to be aware that they likely have a problem, even if it doesn’t result in immediate losses to their bottom line. “There are a number of financial impacts that stem from automated interactions, such as an increase in costs to support the computing infrastructure required to service these high-volume and low-value transactions, payment processing costs resulting from validating new credit cards added to accounts using automated scripts, and customer support costs associated with responding to and mitigating legitimate customer accounts that have been compromised by attackers, using automation.”

By turning to a layered approach with fine-grained automation detection, advanced device intelligence, behavioral analytics and passive biometrics, organizations can create a strong safety net to detect and mitigate the majority of automated interactions they might encounter.
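As a rough illustration of what such a layered approach could look like, the sketch below combines a device intelligence check, an automation score and a behavioral score into a single allow/step-up/block decision. All names, weights and thresholds are hypothetical assumptions for illustration, not any vendor's implementation.

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    # Hypothetical per-session inputs; field names and scales are illustrative only.
    known_device: bool        # device intelligence: has this device been seen before?
    automation_score: float   # 0.0 (human-like) .. 1.0 (clearly scripted)
    behavior_score: float     # 0.0 (matches the account's usual behavior) .. 1.0 (anomalous)

def risk_decision(signals: SessionSignals) -> str:
    """Combine independent detection layers into one decision.

    Each layer can only raise risk, and no single layer is trusted to clear a
    session on its own; that redundancy is the point of a layered defense
    against human-looking automation.
    """
    risk = 0.0
    if not signals.known_device:
        risk += 0.3
    risk += 0.4 * signals.automation_score
    risk += 0.3 * signals.behavior_score

    if risk >= 0.7:
        return "block"
    if risk >= 0.4:
        return "step-up"   # e.g., require additional verification before proceeding
    return "allow"

# Unknown device, strongly scripted-looking traffic, moderately unusual behavior.
print(risk_decision(SessionSignals(known_device=False,
                                   automation_score=0.8,
                                   behavior_score=0.5)))  # block
```

The step-up path matters as much as the block path: sessions that are merely suspicious can be challenged rather than rejected outright, which keeps friction low for the legitimate customers now transacting online in much greater numbers.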

Sue Poremba

Sue Poremba is a freelance writer based in central Pennsylvania. She's been writing about cybersecurity and technology trends since 2008.
