Behavior Analysis: Getting an Inside Track on Insider Threats
Understanding human behavior can help organizations better identify and root out malicious insiders
Whether it’s a former student coming onto a college campus and destroying school computers with a malicious USB device, or some other presumably trusted individual who intends to harm the business, organizations are increasingly concerned about insider threats.
But what, or who, are organizations really worried about? The very definition of insider threat is ambiguous at best. Are we talking about nefarious actors engaging in criminal behavior for financial gain? A disgruntled employee? An employee who plans to move on and is exfiltrating sensitive data for her own gain? Or the poor chap who accidentally clicks on a phishing link?
According to a Gurucul survey conducted at this year’s RSA Conference, malicious insiders are the biggest concern for large enterprises and midsize companies, while user error was cited as the most detrimental insider threat by the 671 survey participants.
In a recent blog post, Lance Spitzner, director of SANS Security Awareness, wrote, “When discussing the Insider Threat with your organization or simply, your executives, you should never feel bad about taking a step back and asking people what the term ‘Insider Threat’ means to them.”
Once you have a working definition of the different types of insider threats to your organization, you can effectively start to look for the red flags that are most often overlooked.
Catch Me If You Can
While certain behaviors belong on any organization’s must-watch list, many companies lack a security team member skilled at identifying the root causes of human behavior, which are sometimes better indicators of intent than the patterns surfaced by behavior analytics.
Here’s an example shared by Eric Lackey, principal adviser, insider threat at Flashpoint. “A customer is suing a former executive who was informing a competitor. The competitor started up their own solar power company. The company goes back and does a forensic investigation of the senior executive’s computer and discovers he was sending sensitive data to his personal email,” Lackey explained. The culprit had also created PowerPoint presentations in the competitor’s name.
“That’s just so typical of what I see companies missing. It’s really low-hanging fruit. So many of the data thefts occur with employees downloading data and uploading it to their personal cloud storage or USB or sending it out via email. It’s not a matter of a file or two,” Lackey said.
Companies are missing these large-scale data thefts, hundreds of files at a time, and seeing them only post-mortem because the behavior went undetected, often for months.
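The volume-based failure mode described above can be sketched in code. The following is a minimal, hypothetical example (not from any product mentioned in this article): rather than alerting on individual files, it counts outbound transfers to personal destinations per user over a window and flags only users whose totals cross a bulk threshold. The event schema, destination labels, and thresholds are all assumptions for illustration.

```python
from collections import Counter
from datetime import datetime, timedelta

# Destinations commonly involved in exfiltration per the article:
# personal email, USB, personal cloud storage. Labels are illustrative.
RISKY_DESTINATIONS = {"personal_email", "usb", "personal_cloud"}

def flag_bulk_exfiltration(events, window_days=30, threshold=100):
    """Flag users whose outbound transfers in the window exceed threshold.

    `events` is a list of (user, timestamp, destination) tuples -- a
    hypothetical audit-log shape, not a real tool's format. Returns a
    dict of {user: transfer_count} for users over the threshold.
    """
    latest = max(ts for _, ts, _ in events)
    cutoff = latest - timedelta(days=window_days)
    counts = Counter(
        user
        for user, ts, dest in events
        if ts >= cutoff and dest in RISKY_DESTINATIONS
    )
    return {user: n for user, n in counts.items() if n >= threshold}
```

A per-file alert would fire (or be ignored) 150 separate times for a user copying 150 documents to personal email; the aggregate view surfaces that user once, with the full count attached.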
The Red Flags Often Overlooked
If companies are monitoring user behavior, why are they missing hundreds of files being exfiltrated by a malicious insider?
Organizations are not looking at the right behaviors, Lackey said. The problem is a combination of issues. “Insider threat programs are the science of where human behavior and the cyber realm intersect, but the goal is to get to a point where you can predict malicious behavior.”
That’s why organizations need to rethink which behaviors are on their must-watch lists, including activity on professional networking and social media sites as well as access anomalies and trends. “A lot of organizations focus on user behavior analytics tools or user activity monitoring tools and endpoint solutions, but they aren’t necessarily applying the human effort to their detection efforts,” he noted.
A Human Solution to the Problem of Human Behavior
People trained to identify risky behaviors, such as connecting an employee’s data exfiltration with the fact that she has been looking for a new job for the last three months, can tie together all the pieces of detection data from their tools and look at the whole story of a person’s behavior.
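The correlation an analyst performs here can be sketched as a toy signal-fusion score: no single input (a keyword hit, a transfer spike, job-search activity) escalates on its own, but their co-occurrence does. Every field name, signal, and threshold below is a hypothetical illustration of the idea, not a description of any vendor's scoring model.

```python
from dataclasses import dataclass

@dataclass
class UserContext:
    """Hypothetical per-user signals an analyst might correlate."""
    files_exfiltrated_30d: int   # outbound docs to personal destinations
    dlp_keyword_hits_30d: int    # keyword-match alerts from a DLP tool
    job_search_activity: bool    # e.g. spike in networking-site activity

def risk_score(ctx: UserContext) -> int:
    """Additive score that only gets high when signals co-occur."""
    score = 0
    if ctx.files_exfiltrated_30d >= 100:  # bulk transfer, strongest signal
        score += 2
    if ctx.dlp_keyword_hits_30d >= 1:     # keyword hit alone is weak
        score += 1
    if ctx.job_search_activity:           # contextual, not conclusive
        score += 1
    return score

def triage(ctx: UserContext) -> str:
    """Escalate only when multiple independent signals line up."""
    return "escalate" if risk_score(ctx) >= 3 else "monitor"
```

Under this sketch, a lone keyword alert stays at “monitor,” while the same alert from a user who has also moved hundreds of files and is visibly job hunting crosses the escalation line, which mirrors the whole-story analysis the article advocates.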
“Many companies rely on data loss prevention (DLP) solutions, for example, and DLP is great as a tool, but it’s also become a bit antiquated,” Lackey said. Identifying keywords and file names is useful, but many insider threat programs don’t account for the fact that the analyst looking at the screen may not see that the person has sent hundreds of documents. An alert triggered by a keyword may or may not get escalated as a threat because analysts aren’t looking at the full scope of what the person is doing.
To see the big picture and run a successful insider threat program, organizations need to identify staff who may already have the skills to observe and analyze human behavior.
“Train. Train. Train,” Lackey said. “There are people out there that have different skill sets, and we really need to come to the table to try and combine those skill sets—people who have come from a counterintelligence background or people who have come from auditing or fraud teams, where you can start to work together and learn from the other key indicators.”