Taming the Vast Sea of Data: Commentary on CISA’s Strategy for 2021

Executives are very good at making decisions based upon risk, but cyber risk is still not clearly communicated in basic terms. This is a legacy issue in cyber, and much of what we build at CyberSaint seeks to address this problem. Evaluating outcomes is a complex, data-driven process, and we have been fortunate that some of our larger customers have been helping us drive innovation on this score across very large use cases, ones that require risk metrics across thousands of assets so that decisions about risk reduction can be made at the top.

I was heartened to see a recent post from Bob Kolasky, Assistant Director for the National Risk Management Center at the Cybersecurity and Infrastructure Security Agency (CISA), regarding the agency’s priorities for 2021 and the prioritization of risk-based thinking in the United States’ national cybersecurity strategy. I agree with Assistant Director Kolasky on his framing of the problem. Understanding cyber risk necessitates an “evolved approach,” as he says.

Historically, a big issue has been taming the surfeit, what he calls the vast sea of data, and getting it into contexts that allow for quick risk-based decisions by individuals who look at risk in terms of dollars, national security, or reputational harm.

We have built NLP to help tame the firehose of vulnerability data (along with other device and application data), which is part of the bottom-up analysis, and we have linked that to an RMF (Risk Management Framework) based approach at the top by automating controls, so that the emphasis can shift from a red-team mentality to a more proactive, preventative stance.
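To make the bottom-up half of that concrete, here is a deliberately simplified Python sketch. It is not our production pipeline: real taming of the firehose uses trained NLP models rather than keyword matching, and the control mappings and finding fields below are hypothetical.

```python
from collections import defaultdict

# Hypothetical keyword-to-control mapping. A real system would use a trained
# NLP classifier and the full NIST SP 800-53 control catalog.
CONTROL_KEYWORDS = {
    "SI-2 (Flaw Remediation)": ["patch", "outdated", "cve"],
    "AC-2 (Account Management)": ["credential", "account", "password"],
    "SC-8 (Transmission Confidentiality)": ["tls", "cleartext", "unencrypted"],
}

def bucket_findings(findings):
    """Group raw scanner findings under the RMF control they implicate."""
    buckets = defaultdict(list)
    for finding in findings:
        text = finding["description"].lower()
        for control, keywords in CONTROL_KEYWORDS.items():
            if any(kw in text for kw in keywords):
                buckets[control].append(finding)
                break  # file each finding under the first matching control
    return dict(buckets)

findings = [
    {"asset": "web-01", "description": "Outdated OpenSSL build, patch available"},
    {"asset": "db-02", "description": "Default account password unchanged"},
]
for control, items in bucket_findings(findings).items():
    print(control, "->", [f["asset"] for f in items])
```

Once findings are filed against controls rather than raw scanner output, control status can be automated, and the conversation shifts from individual tickets to overall posture.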

Really, it is about getting information out of a Babel-like state and into a clear, risk-based regime that translates risk into dollars, a very actionable and traditional metric, to use the assistant director’s apt phrasing.

One cannot walk telemetry or vulnerability data into a Board meeting, really. That would be like letting the Matrix into the room: curtains of green numbers that do not add up to anything. The data must first be tamed with an intelligent use of AI and NLP. Then that data must be associated with established risk metrics. This alignment is what the director is getting at when he says that “there is currently no ‘engine’ to capture all these data layers in a dynamic analytic tool.” That engine is the solution we at CyberSaint have been building, in cooperation with our partners in the Federal space and in private industry.

There is a proliferation of what one might call micro risk within applications. While tactical teams are sometimes able to manage these micro risks effectively, when it happens it is on an à la carte basis, with little insight or ability to report up and learn from these events. What needs to be understood is macro risk: strategic and business-process risk or, as the director says, national risk. So risk must also be aggregated and standardized, as in the sketch below. It is a challenge, but our largest customers are currently helping us solve these fundamental issues.
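As a toy illustration of that roll-up, with invented asset, process, and score values, aggregating micro risk up to the macro level can be as simple as grouping asset scores by the business process they support:

```python
import statistics

# Invented data: asset-level ("micro") risk scores tagged with the
# business process ("macro") each asset supports.
asset_risks = [
    {"asset": "web-01", "process": "online-payments", "risk": 7.2},
    {"asset": "db-02", "process": "online-payments", "risk": 8.9},
    {"asset": "hr-01", "process": "payroll", "risk": 3.1},
]

def rollup(assets):
    """Aggregate asset risk to the process level. Report both the worst
    case and the mean, since one critical asset can dominate a process."""
    by_process = {}
    for a in assets:
        by_process.setdefault(a["process"], []).append(a["risk"])
    return {
        process: {"max": max(scores), "mean": round(statistics.mean(scores), 2)}
        for process, scores in by_process.items()
    }

print(rollup(asset_risks))
# {'online-payments': {'max': 8.9, 'mean': 8.05}, 'payroll': {'max': 3.1, 'mean': 3.1}}
```

The hard part in practice is not the arithmetic but the standardization: every feed has to score risk on a comparable scale before aggregation means anything.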

A maxim often attributed to the late Peter Drucker holds that “what’s measured gets managed.” There are fewer unknowns than ever before, so even the hard-to-measure is coming into focus with automation. The trick is to get these insights translated across different linguistic regimes.

To that end, we introduced solution cost modeling into our platform to allow organizations to game out tools, processes, and labor decisions to mitigate risk based on current, relevant data rather than stale data. This more proactive approach, really hitting the first three functions of the NIST Cybersecurity Framework (Identify, Protect, and Detect), will radically improve cyber resiliency across the board in both private and public organizations.
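As a back-of-the-envelope illustration of what that modeling weighs (the dollar figures, occurrence rates, and option names below are invented), the classic annualized loss expectancy (ALE) and return on security investment (ROSI) calculations capture the basic trade-off:

```python
# Classic quantitative risk formulas, with invented example numbers:
#   ALE  = single loss expectancy (SLE) x annual rate of occurrence (ARO)
#   ROSI = (risk reduction - cost of the control) / cost of the control
SLE = 400_000  # assumed loss per incident, in dollars

def ale(sle, aro):
    return sle * aro

baseline_ale = ale(SLE, 0.50)  # expected annual loss with no new controls

options = [
    # (name, annual cost, assumed ARO after the mitigation is in place)
    ("EDR rollout", 120_000, 0.15),
    ("Patch automation", 60_000, 0.30),
    ("Additional analyst", 90_000, 0.40),
]

for name, cost, residual_aro in options:
    reduction = baseline_ale - ale(SLE, residual_aro)
    rosi = (reduction - cost) / cost
    print(f"{name}: risk reduced ${reduction:,.0f}/yr, ROSI {rosi:+.0%}")
```

Run against current data, this kind of comparison turns a tooling or staffing debate into a ranked list of expected risk reduction per dollar.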

*** This is a Security Bloggers Network syndicated blog from CyberSaint Blog authored by Padraic O'Reilly. Read the original post at: https://www.cybersaint.io/blog/cisa-commentary-2021