
RSAC insights: Why vulnerability management absolutely must shift to a risk-assessment approach

Vulnerability management, or VM, has long been an essential, if decidedly mundane, component of network security.

Related: Log4J’s long-run risks

That’s changing — dramatically. Advanced VM tools and practices are rapidly emerging to help companies mitigate a sprawling array of security flaws spinning out of digital transformation.

I visited with Scott Kuffer, co-founder and chief operating officer of Sarasota, FL-based Nucleus Security, which is in the thick of this development. Nucleus launched in 2018 and has grown to over 50 employees. It supplies a unified vulnerability and risk management solution that automates vulnerability management processes and workflows.

We discussed why VM has become acutely problematic yet remains something that’s vital for companies to figure out how to do well, now more so than ever. For a full drill down, please give the accompanying podcast a listen. Here are the key takeaways:

Multiplying exposures

Scan and patch. Historically, this has been what VM was all about. Microsoft’s Patch Tuesday speaks to the never-ending flow of freshly discovered software flaws, and to the software giant’s best efforts to ease the burden of mitigating them.

Scan-and-patch systems from Tenable, Qualys, Rapid7, Checkmarx and others came along and became indispensable. Enterprises could use these tools to keep software bugs and security holes patched at a tolerable level, not just the flaws in Microsoft code, of course, but those in software from all of their suppliers, internal and external.

However, that scan-and-patch equilibrium is no more. Digital transformation has spawned a cascade of nuanced, abstract vulnerabilities – and they’re everywhere. This results from companies chasing agile software development and cloud-centric innovations above all else. In aggressively leveraging digital services to achieve productivity gains, they’ve also exponentially multiplied security gaps across a steadily expanding network attack surface.

“There are configuration weaknesses in cloud resources, vulnerabilities that need patching, like Log4J, and vulnerabilities in the business logic of old code,” Kuffer observes. “There are many different types of these vulnerabilities, and it’s a matter of figuring out who owns them and how to fix them quickly, and, also, which ones to fix.”

Current threat landscape

So, what exactly constitutes a vulnerability these days? As Kuffer alluded to not long into our discussion, it goes far beyond the bug fixes and security patches that Microsoft, Oracle, Adobe and every supplier of business software distributes periodically.


A vulnerability, simply put, is a coding weakness by which software can be manipulated in a way that was never intended. Today, these exposures lurk not just in legacy enterprise apps, the ones that need continual patching; they’re turning up even more in the cloud-hosted storage buckets, virtual servers and Software-as-a-Service (SaaS) subscriptions that have become the heart of IT operations.

It all starts with DevOps, under which agile software is pushed out based on the principle of continuous integration and continuous delivery (CI/CD). Much heralded, CI/CD is a set of principles said to result in the delivery of new software frequently and reliably.

Truthfully, CI/CD is really nothing more than an updated version of rushing shrink-wrapped boxes of new apps to store shelves. Remember when early adopters were giddy to receive the bug-riddled version 1.0 of a cool new app, anticipating that the major bugs would get fixed in 1.1, 1.2 and so on?

Under CI/CD, developers collaborate remotely to press new code into live service as fast as possible and count on making iterative fixes on the fly.

This fail-fast imperative often leverages cloud storage and virtual servers; code development revolves around interconnecting modular microservices and software containers scattered all across public and private clouds.

To malicious hackers this translates into a candy store of fresh vulnerabilities. In many ways it’s easier than ever for threat actors to get deep access, steal data, spread ransomware, disrupt infrastructure and maintain long-run unauthorized access.

Unified solution

All of that said, it’s not so much the agile software trend, in and of itself, that’s to blame. Security gaps generally — and vulnerabilities specifically — have surpassed the tolerable level in large part because companies have not paid nearly enough attention to configuring their public cloud and hybrid cloud IT systems. In short, software interconnections are skewed toward agility.

Fine tuning is in order, and there’s really no mystery about how to dial in the necessary measure of security. Robust data security frameworks have been painstakingly assembled and vetted by the National Institute of Standards and Technology (NIST). However, adhering to NIST 800-53 and NIST 800-171 is voluntary and, for whatever reasons, far too many enterprises have yet to fully embrace robust data security best practices.

To illustrate, Kuffer pointed me to the all-too-common scenario where a company goes live with an AWS root account that uses a default password to access all of its EC2 virtual servers and S3 storage buckets. “That misconfiguration is a vulnerability because anybody who finds that password and then logs in to your AWS account has full admin control over your entire cloud infrastructure,” he says.
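For readers who want a concrete picture of catching that kind of gap, here is a minimal sketch of an automated posture check against the root-account scenario Kuffer describes. It assumes Python with the boto3 AWS SDK and read-only IAM credentials, and it is illustrative only, not Nucleus Security’s tooling.

```python
# Illustrative only: a quick posture check for the root-account scenario
# described above. Assumes boto3 is installed and read-only IAM credentials
# are configured in the environment.
import boto3

def check_root_account_posture() -> list[str]:
    iam = boto3.client("iam")
    findings = []

    # Account-wide summary includes flags for root MFA and root access keys.
    summary = iam.get_account_summary()["SummaryMap"]
    if not summary.get("AccountMFAEnabled"):
        findings.append("Root account has no MFA enabled.")
    if summary.get("AccountAccessKeysPresent"):
        findings.append("Root account has active access keys.")

    # A missing password policy is another sign of default, unhardened setup.
    try:
        iam.get_account_password_policy()
    except iam.exceptions.NoSuchEntityException:
        findings.append("No account password policy is set.")

    return findings

if __name__ == "__main__":
    for finding in check_root_account_posture():
        print("MISCONFIGURATION:", finding)
```

The point is not the specific checks but that weaknesses like these can be surfaced programmatically, and continuously, rather than discovered after an intruder does.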

This kind of thing can be rectified by adopting risk-assessment principles alongside CI/CD. And the good news, Kuffer says, is that the cybersecurity industry is driving towards helping companies get better at systematically identifying, analyzing and controlling vulnerabilities. Nucleus Security refers to this as a shift towards risk-based vulnerability management.

Risk-tolerance security

VM done from a risk-assessment lens boils down to enterprises making a concerted effort to discover and thoughtfully inventory all of the coding flaws and misconfigurations inhabiting their increasingly cloud-centric networks, and then doing triage based on risk-tolerance principles.
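To give a rough sense of what triage on risk-tolerance principles looks like in practice, here is a minimal Python sketch that ranks findings by a blended risk score rather than by raw severity alone. The fields, weights and fix-now threshold are hypothetical illustrations, not any vendor’s actual scoring model.

```python
# A minimal sketch of risk-tolerance triage: rank findings by a blended
# risk score instead of raw CVSS. Weights and threshold are illustrative.
from dataclasses import dataclass

@dataclass
class Finding:
    name: str
    cvss: float              # base severity, 0-10
    exploit_available: bool  # known public exploit or active exploitation
    asset_criticality: int   # 1 (lab box) to 5 (revenue-critical system)
    internet_facing: bool

def risk_score(f: Finding) -> float:
    # Scale severity by how much the business cares about the asset,
    # then boost for exploitability and exposure.
    score = f.cvss * (f.asset_criticality / 5)
    if f.exploit_available:
        score *= 1.5
    if f.internet_facing:
        score *= 1.25
    return round(score, 2)

findings = [
    Finding("Log4Shell on payment API", 10.0, True, 5, True),
    Finding("Weak TLS cipher on intranet wiki", 5.3, False, 2, False),
]

RISK_TOLERANCE = 7.0  # fix-now threshold set by the business
for f in sorted(findings, key=risk_score, reverse=True):
    action = "fix now" if risk_score(f) > RISK_TOLERANCE else "schedule"
    print(f"{risk_score(f):6.2f}  {action:9}  {f.name}")
```

The design choice that matters here is that the business, not the scanner, sets the threshold: the same CVSS 5.3 flaw can be routine on a lab box and urgent on an internet-facing payment system.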

This absolutely can be done, ironically, because cybersecurity vendors themselves are innovating off the strengths of cloud resources and agile software. At RSA Conference 2022, opening next week in San Francisco, there will be considerable buzz around new tools and frameworks that empower companies to discover and inventory software bugs and misconfigurations, cost-effectively and at scale.

This includes a host of leading-edge technologies supporting emerging frameworks such as cyber asset attack surface management (CAASM), cloud security posture management (CSPM), application security posture management (ASPM), and even software-as-a-service security posture management (SSPM).

Specialized analytics platforms, like those from Nucleus Security and other suppliers of advanced VM technologies, fit in by enabling companies to ingest security posture snapshots from all quarters, Kuffer says. Advanced VM systems are designed to efficiently implement and enforce wise policy, without unduly disrupting agility, he says.
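As a rough illustration of what ingesting posture snapshots “from all quarters” involves, the Python sketch below normalizes findings exported by two hypothetical scanners into a single shared record, so that ownership and policy can be applied in one place. The tool names and field mappings are assumptions made for illustration, not an actual Nucleus Security integration.

```python
# A rough sketch of the ingestion idea: normalize findings from different
# scanners into one shared record. Tool names and field mappings are
# hypothetical, chosen only to illustrate the pattern.
from typing import Any

def normalize(tool: str, raw: dict[str, Any]) -> dict[str, Any]:
    if tool == "cloud_cspm":          # e.g. a CSPM misconfiguration export
        return {"asset": raw["resource_id"], "issue": raw["rule"],
                "severity": raw["severity"].lower(), "source": tool}
    if tool == "network_scanner":     # e.g. a classic scan-and-patch tool
        return {"asset": raw["host"], "issue": raw["plugin_name"],
                "severity": raw["risk"].lower(), "source": tool}
    raise ValueError(f"no mapping for {tool}")

snapshots = [
    ("cloud_cspm", {"resource_id": "s3://finance-exports",
                    "rule": "PublicReadACL", "severity": "HIGH"}),
    ("network_scanner", {"host": "10.0.4.17",
                         "plugin_name": "OpenSSH < 8.9", "risk": "Medium"}),
]

unified = [normalize(tool, raw) for tool, raw in snapshots]
for record in unified:
    print(record)
```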

Clearly, the frameworks and technology are ready for prime time. If the continuing ransomware scourge and widening supply chain hacks tell us anything, it’s that it’s high time for companies to dial back on agility and dial in more security. I’ll keep watch and keep reporting.


Pulitzer Prize-winning business journalist Byron V. Acohido is dedicated to fostering public awareness about how to make the Internet as private and secure as it ought to be.


(LW provides consulting services to the vendors we cover.)

*** This is a Security Bloggers Network syndicated blog from The Last Watchdog authored by bacohido. Read the original post at: https://www.lastwatchdog.com/rsac-insights-why-vulnerability-management-absolutely-must-shift-to-a-risk-assessment-approach/