
NVD delays highlight vulnerability management woes: Put malware first

A decision by the National Institute of Standards and Technology (NIST) to change how it maintains the widely used National Vulnerability Database (NVD) has focused attention on the brittle nature of current enterprise vulnerability management processes.

On February 13, NIST announced, seemingly out of the blue, what amounted to a scaling back of its vulnerability analysis contributions to the database — and its related plan to hand off that task to a consortium of other organizations. At the time, NIST offered no explanation for its decision. NIST warned those using the NVD to expect “delays in analysis efforts” during the transition and apologized for any resulting inconvenience.

The announcement jolted industry stakeholders. Since 2005, NIST has played a key role in providing analysis of and context for new vulnerabilities in the form of Common Weakness Enumeration (CWE) tags, Common Platform Enumeration (CPE) identifiers, and Common Vulnerability Scoring System (CVSS) scores. CWE tags associate a vulnerability with the underlying software- or architecture-level weakness that caused it, providing a standard taxonomy for classifying software weaknesses. CPE identifiers pinpoint the specific software, systems, and versions a vulnerability might affect, so users know what they need to address and remediate. CVSS scores measure a vulnerability's impact on system confidentiality, integrity, and availability and help security teams prioritize remediation efforts.
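To make that metadata concrete, here is a minimal, hypothetical sketch (in Python) of the enrichment fields NIST adds to a bare CVE record; the CVE identifier, vendor, and product names are invented for illustration:

# Hypothetical example of the enrichment NIST layers onto a bare CVE record.
enriched_cve = {
    "id": "CVE-2024-0001",  # placeholder identifier
    # CWE tag: classifies the underlying weakness (here, cross-site scripting)
    "cwe": "CWE-79",
    # CPE 2.3 match string: part, vendor, product, version, and further fields
    "cpe_matches": [
        "cpe:2.3:a:examplevendor:exampleapp:1.2.3:*:*:*:*:*:*:*",
    ],
    # CVSS v3.1 vector and base score used to prioritize remediation
    "cvss_v3": {
        "vector": "CVSS:3.1/AV:N/AC:L/PR:N/UI:R/S:C/C:L/I:L/A:N",
        "baseScore": 6.1,
    },
}

Without this enrichment, a CVE entry carries little more than an identifier and a description, which is what automated scanners are now receiving for most new entries.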

NIST’s abrupt slowdown resulted in a fast-growing list of vulnerabilities (CVEs) in the NVD without the crucial CWE, CPE, and other metadata that security teams and tools have long relied on. The NVD dashboard shows that, as of April 18, NIST had analyzed just 99 of the 2,098 new vulnerabilities added to the NVD since the beginning of April. In March, NIST provided CWE, CPE, and associated data for only 199 of the 3,370 CVEs the NVD received, less than 6% of the total.

This lack of analysis has caused considerable concern about disruptions to long-established vulnerability management workflows at enterprise organizations. There’s worry about whether security teams can properly determine vulnerability risks and prioritize remediation efforts. Some are even calling it a death knell for the NVD.

Here’s what the NVD slowdown means in practical terms for vulnerability management — and why you should change your approach and go beyond vulnerabilities to focus on malware and tampering. 

A significant challenge for enterprise security

Callie Guenther, senior manager of cyberthreat research at Critical Start, said the recent cessation of analytical data additions to the NVD poses significant challenges for enterprise security teams and vendors.

“Enterprises might face operational challenges, as automated tools relying on NVD data become less effective, necessitating alternative data sources or manual processes.”
Callie Guenther

Guenther noted that compliance with regulatory standards that mandate comprehensive NVD vulnerability data usage may also become more challenging.

Why did NIST slow down the NVD?

Jason Soroko, senior vice president of product at Sectigo, said NIST’s decision likely had to do with its analysts being overwhelmed.

“The problem is scale. NIST is going to open up the program to a consortium of vetted organizations from the industry in order to deal with the backlog of vulnerabilities that need to be analyzed and understood before being put into the NVD database.”
Jason Soroko

NIST itself updated its original cryptic notice to say that it was working on assembling a consortium because of a “growing backlog of vulnerabilities in NVD.”

“This is based on a variety of factors, including an increase in software and, therefore, vulnerabilities, as well as a change in interagency support.”
NIST

The standards body said that it is currently focusing on providing analysis around the most significant vulnerabilities and on finding more analysts to contribute to the effort. “We are also looking into longer-term solutions to this challenge, including the establishment of a consortium of industry, government, and other stakeholder organizations that can collaborate on research to improve the NVD,” NIST said.

Go beyond vulnerability management and remediation

Josh Bressers, vice president of security at Anchore, said the disruption the NVD slowdown has caused is instructive.

“The biggest takeaway is how fragile the vulnerability ecosystems are.”
Josh Bressers

The number of vulnerabilities assigned in the NVD has grown by more than 20% each year, and vulnerability teams are unable to keep pace, Bressers noted. Between 2022 and 2023 alone, the number of CVEs in the NVD shot up from 25,081 to 28,831, Intel 471 has tracked.

“Vulnerability teams are struggling to keep up with this load. I’m hopeful the NVD activity will be used as an opportunity to rethink how we deal with this data and how we can improve our processes.”
Josh Bressers

Bressers said organizations that are required to comply with the federal government’s FedRAMP program will likely feel the biggest short-term impact from the NVD slowdown. That’s because FedRAMP requires such organizations to use only NVD data when remediating vulnerabilities. “By definition, only NVD can supply an NVD severity rating,” he noted.

But the challenges don’t end there, Bressers said. Many organizations and vendors have relied on NVD data because of how comprehensive it has been. With NIST pulling back from the effort, a gap has opened that cannot be easily filled, he said.

Can an open-source model for the NVD help?

Anchore launched an open-source repository, called NVD Data Overrides, that it hopes will fill that gap. “This repo is meant to provide additional data that is currently missing from NVD,” the project’s page on GitHub noted.

The project is open to all on GitHub, said Bressers, who added that the effort is still in its very early stages. “Anyone is free to use the data we are generating with no restrictions. Long term, we hope to see more open-source methods applied to vulnerability data.”
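How a consumer might use such overrides is straightforward in principle: merge the community-supplied enrichment over the bare NVD record. The sketch below assumes per-CVE JSON override files; the directory layout and field names are illustrative, not the project’s documented schema:

import json
from pathlib import Path

def apply_overrides(nvd_record: dict, overrides_dir: Path) -> dict:
    """Merge community enrichment over a bare NVD record (illustrative)."""
    override_path = overrides_dir / f"{nvd_record['id']}.json"  # assumed layout
    if not override_path.exists():
        return nvd_record
    override = json.loads(override_path.read_text())
    # Shallow merge: community-supplied fields fill in what NIST has not analyzed.
    return {**nvd_record, **override}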

Critical Start’s Guenther said organizations may need to consider alternative, third-party databases, collaborate on open-source projects such as Anchore’s, or enhance their internal risk assessments to get the context they need on CVEs and how to remediate them.

Is the NVD an outdated approach?

Organizations may also want to look beyond purely reactive measures such as vulnerability mitigation and consider more proactive approaches such as threat hunting and penetration testing, Guenther said.

“This situation underscores the importance of adaptability and collaboration in the cybersecurity community, highlighting the need for potential long-term shifts in how vulnerability information is managed and distributed.”
Callie Guenther

Regardless of the NVD slowdown, teams are faced with more alerts than they can handle. SecurityScorecard and the Cyentia Institute estimate that organizations fix only 10% of the vulnerabilities in their software each month. That’s not a good outcome for software security — nor for overworked application security and security operations teams.

The most widely used tool for assessing software risk is the CVSS, which provides a score for the risk that a specific vulnerability poses to an organization. However, a number of criticisms have been leveled at the system, including that it is complex and difficult to understand, inaccurate, and widely misunderstood.

The new Exploit Prediction Scoring System (EPSS) aims to add more value to software risk scoring, combining descriptive information about vulnerabilities with evidence of actual exploitation in the wild.
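As a sketch of how the two signals can be combined, the snippet below pulls exploit probabilities from FIRST’s public EPSS API and ranks a placeholder inventory of CVEs by EPSS probability first and CVSS base score second; the inventory and its CVSS values stand in for scanner output:

import requests

def epss_scores(cve_ids):
    """Fetch exploit-probability scores from FIRST's public EPSS API."""
    resp = requests.get(
        "https://api.first.org/data/v1/epss",
        params={"cve": ",".join(cve_ids)},
        timeout=30,
    )
    resp.raise_for_status()
    return {row["cve"]: float(row["epss"]) for row in resp.json()["data"]}

# Placeholder inventory mapping CVE IDs to CVSS base scores from a scanner.
inventory = {"CVE-2021-44228": 10.0, "CVE-2019-0708": 9.8}
epss = epss_scores(list(inventory))
# Remediate what is both likely to be exploited and severe, in that order.
for cve in sorted(inventory, key=lambda c: (epss.get(c, 0.0), inventory[c]), reverse=True):
    print(cve, f"EPSS={epss.get(cve, 0.0):.3f}", f"CVSS={inventory[cve]}")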

Jeremy Long, a principal engineer at ServiceNow and founder and project lead of the OWASP Dependency-Check project, said at Black Hat last year that if organizations want to properly defend against today’s software supply chain attacks, they will have to adopt tooling and measures that detect and mitigate malicious threats.

He recommended modern tooling that uses binary validation, which can detect threats such as malicious build-time dependencies. This type of analysis compares build versions, surfacing anomalies that traditional testing misses and that further analysis may deem malicious, Long said.
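A rough illustration of the build-comparison idea, with placeholder directory names: hash every file in two build outputs and flag anything that appeared, disappeared, or changed. Real binary analysis goes much deeper than hashes, but unexpected diffs are the starting point for further investigation:

import hashlib
from pathlib import Path

def digest_tree(root: Path) -> dict:
    """Map each file's relative path to a SHA-256 digest of its contents."""
    return {
        p.relative_to(root).as_posix(): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

# Placeholder paths to two successive build outputs.
old, new = digest_tree(Path("build-1.4.1")), digest_tree(Path("build-1.4.2"))
for path in sorted(old.keys() | new.keys()):
    if old.get(path) != new.get(path):
        status = "added" if path not in old else "removed" if path not in new else "changed"
        print(status, path)  # unexpected entries warrant deeper analysis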

Modern threats require modern tools

While the NVD is still useful, it is not equal to the challenge of managing the risk from the rise of software supply chain attacks. Rather than focusing on remediation of vulnerabilities, teams managing modern risk need to shift their focus to active malware and to modern attack techniques such as software tampering.

A recent ReversingLabs report, “Flying Blind: Firms Struggle to Detect Software Supply Chain Attacks,” found that software organizations need to be able to detect tampering at any and all stages of development, including post-build and post-deployment. The report, based on a survey by Dimensional Research, noted that while scans for tampering were fairly common during the build process (53%) and after build but prior to deployment (43%), much lower percentages of survey respondents said they scanned code post-deployment (34%) or that they scanned individual components prior to build (33%). 

Complex binary analysis provides the much-needed final exam for complete software packages. Charlie Jones, director of product management at ReversingLabs, wrote recently in an OpenSSF blog post:

“Incidents like the 3CX hack are proof that ‘business as usual’ in the application security testing space is not sufficient. If organizations want to keep pace with an evolving threat landscape, a new approach to risk management is needed that supports modern software supply chain needs. Binary analysis helps fill this gap by providing organizations with a method for analyzing different software types in a consistent and repeatable manner.”
Charlie Jones

*** This is a Security Bloggers Network syndicated blog from ReversingLabs Blog authored by Jai Vijayan. Read the original post at: https://www.reversinglabs.com/blog/nvd-woes-put-malware-first

Jai Vijayan

Vijayan is an independent journalist and tech content creation specialist who has been covering the technology industry for more than 20 years. He writes for several publications mainly on data security and privacy. He was most recently a senior editor at Computerworld.
