National Vulnerability Database delays cause major blind spot for software makers and users

Taylor Armerding
Published in Nerd For Tech
7 min read · Apr 15, 2024


Modern life is filled with warnings — tight curve ahead, this plastic packaging can cut you badly, your tire pressure is low, turn the breaker off before you start messing with a faulty outlet, your McDonald’s coffee is served very hot. Etc. That’s mostly a very good thing — maybe it ought to be obvious that the coffee is hot, but whatever. Warnings generally make our lives safer.

Indeed, if they went away, life would be more dangerous. Which is what is happening in the digital world with the National Institute of Standards and Technology’s (NIST) National Vulnerability Database (NVD).

Starting in mid-February, NIST’s detailed analyses of software vulnerabilities (hundreds of new ones are cataloged every day in the common vulnerabilities and exposures, or CVE, database maintained by the MITRE Corp.) have declined precipitously.

The CVE catalog includes a numerical ID and a severity rating for each vulnerability, but the NVD analysis provides added context — including the vulnerability type, which applications and operating systems are at risk, what damage an exploit could cause, and how easy the flaw is for attackers to exploit.
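
To see what that enrichment looks like in practice, here is a minimal sketch of pulling a single record from NVD’s public CVE API (version 2.0) and checking whether it has been analyzed. The endpoint, field names, and status strings follow NVD’s published API documentation, but treat the details as assumptions to verify against the current docs, not a definitive client.

```python
# Minimal sketch: check whether NVD has enriched a given CVE record.
# Assumes the NVD CVE API 2.0; exact keys and status strings should be
# verified against the current API documentation.
import json
import urllib.request

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def fetch_cve(cve_id: str) -> dict:
    """Fetch a single CVE record from the NVD CVE API 2.0."""
    with urllib.request.urlopen(f"{NVD_API}?cveId={cve_id}") as resp:
        return json.load(resp)["vulnerabilities"][0]["cve"]

def is_enriched(cve: dict) -> bool:
    """Heuristic: enriched records have left the analysis queue and
    carry CVSS metrics plus CPE applicability (configurations) data."""
    waiting = ("Received", "Awaiting Analysis", "Undergoing Analysis")
    return (
        cve.get("vulnStatus") not in waiting
        and bool(cve.get("metrics"))
        and bool(cve.get("configurations"))
    )

if __name__ == "__main__":
    record = fetch_cve("CVE-2021-44228")  # Log4Shell, a long-since-analyzed example
    print(record["vulnStatus"], "| enriched:", is_enriched(record))
```

A CVE caught in the current backlog would typically show a status like “Awaiting Analysis” and lack the metrics and configurations blocks entirely.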

Eric Chin, senior product manager at Anchore, wrote in a blog post on March 8 that “Starting February 12, thousands of CVE IDs have been published without any record of analysis by NVD. Since the start of 2024 there have been a total of 6,171 CVE IDs with only 3,625 being enriched by NVD. That leaves a gap of 2,546 (42%!) IDs.”

Or as a post in Dark Reading put it, “the critical government-sponsored database went from being an essential tool to a nearly dark destination.”

NIST, a federal agency within the U.S. Department of Commerce, had posted only a cryptic announcement in late February about “temporar[y] delays in analysis efforts.” It was a bit more forthcoming earlier this month, acknowledging in an April 2 update “a growing backlog of vulnerabilities submitted to the NVD and requiring analysis.”

And given that software pretty much runs the world, that is an ominous trend. If those who build and maintain software products don’t know that a significant vulnerability exists in a component they made, obviously they won’t know they need to fix it, and everybody who uses it won’t know it needs to be fixed, which puts them all at risk.

A problem, but how bad?

The software industry has relied on this model for 25 years, starting in 1999, when it was called the Internet Categorization of Attacks Toolkit, or ICAT. It was rebranded as the NVD in 2005.

How severe is the problem? So far it hasn’t generated anything close to the avalanche of headlines following mega-breaches like that of the credit reporting giant Equifax. Nothing on the national evening news. But those who have posted about it contend that it’s very serious. Indeed, at the time of Chin’s post, he noted there had been 6,171 new CVEs posted since the start of the year. According to the NVD Dashboard, there are now more than 10,500. If the same 42% of those vulnerabilities lack analysis from NIST, that would mean around 4,400 unenriched entries. That’s a lot of opportunity for hackers, and a lot of risk for users.

Dan Lorenc, CEO of Chainguard, wrote on LinkedIn that the problem goes well beyond NIST’s description of a temporary delay in analysis. “Scanners, analyzers, and most vulnerability tools rely on the NVD to set these fields so they can determine what software is affected by which vulnerabilities. This is a massive issue,” he wrote.
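
To illustrate Lorenc’s point, here is a deliberately simplified sketch, with hypothetical data, of the kind of matching those tools perform. The CPE applicability statements are exactly what NVD enrichment supplies; when enrichment stops, so does the match. Real scanners handle version ranges and CPE escaping that this naive version ignores.

```python
# Hypothetical illustration: matching a software inventory entry against
# the CPE criteria that NVD enrichment attaches to a CVE. The CPE strings
# and matching logic are naive stand-ins for what real tools do.

def cpe_fields(cpe: str) -> list[str]:
    """Split a CPE 2.3 string ('cpe:2.3:a:vendor:product:version:...') into fields."""
    return cpe.split(":")

def matches(component_cpe: str, criteria_cpe: str) -> bool:
    """Naive match: every field must be equal, or '*' (any) in the criteria."""
    comp, crit = cpe_fields(component_cpe), cpe_fields(criteria_cpe)
    return all(c == k or k == "*" for c, k in zip(comp, crit))

# An inventory entry from a hypothetical SBOM, and an affected-product
# criterion of the sort NVD's "configurations" data would provide.
component = "cpe:2.3:a:apache:log4j:2.14.1:*:*:*:*:*:*:*"
affected = "cpe:2.3:a:apache:log4j:*:*:*:*:*:*:*:*"

print(matches(component, affected))  # True -> the component is in scope
```

Without the enrichment step, the “affected” side of that comparison never gets populated, and the scanner has nothing to match against.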

It’s important to note that NVD production hasn’t gone completely dark. According to the April 2 update, NIST is “prioritizing analysis of the most significant vulnerabilities. In addition, we are working with our agency partners to bring on more support for analyzing vulnerabilities and have reassigned additional NIST staff to this task as well.” In other words, it’s trying to keep up with the worst of the worst.

And the Dark Reading post noted that the NVD isn’t the only game in town — some security vendors have created their own vulnerability databases, along with “several open source efforts that have been underway for years but have lately gotten more attention, thanks to the NVD freeze.”
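
One example of those open source efforts (not named in the article) is OSV.dev; the sketch below queries its public API for known vulnerabilities in a package. The endpoint and request shape follow OSV’s published API, though, as with any sketch here, the details should be checked against current documentation.

```python
# Minimal sketch: query the OSV.dev open source vulnerability database
# for a package, as an alternative data source to the NVD.
import json
import urllib.request

OSV_API = "https://api.osv.dev/v1/query"

def query_osv(name: str, ecosystem: str) -> list[dict]:
    """Return OSV vulnerability records for a package in a given ecosystem."""
    payload = json.dumps({"package": {"name": name, "ecosystem": ecosystem}}).encode()
    req = urllib.request.Request(
        OSV_API, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("vulns", [])

if __name__ == "__main__":
    for vuln in query_osv("jinja2", "PyPI"):
        print(vuln["id"], "-", vuln.get("summary", "(no summary)"))
```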

But William Cox, software engineering architect with the Synopsys Software Integrity Group, who also sits on the CVE Board run by MITRE, said most of the private databases are curated for their customers and market segment. “Where a consumer’s interests overlap with the direction of these sources, then you’re in luck,” he said. “But most consumers will not be fully satisfied with any singular source of data. Also, having private/ad hoc databases without a standard identification mechanism only creates confusion for the security community, which is precisely what the CVE program is meant to resolve.”

Cox added that “the broader CVE community is going to take a hit with this slowdown and the decreased awareness of recent CVEs. The effect is small right now, but each passing day compounds the problem, much like the frog in boiling water. Parts of the industry won’t realize there’s a problem until it’s gone on for a while, and by then the solution won’t be available quickly enough to alleviate the pressure.”

There have been complaints for years about the NVD lagging behind the pace of reported vulnerabilities. Cox said the NVD takes an average of 5 to 7 days to provide analysis of vulnerabilities after they have been posted to the CVE list — plenty of time for hackers to exploit them. But the past couple of months have taken it to a much more extreme level.

A matter of trust

Beyond that, NIST posted a legal disclaimer last Sept. 11 warning users that it will accept no liability for any defects in the information it provides, including its accuracy.

“The NVD is expressly provided ‘AS IS.’ NIST MAKES NO WARRANTY OF ANY KIND, EXPRESS, IMPLIED OR STATUTORY, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, NON-INFRINGEMENT, AND DATA ACCURACY. NIST does not warrant or make any representations regarding the use of the NVD, its contents, or the results obtained therefrom, including but not limited to the correctness, accuracy, reliability, or usefulness of the NVD.” The disclaimer added, “You are solely responsible for determining the appropriateness of your use of the NVD and its contents and you assume all risks associated with its use.”

What is causing both the lack of data and a warning not to trust the data? According to NIST, it’s money and money — it needs more from both government and the private sector. The announcement about the backlog said it was due to more software being created (therefore more vulnerabilities) “as well as a change in interagency support.”

Indeed, the number of CVEs is increasing steadily and significantly. In the five years between 2018 and 2023 the catalog went from 16,512 to 28,961, up more than 75%. And money, along with the chronic lag time, has also always been a problem — numerous experts say the NVD has suffered from a perennial lack of funding.

Cox, who attended the recent VulnCon 2024 in Raleigh, N.C., said that Tanya Brewer, NVD program director, pointed out in a separate meeting with the CVE Board that the NVD operates on an “underfunded mandate.” As in, legislation directs NIST’s work but doesn’t provide enough money to accomplish the mission.

Go public/private?

What to do? The NIST announcement includes this: “We are also looking into longer-term solutions to this challenge, including the establishment of a consortium of industry, government, and other stakeholder organizations that can collaborate on research to improve the NVD.”

There is considerable discussion about that among those who have posted about the backlog, some of it skeptical. “I’ve encountered many problems in my career, but I’ve never seen one where ‘a consortium’ actually helped address them,” Lorenc wrote in his LinkedIn post.

But Jason Soroko, senior vice president of product at Sectigo, told Dark Reading that “getting additional analysts working through the backlog is critical,” so the move to assemble a vetted consortium “is a good one.”

Cox is dubious about it, citing potential bias as well as legal issues. Talk of privatization isn’t new, he said, but “among the reasons CVE is operated in the public sector is to attempt to remove commercial bias in reporting. It’s well-known that some commercial entities will under-report and/or under-score vulnerabilities reported on their own products.”

He cited a comment from an attendee at VulnCon 2024 who paraphrased the legendary British Prime Minister Winston Churchill: “CVE/NVD is the worst vulnerability management system, except for all the others.”

Finally, any vendors that want to sell software products to the government are required to use the CVE. “Government mandates its own reliance on CVE; it’s literally written into the law,” Cox said. “Any entity wishing to do business with the government must also work with CVE, e.g., FedRAMP [the Federal Risk and Authorization Management Program].”

Lorenc noted the bureaucratic inconsistency of that. “It feels like NIST is somehow trying to wind this program down or hand it off, while other areas of the government are forcing its adoption,” he wrote.

For the moment, the consortium is just in the works, meaning it is not a short-term solution. “It’s going to take months to stand it up and deliver any benefits,” Cox said. “One can only hope that NVD isn’t so distracted by it that their primary mission suffers systemically.”

Meanwhile, that leaves users of the NVD with a major blind spot. Which should be a reminder to a government that has laudably focused, at least rhetorically, on better cybersecurity: if it intends for its mandates to be fulfilled, it has to fully fund them.


Taylor Armerding

I’m a security advocate at the Synopsys Software Integrity Group. I write mainly about software security, data security and privacy.