In June 2015, the U.S. National Institute of Standards and Technology (NIST) released its latest set of guidelines for the handling of Controlled Unclassified Information (CUI): data such as personally identifiable information (PII), banking records, health information, and other sensitive material that one would not want falling into the wrong hands.
These new standards are now in effect for private-sector organizations, having passed the Dec. 31, 2017, deadline to prepare for compliance.
The new recommendations, referred to as NIST 800-171, were laid out in a paper titled “Protecting Controlled Unclassified Information in Nonfederal Information Systems and Organizations,” which outlines NIST’s expectations for contractors that work with the federal government and handle this kind of data.
Derived from Executive Order 13556 signed by former president Barack Obama in 2010, NIST’s stated goal in this effort is to bring the “non-federal entities”—or contractors, in human speak—that work with the government into line with some of the same standards that the feds have been working toward in recent years.
Having been hit with hacks such as the massive breach of the Office of Personnel Management in 2015, when the records of 21.5 million people—including those in the process of being reviewed for security clearances—were stolen, the government has been taking steps to better protect the data held in its systems.
A reading of the regulations in the document makes clear that there is real concern over contractors’ ability to keep the information confidential while it is in their custody as well. This covers securing the data in transit as well as at rest in their databases.
In their description of the new regime of rules that they have termed the CUI Registry, the NIST authors explain that it “provides general descriptions for each [category and subcategory], identifies the basis for controls, and sets out procedures for the use of CUI, including but not limited to marking, safeguarding, transporting, disseminating, reusing, and disposing of the information.”
WhiteSource spoke with Ron Ross, a NIST Fellow and one of the authors of the guidelines, who explained their approach to strengthening protections for CUI without placing an undue burden on contractors.
He said that, in shaping the guidelines, they removed requirements and controls that concern availability and integrity, that apply only to federal systems, or that they assumed companies were already implementing. For their purposes, the focus was on confidentiality.
Ross says that his team wanted to make the requirements substantial but not overwhelming, noting, “We didn’t want to push everything to the private sector that applies to the federal side.”
Keep Data Safe Through Better AppSec
While many of the requirements across the 14 areas of security relate to measures such as ensuring proper access control, authentication methods, and IT network integrity, one section stood out as critically important to those of us who eat, breathe and sleep application security.
Section 3.11.2, which covers risk assessment, requires that contractors “scan for vulnerabilities in the information system and applications periodically and when new vulnerabilities affecting the system are identified.”
Not only do contractors have to perform scans; they are also expected to remediate the vulnerabilities they find. This effectively shifts responsibility for security onto the contractors, signaling that the government wants the companies supplying its services to take ownership of the security of their products.
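As a rough illustration of the kind of periodic scan that Section 3.11.2 calls for, the sketch below compares an application's dependency inventory against a list of known-vulnerable versions. The inventory and the advisory mapping here are illustrative stand-ins; a real scan would pull advisory data from a maintained feed such as the NVD rather than a hard-coded dictionary.

```python
# Hypothetical advisory index: (package, version) -> advisory ID.
# The entries are illustrative; a real tool would query a curated feed.
KNOWN_VULNERABLE = {
    ("struts2-core", "2.3.31"): "CVE-2017-5638",
    ("commons-collections", "3.2.1"): "CVE-2015-7501",
}

def scan(inventory):
    """Return advisories that apply to the given (package, version) pairs."""
    findings = []
    for package, version in inventory:
        advisory = KNOWN_VULNERABLE.get((package, version))
        if advisory:
            findings.append((package, version, advisory))
    return findings

# A toy application inventory; only the Struts entry matches an advisory.
app_inventory = [
    ("struts2-core", "2.3.31"),
    ("spring-core", "5.0.2"),
]

for package, version, advisory in scan(app_inventory):
    print(f"{package} {version}: flagged by {advisory} -- remediate")
```

Running a check like this on a schedule, and again whenever new advisories are published, is the essence of the “periodically and when new vulnerabilities ... are identified” language in the requirement.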
The writers specifically point out that applications are part of the information system's components, removing any doubt, even for those who focus solely on endpoint and network issues, that application security is to be taken seriously.
Applications are the front doors of your business and service. Simply put, they are how we interact with data held in a system and perform functions. As companies are holding more and more data—and in the case of government work this can include plenty of personal information such as Social Security numbers and other items that could be used for identity fraud or worse—they become more attractive targets for hackers.
Already, back in 2015, it was reported that 84 percent of cyberattacks were targeting the application layer as hackers sought out the soft underbelly from which to infiltrate information systems. This was partially because network security was seeing significant improvements, while protections for applications were lagging behind.
Protecting the open source components in your applications is an essential part of perimeter security, as a breach can expose your data to theft or tampering. This concept may fly in the face of industry perceptions where the proprietary code is held up as the crown jewels, receiving the most attention from security teams.
Let me explain.
Why You Should Care About Open Source Components
According to estimates by Gartner, open source comprises 60 percent to 80 percent of the code in most products. Contrast this with the 10 percent to 20 percent of proprietary code written in-house and you start to understand why the pyramid of security priorities is inverted.
In the case of Equifax, the company was likely unaware that its application contained components with known vulnerabilities, and therefore had no way of knowing it needed to patch. Even more galling, the Apache Struts 2 vulnerability was disclosed in March 2017, the breach occurred sometime between May and July, and it was not discovered until mid-July. The thieves had literally months to carry out their attack while Equifax was none the wiser.
One of the likely culprits behind the failure was that the company thought its static and dynamic testing tools (SAST and DAST), which are designed for checking proprietary code, would cover its open source components as well. But, as we now know, they were clearly insufficient for getting the job done.
Developers depend on open source projects such as Apache Tomcat, MySQL and the Spring Framework for the basic functions of their apps, allowing them to work faster and focus their talents on the proprietary code that makes their products unique.
Like all good things, open source comes with a certain set of risks. When a vulnerability is discovered by someone in the “thousand eyes” of the open source community members—or a hacker—then all the organizations that are using the vulnerable versions of an open source project are at risk of attack, as a malicious actor can simply ping hundreds of companies to see if they are susceptible to an exploit.
Incorporating SCA Into Your Security Posture
Even as plenty of companies that are part of the supply chain for government services are likely scrambling to get a handle on their policies and protections, there is some good news to be had when it comes to security solutions for open source.
Whereas SAST and DAST are not up to the task of protecting your applications from risky open source components, software composition analysis (SCA) was built for it. Top-tier SCA solutions are capable of identifying the unique digital signatures of open source libraries so that you know exactly what you have in your inventory at all times, and will send you alerts when new vulnerabilities are discovered.
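The fingerprinting idea described above can be sketched in a few lines: hash a library artifact and look the digest up in an advisory index. The file and the advisory database below are hypothetical stand-ins; commercial SCA tools match against large curated feeds and also parse package metadata, not just file hashes.

```python
import hashlib
import tempfile
from pathlib import Path

def fingerprint(path):
    """SHA-256 digest of a library file -- its 'digital signature' for lookup."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

# Simulate a vendored library artifact in the build output.
lib = Path(tempfile.mkdtemp()) / "example-lib-1.0.jar"
lib.write_bytes(b"pretend jar contents")

# Hypothetical advisory index keyed by artifact digest.
vulnerability_db = {fingerprint(lib): "EXAMPLE-ADVISORY-0001"}

digest = fingerprint(lib)
advisory = vulnerability_db.get(digest)
print(f"{lib.name}: {advisory or 'no known vulnerabilities'}")
```

Because the lookup is keyed by content rather than by declared version, this approach can catch components that were copied into a project without ever appearing in a manifest, which is one reason inventory-by-fingerprint is central to SCA.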
Creating an effective security strategy means staying ahead of the hackers who are dedicated to compromising your infrastructure. As network security has hardened in recent years, applications—and the open source components that make up the vast majority of the code that they are built on—present a soft underbelly for attackers to infiltrate your information systems.
Implementing solid application security systems that take care of your open source needs can take some of this pressure off your checklist. Dec. 31 may have come and gone, but it’s not too late to take control of your application’s security.
3 Tips for Approaching NIST 800-171 Compliance
If getting up to speed with the new standards is making you tear your hair out, here are a few tips to help you get started and maintain your sanity.
- Figure out first whether you actually hold any CUI on your servers. Will you be receiving it as part of a contract with the federal government or the Department of Defense? Find out whether you meet the criteria to be affected by 800-171 by consulting the National Archives’ CUI Registry.
- If you have CUI, where does it actually reside within your network? CUI can be spread across dozens of servers or workstations, so try to consolidate it into the smallest area possible. It’s easier to protect data in one spot than across the entire enterprise.
- Find a cloud service provider that is FedRAMP approved. FedRAMP authorization means the provider has already demonstrated that its products and services meet federal security requirements. Every agency can consult FedRAMP to see which providers are on the approved list. Relying on that evaluation will save you a lot of time and money, since those providers have already been vetted.