
Making Attestation Work for Software Security

Jim Dempsey, Steven B. Lipner, James Andrew Lewis
Thursday, July 18, 2024, 12:00 PM
Attestation will be part of the federal government’s software procurement process for the foreseeable future. Let’s make it work.

Published by The Lawfare Institute
in Cooperation With
Brookings

On June 11, a process went into effect requiring companies supplying “critical software” to the federal government to attest that they developed their products in accordance with certain software security practices, such as controlling access to their development environments and using tools that analyze source code for vulnerabilities. For other software, the requirement kicks in on Sept. 11. Failure to do what they attested to will expose vendors to a variety of sanctions, up to and including liability under the False Claims Act.

Requiring that software developers adhere to secure software development practices is one component of the Biden administration’s strategy for improving the nation’s cybersecurity. At least one of us believes that a focus on the process of producing software will not be sufficient by itself and that other forms of legal liability will be needed. But all three of us agree that attestation to some set of secure development practices has a useful role to play. And for the foreseeable future, it will be part of the software procurement process of a major customer of software companies: the federal government. Let’s make it work.

The federal government, in all of its functions, is deeply dependent on software from commercial developers. Improving the security of that software is foundational to cybersecurity and the delivery of critical government services. Much of that software is general purpose and equally critical to private-sector operations, so using the government’s purchasing power to improve its security should benefit society as a whole.

Requiring Adherence to Secure Software Development Practices

The current attestation requirement comes from Executive Order 14028, issued by President Biden in 2021. That order directed the National Institute of Standards and Technology (NIST) to issue guidelines identifying practices that enhance the security of software development. This was based on the premise that certain design, development, and coding practices reduce the prevalence of software vulnerabilities—practices such as employing automated tools that check for known and potential vulnerabilities. If the government uses its market power to insist that its software suppliers follow these practices, that will reduce the number of flaws in the products it uses. And since companies prefer to make one product for all customers rather than one specifically for government use, the government can be a market driver, which in turn could have broader benefits.

NIST was already on the case by the time the Biden administration issued its order. In February 2022, it issued an updated version of its Secure Software Development Framework (SSDF). The Office of Management and Budget (OMB) then issued and later updated a memorandum requiring executive branch agencies, with some exceptions, to use software only if the producer of that software has first attested to compliance with practices drawn from the NIST framework. The baton then passed to the Cybersecurity and Infrastructure Security Agency (CISA), within the Department of Homeland Security. Earlier this year, CISA and OMB issued a software development attestation form, listing the secure software development practices that vendors to the government must attest to.

Curiously, the CISA form calls out only some elements of the NIST SSDF, such as regularly logging, monitoring, and auditing trust relationships for authorization and access to software as it is being developed, enforcing multi-factor authentication for developers, and using a software bill of materials (SBOM) to track provenance of third-party components incorporated into their products. Other practices, which seem vitally important, such as using threat models to analyze the security of software designs and applying root cause analysis of discovered vulnerabilities to drive continuous improvement of the developer’s process, were left off the list. The attestation form is also riddled with caveats not found in the SSDF. For example: credentials need be encrypted only “to the extent feasible”; developers need only make a “good-faith effort” to maintain trusted source code supply chains; and SBOMs must be maintained “to the greatest extent feasible.”

Even with this flexibility, one recent survey of security professionals found that only 20 percent of companies were prepared to meet the deadline for federal cybersecurity attestation. Only 16 percent of respondents said their companies were using SBOMs.

Moreover, OMB has allowed agencies to continue using software from vendors who cannot attest to the minimum practices so long as they submit a Plan of Action & Milestones (POA&M). The word “Milestones” suggests that a developer’s plan must specify a deadline for coming into compliance, but the OMB memos don’t require one. Instead, the OMB memos require developers to identify the practices to which they cannot attest and to document practices they have in place to mitigate associated risks; they do not actually require that developers ever come up to speed. This is a significant omission.

Making Attestation Work

The attestation forms now rolling in should be viewed as just the first iteration of a permanent feature of government procurement. Undoubtedly, lessons will be learned from the first round and adjustments will be made. Even at this early stage, some features of the program jump out as requiring improvement. If OMB, CISA, and NIST diligently assess compliance with the program, additional enhancements will likely emerge. In the meantime, here are some initial suggestions.

Extend the Attestation Program to the Entire SSDF, Without Added Caveats

The Biden order that got the ball rolling, Executive Order 14028, listed specific elements that the framework “shall include,” but the list was clearly not intended to be exclusive. How could it be, after all, when the president called on NIST’s expertise to “identify[] practices that enhance the security of the software supply chain”? And OMB, in ordering agencies to obtain compliance with the SSDF, did not suggest that developers should comply with only select elements of the framework. (OMB’s 2022 memo, after noting that the executive order directed NIST to issue guidance “identifying practices that enhance the security of the software supply chain,” says that the order “further directs the Office of Management and Budget (OMB) to require agencies to comply with such guidelines.”) But when it came time for CISA to produce the actual attestation form, the agency thought it necessary to rely on the original executive order and require only those practices that the president had specifically named in 2021.

A revised attestation form should require a blanket attestation to all applicable practices identified in the NIST framework, without any added caveats. (A “to the extent applicable” qualifier is still necessary because, for example, static analysis for memory-safety vulnerabilities is pointless for code written in a memory-safe language.) If NIST thought the practices were important enough to include in a comprehensive development life cycle plan, then they should be required for the software procured by the government.

Provide Transparency to Enable the Market to Assess Vendors’ Commitment to Security

Transparency creates an important degree of accountability. The attestations of software developers should be publicly available in a searchable database. There is nothing in the executive order or the OMB memos that prohibits publication of the attestations, though nothing requires it either. (On the other hand, the OMB memos make it expressly clear that any third-party assessments backing up an attestation shall not be made public. Maybe that’s justified.) Researchers and others should be able to see which companies are attesting to the security of their development practices. This could aid the effort to develop cybersecurity metrics that link specific development practices to the presence or absence of frequently exploited weaknesses. Transparency will aid the indirect goal of the executive order, which is to improve all software, by allowing private-sector users to better understand whether the software they depend on was developed in conformity with the NIST SSDF. Publicly available attestations create an incentive for producers to meet secure development standards.

OMB should also require agencies and developers to publish their POA&Ms—the documents in which they admit they fall short of secure standards and promise to get compliant in the future. Under the current OMB memos, the POA&Ms are to be kept confidential. On balance, we believe, it would be better to publish these assurances and timelines. They won’t identify specific vulnerabilities, but the added transparency of their publication will aid oversight and enforcement, as well as provide critical information to private-sector customers. 

Require Developers to “Show Their Work”

Transparency should also extend to developers’ specific practices. The “F” in SSDF stands for framework, meaning the NIST document is not actually an implementable set of practices for software coding. Instead, it is a framework of the types of practices that should be implemented. NIST expressly states that each developer must customize the framework for their own products before putting it into action. Posting developers’ actual practices would bring a measure of accountability to software development.

For example, the attestation form specifies that developers should use automated tools or comparable processes to check for vulnerabilities. Vendors should be required to specify what tools they use and how they use them (which tool error reports are treated as “must fix,” for example). Among other benefits, this could potentially generate competition among the developers of those tools. The attestation form requires developers to maintain SBOMs. SBOMs for thousands of products are already publicly available, and SBOM public availability should be the norm for products provided to the government. National security systems may require a separate approach, although much software is used in both national security and non-national security systems.
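To make the SBOM discussion concrete, here is a minimal sketch of what a published SBOM entry can look like, using the CycloneDX JSON format (one of the formats in common use; SPDX is another). The component shown is hypothetical; field names follow the CycloneDX specification.

```python
import json

# A minimal, illustrative SBOM in the CycloneDX JSON format.
# The component listed is a hypothetical third-party dependency;
# "purl" is a package URL identifying its exact provenance.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "example-logging-lib",  # hypothetical dependency name
            "version": "2.17.1",
            "purl": "pkg:maven/com.example/example-logging-lib@2.17.1",
        }
    ],
}

# Serialize for publication alongside the product.
print(json.dumps(sbom, indent=2))
```

Even a fragment this small shows why public SBOMs matter: a customer or researcher can check each listed component and version against known-vulnerability databases without the vendor’s cooperation.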

This type of transparency could have huge value in the commercial market. For the first time ever, customers would be able to compare the practices of different vendors. Researchers and others could prepare comparison charts, showing which companies actually follow accepted practices and which don’t. 

Someone Needs to Enforce This System

Even if the system of attestations and process descriptions is fully transparent, what will be the enforcement mechanisms against companies whose processes do not actually adhere to the NIST SSDF or that do not follow the practices they attest to? This entire process should not just create another statement that can be used to assess liability under the False Claims Act or other law after an incident occurs. The trade press, companies like Gartner, nonprofit watchdogs, academics and their graduate students, and competitors play powerful roles in the vulnerability research ecosystem. Private-sector scrutiny and academic analyses could drive compliance more effectively than sporadic enforcement actions by overwhelmed government agencies.

So far, the government has shown little capacity to enforce the cybersecurity assurances that contractors provide. The Department of Defense provides perhaps the most well-documented failure, if only because its inspector general has pursued the issue. Since 2016, every Pentagon contractor (except suppliers of commercial off-the-shelf items) has attested to implementing the 110 NIST-specified controls on the protection of unclassified information. Yet every inspector general inquiry has found contractors making the attestation without actually complying with the controls. Other contractors admit they do not implement the controls and submit POA&Ms, but it is widely recognized that the contracting components do not assiduously follow up on whether those contractors do what they promised. Some of the most important questions under the Executive Order 14028 attestation program will be how many contractors submit POA&Ms, covering what practices, and what agencies do to follow up and hold the contractors and their subcontractors to their milestones. Agencies should be encouraged to question vendors’ practices or take procurement action where POA&Ms are submitted and never implemented, or where patterns of security vulnerabilities (publicly or privately discovered) suggest that the vendor is doing an ineffective job of following the SSDF.

Perhaps there is a role for an expanded version of the Cyber Safety Review Board, staffed by detailees from the National Security Agency, CISA, NIST, and other agencies, which could undertake a deep dive into the development practices of vendors after a major incident or in response to public reports of significant deviations from the practices required by the SSDF and attested by vendors.

Ultimately, attestation must be linked to some form of liability and penalty, such as suspension and debarment from further sales to federal customers for some period. The General Services Administration’s acquisitions regulations already allow for this for a number of different reasons. This, along with measures to increase transparency, will create incentives for compliance and for improved software production processes.

***

The government’s software attestation program is an experiment. Perhaps having software developers attest that they follow NIST’s framework of secure software development practices will not reduce the prevalence of software vulnerabilities. As Thomas Edison said, an experiment is successful even if it only proves what doesn’t work. But we want this experiment to be successful in a different way: to actually improve the security of software acquired by the government and thus also the security of software relied on by the private sector. Some enhancement to what has just begun could go a long way in that direction.


Jim Dempsey is a lecturer at the UC Berkeley Law School and a senior policy advisor at the Stanford Program on Geopolitics, Technology and Governance. From 2012-2017, he served as a member of the Privacy and Civil Liberties Oversight Board. He is the co-author of Cybersecurity Law Fundamentals (IAPP, 2024).
Steve Lipner is the executive director of SAFECode, an industry nonprofit focused on software security assurance. He was previously partner director of software security at Microsoft, where he was the creator and long-time leader of the Security Development Lifecycle (SDL). He serves as chair of the U.S. government’s Information Security and Privacy Advisory Board and has more than a half century of experience in cybersecurity as researcher, engineer, and development manager. He is a member of the National Academy of Engineering.
James Andrew Lewis is a senior vice president and program director at CSIS, where he writes on technology, security, and innovation. Before joining CSIS, he worked at the Departments of State and Commerce as a Foreign Service officer and as a member of the Senior Executive Service. His government experience includes work on a range of politico-military and Asian security issues, as a negotiator on conventional arms transfers and advanced military technology, and in developing policies for satellites, encryption, and the Internet. Lewis led the U.S. delegation to the Wassenaar Arrangement Experts Group on advanced civil and military technologies and was the rapporteur for the 2010, 2013, and 2015 UN Group of Government Experts on Information Security. He was also assigned to U.S. Southern Command for Operation Just Cause and to U.S. Central Command for Operation Desert Shield. He received his Ph.D. from the University of Chicago.
