The Biden Administration’s Impending Executive Order on Software Security
Executive branch action on software security is sorely needed, but should focus primarily on risk management rather than merely new reporting requirements.
Last year’s revelation that federal agency digital supply chains had been infiltrated—via the information technology (IT) contractor SolarWinds—exposed gaping holes in America’s cyber defenses. The White House recently attributed this intrusion to the Russian foreign intelligence service, further highlighting the sophisticated nature of malicious cyber actors targeting the United States. Following closely on this news was Microsoft’s announcement that probable Chinese government hackers had exploited previously unknown attack vectors in one of its products. The Biden administration has begun responding to these and other high-profile exploitations of vulnerabilities in commercially available software—including some used by the United States Government—through a variety of means.
Although any retaliatory actions that the United States takes against the perpetrators of these digital espionage campaigns are worthy of their own analysis, preventing such infiltrations in the first place is of vital concern. Toward this end, the White House has signaled its intent to release an executive order on software security. While the exact text of the order is not yet public, both media reporting and public statements by administration officials have highlighted what will likely be its key components.
In this post I describe what the order might look like—based on information that is currently publicly available—and comment on the merits of its various aspects. My analysis suggests the order will likely drive action in three domains: improvements to internal federal department and agency operations, mandatory secure development standards for contractors selling software to the government, and requirements for these organizations to report data breaches proactively and cooperate with investigations into them.
The first category is where the executive order can have the fastest impact, because establishing and refining internal government procedures is likely the simplest place to start, bureaucratic inertia and resistance notwithstanding. Generally, the steps in this area being publicly mulled are welcome. But regarding the latter two categories—which relate to the government’s interaction with software vendors—I have some concerns. While laying out these reservations, I will suggest ways in which the executive order can help improve the security of federal networks without triggering unintended consequences. In general, the order should direct the federal government to focus on managing all relevant risks while avoiding a “box-checking” focus on compliance.
With respect to federal department and agency IT operations, a Reuters article reporting on an early draft suggests that the impending order will mandate more extensive use of encryption and multifactor authentication. Although additional detailed guidance will be vital for implementing these requirements, on their face they appear appropriate. Use of the former technology still appears to be uneven throughout the government’s computers, and the latter can help stop even nation-state hackers from reusing stolen credentials to move laterally through systems.
In addition to requiring the use of technical controls, however, organizational and policy changes are also needed, and the new order is an excellent vehicle to implement them. Although nothing in the public record suggests this is imminent, modifying the Obama administration-era Presidential Policy Directive (PPD) 41 would be a critical step toward improving the government’s ability to respond to malicious activity in the digital domain.
Specifically, the new order should clarify the definition of a “cyber incident,” as PPD-41 currently conflates vulnerabilities—potential infiltration vectors—with imminent or actual exploitations of them. Federal IT teams likely detect potential vulnerabilities of varying severity in their systems every day. The vast majority are extremely difficult to use maliciously or are exploitable only in a limited set of situations. In my assessment, truly serious vulnerabilities might warrant rapid and broad notification, but the mere discovery of one should not necessarily trigger action at the level of the National Security Council.
Exploitations, by contrast, represent the successful use by an attacker of one or more such vulnerabilities. Unless conducted by an authorized party such as an ethical hacker or penetration tester, such events necessarily indicate hostile intent and are generally cause for far greater concern than the identification of a vulnerability alone.
On this note, federal departments and agencies need detailed guidance regarding damage thresholds and timelines for notification of such cyber incidents (or impending ones). The order should lay out quantitative notification criteria based on the dollar value of expected financial loss or—in the worst case—the actual or anticipated number of deaths or injuries. For example, a sophisticated breach of sensitive systems requiring expensive incident response and forensic measures should lead to the immediate notification of senior officials. By contrast, the prompt detection and blocking of an unskilled reconnaissance attempt can probably be reported in a weekly or monthly roll-up of malicious cyber events.
Unfortunately, the existing incident classification schema established by PPD-41 uses qualitative terminology, which the information security community increasingly views as poor practice due to its openness to interpretation. A well-designed successor regime to PPD-41 would base event triggers on numeric damage estimates. Such a clear framework could also serve as a foundation for the private-sector reporting requirements being mulled.
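To illustrate what such a numeric framework might look like in practice, here is a minimal sketch in Python that maps damage estimates to notification tiers. The dollar thresholds, casualty trigger, and tier names are my own illustrative assumptions, not figures drawn from PPD-41 or any draft of the order.

```python
# Hypothetical sketch of threshold-based incident classification.
# All thresholds and tier names below are illustrative assumptions,
# not values taken from PPD-41 or any proposed successor regime.

from dataclasses import dataclass


@dataclass
class IncidentImpact:
    estimated_loss_usd: float   # projected or actual financial damage
    deaths_or_injuries: int     # actual or anticipated casualties


def notification_tier(impact: IncidentImpact) -> str:
    """Map numeric damage estimates to a notification tier."""
    if impact.deaths_or_injuries > 0 or impact.estimated_loss_usd >= 10_000_000:
        return "immediate senior-official notification"
    if impact.estimated_loss_usd >= 100_000:
        return "report within 24 hours"
    return "weekly or monthly roll-up"


# A costly, sophisticated breach triggers immediate notification,
# while a cheaply contained reconnaissance attempt lands in the periodic roll-up.
print(notification_tier(IncidentImpact(estimated_loss_usd=25_000_000, deaths_or_injuries=0)))
print(notification_tier(IncidentImpact(estimated_loss_usd=500, deaths_or_injuries=0)))
```

The specific numbers matter less than the fact that they are numbers: two agencies applying the same function to the same incident will reach the same answer, which a qualitative schema cannot guarantee.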
Finally, the Biden administration should use the issuance of the order as an opportunity to eliminate the accountability by committee that PPD-41 established (via the Cyber Response Group). The president should simply use the order to delegate coordination authority for cyber incident response to the newly established national cyber director position, as another Lawfare author has previously suggested. With such explicit authorization, and assuming he is confirmed, the recently nominated Chris Inglis can direct the relevant federal actors to take appropriate action. Similarly, the president should be wary of creating yet another organization, such as the proposed cybersecurity incident response board that the aforementioned Reuters article has suggested is under consideration. Establishing such an additional body would further cloud the already muddy waters of responsibility for information security in the federal government.
In response to the second category of potential requirements—security mandates levied on government contractors—a coalition of industry groups has already expressed concerns via a letter to the secretaries of commerce and homeland security. I think their hesitance is appropriate, based on some statements the Biden administration has already made publicly. For example, Jeff Greene, acting senior director for cybersecurity at the National Security Council, said that “we’re at the point where the federal government simply can’t bear the risk of buying insecure software anymore.”
This statement implies a plan to set a (currently undefined) security standard below which the government will never consider buying software. Unfortunately, fixating on a single characteristic of a piece of software is not good practice, for either the private sector or the government. As I have stated before, deciding whether to accept information security risk or spend time and money mitigating it must always depend on the countervailing reward to be had by using such software. At times, keeping old and likely insecure software in operation might be the only alternative to shutting down entire systems or programs. A zero-defect mentality with respect to cybersecurity can inflict substantial costs in other domains that, upon review, leaders often decide are not justifiable or appropriate. Additionally, such a binary perspective necessarily implies a “box-checking” mentality rather than a focus on weighing risks and rewards.
To give just one example, the Reuters article suggests the order will require in certain cases that vendors provide the government with software bills of material (SBOMs), identifying all of the components contained therein. Taken by itself, this seems like a sensible requirement, as third-party components (and their dependencies) can introduce serious vulnerabilities into applications. The government, however, should be careful in what it asks for and have a plan for what happens if it gets it. SBOMs can certainly highlight known vulnerabilities in products used in federal IT systems, but a surface-level analysis of a typical SBOM will likely set off unnecessary alarm bells while potentially obscuring true threats.
Due to the nearly ubiquitous use of third-party libraries in enterprise software, most SBOMs will include references to components with dozens or even hundreds of known vulnerabilities. In many cases, deep technical analysis is required to determine if such flaws are actually exploitable. While I have advocated for the establishment of a National Supply Chain Intelligence Center to do just that—at least for software used in critical fields such as defense and intelligence—the absence of the necessary analytic power will likely lead to unwarranted panic when government or contracted engineers pore over hundreds of newly generated and delivered SBOMs.
Additionally, sophisticated adversaries are more likely to take advantage of previously unknown vulnerabilities, using zero-day exploits. The poisoning of SolarWinds’ software—and the subsequent infiltration of a series of federal agencies—may have been the result of the attackers’ use of vulnerabilities that were not known beforehand. Moreover, SolarWinds inadvertently digitally signed the corrupted Orion software responsible for the breach before distributing it to customers. Thus, anyone reviewing an SBOM for Orion—before the public disclosure of the relevant vulnerabilities—would probably have believed everything to be in order.
This is not to say that developing standards for SBOMs is a bad idea or that the government should not require them from vendors. The use of components with known vulnerabilities remains a major cybersecurity threat. My point is that a carefully thought-out strategy for processing and analyzing SBOMs must be in place before any blanket mandate comes into force. Establishing triage and remediation processes in advance to address what will likely be a tidal wave of apparent software flaws will enable more effective, risk-based responses.
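To make the triage point concrete, the following Python sketch shows one way an agency might pre-screen a CycloneDX-style SBOM (a JSON document with a top-level "components" array) against a list of known-vulnerable components before anything reaches an analyst. The file name, the sample component entries, and the severity scores are hypothetical; a real pipeline would pull from a vulnerability feed such as the NVD and would still require the deep exploitability analysis described above.

```python
# Minimal sketch of SBOM pre-screening, assuming a CycloneDX-style JSON
# document with a top-level "components" array. The vulnerability table
# and severity scores are hypothetical stand-ins for a real feed (e.g., NVD).

import json

KNOWN_VULNERABLE = {
    # (package name, version) -> severity score (illustrative values only)
    ("log4j-core", "2.14.1"): 10.0,
    ("commons-text", "1.9"): 9.8,
}


def triage(sbom_path: str) -> list[tuple[float, str]]:
    """Return components with known flaws, highest severity first."""
    with open(sbom_path) as f:
        sbom = json.load(f)
    findings = []
    for component in sbom.get("components", []):
        key = (component.get("name"), component.get("version"))
        if key in KNOWN_VULNERABLE:
            findings.append((KNOWN_VULNERABLE[key], f"{key[0]} {key[1]}"))
    # Sorting by severity lets analysts confirm actual exploitability
    # before anyone escalates, rather than treating every hit as a crisis.
    return sorted(findings, reverse=True)


if __name__ == "__main__":
    # "example_sbom.json" is a placeholder path, not a real deliverable.
    for severity, component in triage("example_sbom.json"):
        print(f"{severity:>5}  {component}")
```

A screen like this only ranks known issues; it says nothing about zero-day flaws of the kind that apparently enabled the SolarWinds compromise, which is precisely why SBOM review must complement, not replace, broader risk management.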
More broadly, the president and his staff should carefully consider any new requirements levied on government contractors. Even well-intentioned and facially proper mandates could potentially be counterproductive—or simply impossible to comply with—if not thoroughly reviewed. Furthermore, no measure by itself is a panacea but rather should form part of a comprehensive defense-in-depth cybersecurity strategy.
The third category of new requirements likely to be included in the order—based on my analysis—relates to private-sector obligations following a cyber incident. Most importantly, federal government vendors will probably need to report identified data breaches proactively, according to the Reuters article. The article also suggests the new order may mandate that such victim organizations cooperate with the Federal Bureau of Investigation and the Cybersecurity and Infrastructure Security Agency when those agencies investigate such incidents. Including these obligations in future purchase agreements between the government and software providers seems fair, and comparable contractual terms between private-sector companies are already common.
With that said, creating a clear standard for what level of cooperation will be necessary and what exactly warrants notification will be vital. In the former case, the order should mandate that contracts explicitly spell out the data retention and preservation obligations of government suppliers, as well as provide confidentiality guarantees and liability waivers for companies that make good-faith efforts to meet them. In the latter case, not every instance of malicious network activity necessarily warrants government notification, and some observers have already noted that this requirement could cause alert fatigue. The order should thus establish clear reporting thresholds—mapped directly to the successor regime to PPD-41—to avoid creating yet another standard.
On the note of duplication, the Securities and Exchange Commission (SEC) requires publicly traded companies (including many government contractors) to file formal public disclosures about “material” cybersecurity incidents. This standard is still somewhat unclear, leading to broad variations in reporting, and may differ from the bar set by the new executive order. The order should therefore direct the SEC to define “materiality” in terms of the precise thresholds spelled out by the PPD-41 successor regime. A publicly traded firm selling software to the government that suffers a breach would then expend fewer resources determining—and complying with—its various reporting obligations. Aligning these regimes would help reduce the patchwork of requirements with which heavily regulated companies must contend, allowing them to focus more on cybersecurity risk management than on redundant compliance-related tasks.
Make no mistake: clear direction from the president on software security is a pressing need. Executive action is a good first step toward setting the tone across the entire federal government and ensuring unity of effort. Preventing the establishment of perverse incentives when it comes to cybersecurity is equally important. Apparently in response to the aforementioned private-sector concerns, the White House wisely appears to be considering a period in which the requirements would be nonbinding, allowing for feedback from industry and academia. Taking into account the views of technology experts will help tailor any such requirements appropriately and allow both the government and the private sector to focus on actual information security risk rather than on complying with yet another set of well-intentioned but onerous requirements.
PTC, the author’s employer and a publicly traded company, sells software to the United States Government. Microsoft is a PTC partner. The views expressed in this article, however, do not necessarily reflect the official policy or position of PTC, Microsoft, or the United States Government.