No, the U.S. Government Should Not Disclose All Vulnerabilities in Its Possession
The WannaCry and Petya malware, both of which are partially based on hacking tools allegedly developed by the National Security Agency, have revived calls for the U.S. government to release all vulnerabilities that it holds. Proponents argue that this would allow patches to be developed, which in turn would help ensure that networks are secure. On its face, this argument might seem to make sense—but it is a gross oversimplification of the problem, one that not only would not have the desired effect but that also would be dangerous.
Attacks like WannaCry and Petya don’t exist in a vacuum. They emerge out of a complex ecosystem whose components include software vendors, integrators, and customizers; those who buy products but fail for various reasons to upgrade; the miscreants who misuse software for illegal purposes; and those who discover vulnerabilities. Let’s look at each component individually.
Developers do their best to write secure code, but complex software almost always has some vulnerabilities. Users agree to buy the software “as is,” and most software companies attempt to patch vulnerabilities as they are discovered, unless the company has declared the software obsolete, as was the case with Windows XP, which WannaCry exploited. When customers buy software, particularly foundational products such as operating systems, they often integrate or customize it to make their internal processes more efficient and competitive. This is powerful stuff for business, but it complicates patching, making updates more expensive and riskier to business operations.
Customers who buy software should expect to have to patch it and update to new versions periodically. That means resources need to be allocated for that purpose. Some customers don’t do that. In other cases, the entanglement of operating systems with business operations via customized software means the customer can’t afford the risk of patching; then, resources must be dedicated to mitigating the risk of unpatched software in the overall architecture. Both lack of resources and operational risk appeared to contribute to the extensive impact WannaCry had on Britain's National Health Service. Finally, some “customers” illegally acquire the software and do not have the ability to patch unlicensed versions.
The actors behind WannaCry and Petya, believed by some to be from North Korea and Russia, respectively, had specific goals when they unleashed their attacks. WannaCry seemed to be straightforward but poorly executed ransomware, while Petya appeared to have a more sinister, destructive purpose, especially given its initial Ukraine-focused infection vector. Those actors probably would have used whatever tools were available to achieve their goals; had those specific vulnerabilities not been known, they would have used others. The primary damage caused by Petya resulted from credential theft, not from an exploit.
A host of entities attempt to discover vulnerabilities in software. These include developers themselves, who work to improve their products; others looking for “bug bounties” paid by software companies; and security researchers. In these cases, the vulnerabilities are disclosed to developers so they can create a patch, hopefully before the vulnerability is discovered by someone else. Another group looking for vulnerabilities is criminals seeking new ways to penetrate systems and make money; they do not disclose vulnerabilities but instead use them for as long as they can. The last group is government entities, either information security professionals who want to ensure the products they use or recommend to others are secure, or others who want to use the vulnerability for law enforcement, intelligence or military purposes. Most of the vulnerabilities discovered by the U.S. government are disclosed, and at the National Security Agency the percentage of vulnerabilities disclosed to relevant companies has historically been over 90 percent. This is atypical, as most world governments do not disclose the vulnerabilities they find.
WannaCry and Petya exploited flaws in software that had either been corrected or superseded, on networks that had not been patched or updated, by actors operating illegally. The idea that these problems would be solved by the U.S. government disclosing all vulnerabilities in its possession is at best naive and at worst dangerous. Such disclosure would be tantamount to unilateral disarmament in an area where the U.S. cannot afford to be unarmed. Computer network exploitation tools are used every day to protect U.S. and allied forces in war zones, to identify threats to Americans overseas, and to isolate and disrupt terrorist plots directed against our homeland and other nations. It is no exaggeration to say that giving up those capabilities would cost lives. And this is not an area in which American leadership would cause other countries to change what they do. Neither our allies nor our adversaries would give away the vulnerabilities in their possession, and our doing so would probably cause those allies to seriously question our ability to be trusted with sensitive sources and methods.
Some have suggested that the U.S. government losing control of its software tools would be tantamount to losing control of Tomahawk missiles, with the weapons in the hands of criminal groups threatening to use them. While the analogy is vivid, it incorrectly places all fault on the government. A more accurate rendering would be a missile in which the software industry built the warhead (the vulnerabilities in its products), the industry's customers built the rocket motor (by failing to upgrade and patch), and the ransomware authors supplied the guidance system.
Malicious software like WannaCry and Petya is a scourge in our digital lives, and we need to take concerted action to protect ourselves. That action must be grounded in an accurate understanding of how the vulnerability ecosystem works. Software vendors need to continue working to build better software and to provide patching support for software deployed in critical infrastructure. Customers need to budget and plan for upgrades as part of the going-in cost of IT, or for compensatory measures when upgrades are impossible. Those who discover vulnerabilities need to disclose them responsibly or, if the vulnerabilities are retained for national security purposes, to safeguard them adequately. And intelligence, law enforcement and industry need to work in partnership to identify and disrupt the actors who use these vulnerabilities for criminal and destructive ends. No single set of actions will solve the problem; we must work together to protect ourselves. As for blame, we should place it where it really lies: on the criminals who intentionally and maliciously assembled this destructive ransomware and released it on the world.
Rick Ledgett was deputy director of the National Security Agency from January 2014 through April 2017. The views expressed herein are his, and not necessarily those of the NSA.