Haste, Waste and Choice

Dan Geer
Saturday, January 6, 2018, 7:22 PM


Four years ago, there was the Heartbleed problem, a common-mode failure among products that shared a single, flawed implementation of a networking standard and so were vulnerable to attack by way of the very code they had in common. Early in the discussion of it here on Lawfare, this paragraph appeared:

Recent headlines have been all about a coding error in a core Internet protocol that got named “Heartbleed.” It is serious. It was hiding in plain view. If it wasn't exploited before its announcement, it most certainly has been after. It is hard to fix.

But that's just software, right? The central beauty of software is its impermanence. Software is to hardware as language is to vocal cords: Just as the cure for speech you don't like is more speech, the cure for software you don't like is more software.
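The coding error in question was, concretely, a single missing length check in OpenSSL's handling of the TLS heartbeat extension. A minimal sketch of the pattern (CVE-2014-0160) follows; the names are illustrative, not the actual OpenSSL source:

    #include <string.h>

    /* A minimal sketch of the Heartbleed pattern (CVE-2014-0160); names are
     * illustrative and this is not the actual OpenSSL source. The heartbeat
     * message carries a payload length chosen by the peer, and the flawed
     * code echoed that many bytes back without checking the claim against
     * the size of the record actually received. A tiny message claiming a
     * large payload therefore caused up to 64 KB of adjacent heap memory,
     * possibly holding keys or session data, to be copied into the reply. */
    void build_heartbeat_reply(const unsigned char *payload,
                               size_t claimed_len,  /* attacker-controlled length field */
                               size_t record_len,   /* bytes actually received on the wire */
                               unsigned char *reply)
    {
        /* The missing check: if (claimed_len > record_len) the message
         * should be dropped. Without it, the copy below over-reads the
         * heap well past the real payload. */
        (void)record_len;
        memcpy(reply, payload, claimed_len);
    }

The repair was just as small, a bounds check delivered as a software patch, which is exactly the quoted point: the cure for software you don't like is more software.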

Hardware, by contrast, is the ground truth of a computing system. Old coders would talk about such and such a program as “running on bare metal”—meaning that the only thing going on was the execution of code those old coders wrote themselves. We've gotten away from that model except in the security community, where the very permanence of hardware remains tantalizing: in that line of thought, security rooted in the hardware itself is not subject to subversive exploitation.

But sometimes the line between hardware and software is not a bright line; sometimes it depends upon what the words mean in context. And so we come to two methods of attack that fall right on the gray boundary between hardware and software, the fraternal twins Meltdown and Spectre. How they work, much less how they differ, is irrelevant here except for their position on that boundary, the boundary where the demands of software steer the design of hardware. The last paragraph in the announcement of Spectre reads like this:

The vulnerabilities in this paper, as well as many others, arise from a longstanding focus in the technology industry on maximizing performance. As a result, processors, compilers, device drivers, operating systems, and numerous other critical components have evolved compounding layers of complex optimizations that introduce security risks. As the costs of insecurity rise, these design choices need to be revisited, and in many cases alternate implementations optimized for security will be required.

And there is the crux of the matter, both for technologists and for policy makers: What do we prioritize? We know, and have long known, that optimality and efficiency are the enemies of robustness and resilience. The payback on optimality and efficiency is quantitative, calculable, and central to short-term survivability. The payback on robustness and resilience is qualitative, inestimable, and central to long-term survivability. The field of battle is this: All politics is local; all technology is global.
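To make that trade concrete: the optimization the Spectre authors have chiefly in mind is speculative execution, hardware machinery that exists solely to make software run faster. Below is a sketch of the bounds-check-bypass pattern (Spectre variant 1), in the spirit of the example in the Spectre paper; the array names and sizes here are illustrative:

    #include <stdint.h>
    #include <stddef.h>

    /* Illustrative sketch of the Spectre variant-1 (bounds-check-bypass)
     * pattern; names and sizes are hypothetical. The bounds check is correct
     * as written, yet a processor that speculates past it may execute the
     * body with an attacker-chosen out-of-bounds x before the check
     * resolves. The dependent load from array2 leaves a data-dependent line
     * in the cache that survives the rollback, so the value of array1[x] can
     * later be recovered with a cache-timing probe even though it was never
     * architecturally disclosed. */
    uint8_t array1[16];
    uint8_t array2[256 * 512];   /* one well-separated slot per possible byte value */
    size_t  array1_size = 16;
    volatile uint8_t sink;

    void victim_function(size_t x)
    {
        if (x < array1_size) {            /* software's defense: a bounds check */
            uint8_t value = array1[x];    /* speculative out-of-bounds read     */
            sink = array2[value * 512];   /* encodes the value into cache state */
        }
    }

The check is right; the performance machinery underneath it is what gives the secret away. That is the sense in which optimality and efficiency work against robustness: the flaw is not in the code but in the design choice the code relies on.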

In the most weighty matters, the question of greatest import is all but always “What did they know and when did they know it?” followed by “And what did they then do?” Meltdown and Spectre noisily raise those very questions, and it is time we answered not something so ultimately trivial as what to do about this or that flaw we just now know about, but what we want to do about our vulnerability to flaws we don't yet know about. I have written a forthcoming paper for the Hoover Institution's Aegis Paper Series that discusses this in more detail.


Dan Geer has a long history. Milestones: The X Window System and Kerberos (1988), the first information security consulting firm on Wall Street (1992), convenor of the first academic conference on mobile computing (1993), convenor of the first academic conference on electronic commerce (1995), the “Risk Management Is Where the Money Is” speech that changed the focus of security (1998), the presidency of USENIX Association (2000), the first call for the eclipse of authentication by accountability (2002), principal author of and spokesman for “Cyberinsecurity: The Cost of Monopoly” (2003), co-founder of SecurityMetrics.Org (2004), convener of MetriCon (2006-2019), author of “Economics & Strategies of Data Security” (2008), and author of “Cybersecurity & National Policy” (2010). Creator of the Index of Cyber Security (2011) and the Cyber Security Decision Market (2012). Lifetime Achievement Award, USENIX Association, (2011). Expert for NSA Science of Security award (2013-present). Cybersecurity Hall of Fame (2016) and ISSA Hall of Fame (2019). Six times entrepreneur. Five times before Congress, of which two were as lead witness. He is a Senior Fellow at In-Q-Tel.
