Testing Encryption Insecurity: A Modest Proposal
Co-blogger Susan Landau has yet another thoughtful post on the insecurity of back door encryption requirements -- what she calls mandating insecurity. She calls it magical thinking, which is perhaps a bit harsh. But her point is well taken -- one ground for opposing mandated back doors is, as I've said, that they seem to be technologically infeasible to maintain only for government access.
So here is a thought experiment -- a useful, immaculate back door would, in theory, be one that is openable by the government and only by the government. In other words, one that could not be broken even if the methodology for how it was created and implemented were publicly known. We actually have a real-world model for that construct -- public key encryption itself. The solution is robust because the "how" of it (the mathematics) can be widely published and yet particular applications of it remain impenetrable -- cracking private messages to me encrypted with my public key remains, for all practical purposes, impossible without my private key.
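To make that point concrete, here is a minimal sketch using Python's third-party cryptography package -- my choice of illustration, not anything Susan or the agencies have proposed. The RSA-OAEP scheme it uses is completely public, yet a message encrypted under my public key can be recovered only by the holder of the matching private key.

```python
# Sketch: the algorithm is published for all to see, but only the private-key
# holder can open messages encrypted to the corresponding public key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Generate a key pair; the public half can be handed to anyone.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Anyone who knows the (published) scheme and my public key can encrypt to me...
ciphertext = public_key.encrypt(
    b"for the key holder's eyes only",
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)

# ...but only the holder of the private key can decrypt the result.
plaintext = private_key.decrypt(
    ciphertext,
    padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    ),
)
assert plaintext == b"for the key holder's eyes only"
```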
So I propose a simple rule (perhaps even one that Congress could enact into law) -- encryption providers may be required to adopt a government-sponsored "back door" technology if, and only if, the methodology for that technology has been publicly available for more than 12 months and no effort to subvert or defeat it has succeeded. NIST gets to judge success. That way, if the NSA/FBI have a real solution that can withstand public scrutiny (and, I assume, sustained attack), they can use it. Absent that ... the risks outweigh the rewards.
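For what it is worth, the rule's "if, and only if" structure is simple enough to state as code. The sketch below is purely illustrative -- the function name, the 365-day threshold, and the "NIST-confirmed breaks" counter are placeholders of my own, standing in for whatever statutory language and NIST process would actually apply.

```python
# Hypothetical restatement of the proposed rule, not an official test.
from datetime import date, timedelta

def may_mandate_backdoor(published_on: date, nist_confirmed_breaks: int, as_of: date) -> bool:
    """The proposed test: publicly available for more than 12 months AND
    no successful subversion effort, with NIST judging success."""
    public_long_enough = (as_of - published_on) > timedelta(days=365)
    still_unbroken = nist_confirmed_breaks == 0
    return public_long_enough and still_unbroken

# A methodology public for 18 months with no confirmed breaks would qualify;
# one that has been broken even once would not.
print(may_mandate_backdoor(date(2015, 1, 1), 0, date(2016, 7, 1)))  # True
print(may_mandate_backdoor(date(2015, 1, 1), 2, date(2016, 7, 1)))  # False
```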