
Apple's Going Dark Doublespeak

Susan Hennessey
Wednesday, February 3, 2016, 9:37 AM


“Going Dark” is really two separate, if related, debates. Apple is looking to be the voice of the tech industry on both. The problem is that the company can’t seem to keep its story straight, and it is adopting divergent and incompatible positions in the two.

There is, first off, the technological debate surrounding encryption and governmental access. This debate is both complicated and necessary. It implicates the prioritization of online security versus physical security, core notions of privacy, and potentially the very foundations of the internet itself. It’s a hard question—and the correct outcome still isn’t clear, to my mind at least. But it would appear that everyone is fundamentally on the same side. We all want to stop bad guys. We all want to keep good guys safe online. Reasonable minds differ only on how to accomplish one without doing so at the expense of the other.

Apple has best articulated its position in this debate in its submission on the U.K. Investigatory Powers Bill. There, Apple notes its “long history of cooperating with the U.K. government on a wide range of important issues.” “Apple is deeply committed to protecting the public safety and shares the Government’s determination to combat terrorism and other violent crimes.”

Apple’s core objection to the type of cooperation the U.K.—and governments elsewhere—seek in obtaining exceptional access is that Apple is trusted by customers with “their most personal information” and allowing government access to this data necessarily means opening up customers to risks from “[i]ncreasingly sophisticated hacking schemes and cyber-attacks.” Apple believes that strong encryption is “the best way to protect against these threats.” “We are committed to doing everything in our power to create a safer and more secure world for our customers. But it is our belief this world cannot come by sacrificing personal security.”

This is an entirely reasonable position for a leading technology company to take, and the U.S. and like-minded governments must grapple with this perspective in charting the future course. The government cannot rely on magical thinking—often derided as the “math harder” model—to wish away representations by cryptographers and industry leaders regarding the limits of technology. Of course, that doesn’t end the debate. Citizens collectively, through elected representatives, get to dictate our security values; Silicon Valley mega-corporations do not. But, as we make those decisions, all facts are friendly. And the tech industry is well-positioned to know the tech facts.

But there is another Going Dark debate—one that has basically nothing to do with technology. This debate is harder to define but, generally speaking, it deals with the appropriate scope of government access to data, irrespective of technological implications.

Apple is entering this debate as well. As has been widely reported, in October Apple challenged a court order requiring it to unlock the iPhone of a (now-confessed) methamphetamine dealer in order to execute a search warrant. In that case, the device ran iOS 7, and Apple not only had the technical capability to unlock the phone but had done so on approximately 70 prior occasions. This time Apple objected on a number of grounds, most notably that complying with the government’s order would cause “reputational damage” that would tarnish the brand:

[P]ublic sensitivity to issues regarding digital privacy and security is at an unprecedented level. This is true not only with respect to illegal hacking but also in the areas of government access—both disclosed and covert. Apple has taken a leadership role in the protection of its customers’ personal data against any form of improper access. Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand. This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue.

It is difficult to square Apple’s position here with the one it takes in the U.K. In the U.K., Apple restates a broad and abiding commitment to assisting law enforcement pursuant to legal process and protections. It objects to legislative directives that might require subverting security standards. But in the New York case, the assistance requires no backdoors, no increased system complexity, no heightened technical risks whatsoever. And yet, Apple argues that complying with the court order hurts its image. In other words, the broad and abiding commitment “to doing everything in our power to create a safer and more secure world for our customers” does not extend to assisting federal authorities—who have obtained a search warrant that incontestably complies with all constitutional and legal mandates—in investigating a drug trafficker.

Of course, Apple is not so bold as to claim outright that complying with legal process results in reputational harm. Rather, Apple claims that doing so absent “clear legal authority” would threaten the trust relationship with its customers. But this gets to the heart of the truly remarkable claim Apple is advancing: that a search warrant issued by a federal judge—under circumstances Apple deemed sufficient at least 70 times prior—not only fails to qualify as “clear legal authority” but that it is so far from clear legal authority that to comply would undermine Apple’s credibility.

All of this raises two questions: What does Apple view as necessary to preserve its reputation with customers? And how does Apple’s view of brand protection influence the Going Dark debates generally?

There are a few basic approaches a technology company might take to sharing customer information with law enforcement. It can voluntarily share customer information, it can share information only when compelled, it can challenge legal compulsion at every available instance, or it can seek to actively thwart law enforcement access technologically or by outright refusing to comply with an order.

Certainly, there is some reputational harm to tech companies in being seen as in bed with law enforcement. If consumers perceive that giving their personal data to a company is tantamount to handing it to the government, they will take their business elsewhere. We’ve seen these fears play out in the debate surrounding recent cybersecurity information-sharing legislation.

Traditionally, technology companies have managed the legal risks and reputational hazards of law enforcement relationships by adopting broad policies of complying only with court orders. The default response to law enforcement requests is “come back with a warrant.” This permits companies to defend—and be perceived as defending—their customers’ rights and information, and it insulates them from at least some of the legal risks of sharing. The approach shifts responsibility for determining appropriate sharing away from companies and onto the legal and political process. Companies protect customers by preventing law enforcement from circumventing legal process, but they do not obstruct that process either.

Congress accommodates this policy by passing targeted legislation mandating sharing; for example, statutes that require reporting evidence of child pornography, or proposed measures extending those rules to terrorist activity. And for their part, companies smooth the way by publishing legal guidance for complying with law enforcement requests.

But Apple’s behavior in EDNY presents a heightened conception of what is required in order to preserve reputation. Requiring legal process is insufficient; rather, a company is now apparently obligated to challenge court orders on behalf of customers wherever there is even a remotely plausible case to be made.

The defendant in the EDNY case pled guilty before a decision was rendered, though the government has argued the issue is not moot. But whatever the strength of those legal arguments—a reinvention of the scope of the All Writs Act paired with a new interpretation of CALEA—the fact is that these arguments are novel even to Apple. The company has considered precisely this legal process sufficient in many dozens of documented cases. So what is new here, that Apple would suddenly seek to litigate a new interpretation, especially when technological advances are likely months away from rendering the question entirely moot?

In the wake of Snowden, a kind of anti-government publicity arms race has emerged in Silicon Valley. Companies have rushed to disavow any perception of cooperation with law enforcement and to adopt positions overtly contrary to any kind of surveillance. For example, Mozilla now directs users to tips to “get smart” on avoiding government surveillance when using the company’s web browser. In EDNY and elsewhere, Apple takes anti-government publicity to a new level. Apple CEO Tim Cook has emerged as the most vocal opponent of Going Dark efforts, issuing numerous statements and criticizing the White House for its “lack of leadership.” Tellingly, Apple’s U.K. submission was circulated to media outlets weeks prior to its public release.

Apple may sincerely oppose government surveillance, for reasons unrelated to its economic interests. Certainly, Apple is not motivated by some desire to protect child pornographers and terrorists. Reasonable people, with reasonable motivations, believe governments should have less, not more, access to information. But the fact remains that Apple’s express and essential position is that preserving its business interest requires that it vocally and legally oppose government access for the benefit of consumer perception. This fact is consequential, both as the U.S. and U.K. consider Apple’s input on surveillance and encryption debates and also as Silicon Valley companies decide who, in fact, speaks for their industry.

Apple’s assertion of reputational harm in EDNY makes its position on the U.K. bill less than forthright, assuming the two stances are not simply fundamentally incompatible. In the U.K., Apple specifically objects to Clause 31 of the proposed Investigatory Powers Bill, which would require that “a relevant operator take all reasonable steps for giving effect to a warrant.” Apple alleges that this language will be read such that companies would be obligated to remove end-to-end encryption, and it seeks a clarification distinguishing “reasonably practicable steps” from “technical feasibility.” But Apple frames this as a technical objection, and it reiterates its general commitment to assisting law enforcement and the importance of doing so.

The problem is that the language and effect of the U.K. provision are approximately equivalent to those of the All Writs Act. Under 28 U.S.C. §1651, “[t]he Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.” Under both provisions, pursuant to legal process, individuals and companies must assist law enforcement in carrying out a court-issued warrant. How, then, does Apple consistently represent in the U.K. that it agrees with the principles underlying the provision—the obligation to assist law enforcement—while at the same time arguing in the U.S. that complying with nearly identical process amounts to reputational harm? The notion that Apple’s objection in New York hinges on a newly discovered interpretation of the interaction between CALEA and the All Writs Act simply strains belief; that argument, considered in light of the simultaneously expressed publicity benefits, outright breaks it.

A more honest submission to the U.K. would say that Apple is happy to assist law enforcement, and will do so, but only after exhausting every available legal challenge to a court order, no matter how farfetched. And there isn’t anything necessarily wrong with that. But that is materially different from the position set forth, and the shift is consequential to the Going Dark debate generally. The logical extension of preserving reputation through opposition to government access is either to refuse to comply with court orders or to create the conditions that make complying with court orders impossible.

There’s no indication Apple did not intend to comply with the ultimate EDNY judgment. However, there are examples of companies adopting precisely this approach and capitalizing on the publicity. For example, WhatsApp was suspended in Brazil for refusing to comply with a court order there. In the U.S., Yahoo faced daily fines for refusing to comply with a FISC order. While Yahoo litigated against the government in secret, when it finally secured public release of the documents, it trumpeted its efforts on behalf of users: “The released documents underscore how we had to fight every step of the way to challenge the US government’s surveillance efforts[.]”

Apple is creating and intends to preserve, as it stated in its U.K. filing and elsewhere, technological conditions blocking lawful governmental access. One can both accept all of Apple’s technical opposition to backdoors—that the company genuinely believes, as many experts do, that any modification to encryption technology compromises security—and still recognize that this position is also an extension of its business model. Apple bests its competitors by being able to tell its customers, “Not only do we oppose and actively fight the government but even if we wanted to, there’s no way we could help law enforcement.” And this model requires being perceived as offering no constructive cooperation, technical or otherwise.

There was plenty of outrage in response to FBI Director Jim Comey’s statement to the Senate Judiciary Committee that he is convinced that encryption “is actually not a technical issue, it is a business model question.” But the “business model” at issue is not only the model of adopting strong encryption, it’s also a public relations strategy that requires Apple be perceived as oppositional to law enforcement. This is the business model that requires that Tim Cook berate (and publicize that he’s done so) a high-level government envoy seeking to engage Silicon Valley leaders in a conversation. And this business model allows no conversation, no engagement, and no compromise—technologically or otherwise. It is a business model that American consumers have to accept or reject—entirely apart from where we ultimately come down on the security question of encryption and governmental access.

Apple is a U.S. company that avails itself of the protections of U.S. law and seeks law enforcement assistance when it is the victim of a crime. Some would argue—and the Department of Justice does argue—that it would be antithetical to “basic civic responsibility” to permit Apple to assert reputational harm as a shield from complying with the same laws from which it derives benefits. But plenty of others support Apple’s approach and values; they encourage technology companies to “decline the invitation to join the national security state” and otherwise advocate for limiting governmental access to personal information.

But ideological commitments should not be confused with scientific arguments.

It’s a losing proposition to dismiss outright Silicon Valley’s significant business equities at stake in Going Dark. Reputational harm matters. But, recognizing the technological realities, reasonable minds still differ as to what our social contract requires of corporations. In a December op-ed in the Washington Post, SSCI Chairman Richard Burr called for “Congress and technology companies to discuss how encryption—encoding messages to protect their content—is enabling murderers, pedophiles, drug dealers, and, increasingly, terrorists.” He noted that lawmakers want to work with leading technology companies but fear the companies may balk.

It’s not entirely clear whether, deep down, Apple objects to exceptional access for the government only because, as it claims in its U.K. submission, such access will “weaken security for hundreds of millions of law-abiding customers.” It may believe its own statement that the information it provides to law enforcement helps “catch criminals and save lives,” and also that weakening encryption still comes at too high a cost. The depth of its commitment to these ideals may eventually be revealed by Apple’s statements on these matters in Mandarin—as it works frantically to break into the Chinese market—and by its representations to China’s government. Does Apple intend to take its EDNY position there, when authorities demand access to phones based on Chinese legal process? And if the Chinese decline to follow a U.S. example and—despite the impassioned but dubious insistence by privacy advocates that Shanghai will take up the U.S. model—elect to adhere to legislation mandating precisely the kind of backdoors that threaten the fundamentals of encryption security, will Apple leave that market?

What is clear is that Senator Burr’s fear that companies like Apple will balk at any overtures—even on potential non-technical solutions—has certainly come to pass. And that reality is far more in line with Apple’s assertion in U.S. district court that the perception that it is doing anything less than all it can to oppose governmental access damages its street cred. That’s an altogether different matter than opposing weakened encryption standards—and everyone is better served by keeping Apple’s various positions straight.


Susan Hennessey was the Executive Editor of Lawfare and General Counsel of the Lawfare Institute. She was a Brookings Fellow in National Security Law. Prior to joining Brookings, Ms. Hennessey was an attorney in the Office of General Counsel of the National Security Agency. She is a graduate of Harvard Law School and the University of California, Los Angeles.
