
The EARN IT Act Raises Good Questions About End-to-End Encryption

Stewart Baker
Tuesday, February 11, 2020, 2:53 PM

Critics are right that the draft legislation from Sens. Lindsey Graham and Richard Blumenthal could affect the deployment of end-to-end encryption. But the bill makes sense as social policy.

Sen. Lindsey Graham, one of the co-sponsors of the EARN IT act. (Gage Skidmore, https://tinyurl.com/yx5xe3rs; CC BY 2.0, https://creativecommons.org/licenses/by-sa/2.0/)

Published by The Lawfare Institute
in Cooperation With
Brookings

The battle between Congress and Silicon Valley has a new focus: the EARN IT Act of 2019. (The title is an embarrassing backronym that I refuse to dignify by repeating; you can safely ignore it.) A “discussion draft” of the bill is attributed to Republican Sen. Lindsey Graham and Democratic Sen. Richard Blumenthal. To hear the Electronic Frontier Foundation (EFF), the Center for Internet and Society, and Gizmodo tell it, the draft EARN IT Act is an all-out assault on end-to-end encryption.

The critics aren’t entirely wrong about the implications of EARN IT for encryption—but they’ve skipped a few steps. To understand the controversy, it’s useful to start with the structure of the bill. That will show how EARN IT could affect the deployment of end-to-end encryption—and why the draft bill makes sense as social policy.

The central change made by the bill is this: It would allow civil suits against companies that recklessly distribute child pornography. It would do this by taking away a piece of their immunity from liability when transmitting their users’ communications under Section 230 of the Communications Decency Act (CDA).

Originally added to the CDA as part of a legislative bargain, Section 230 was one of the best deals tech lobbyists ever struck. The CDA was a widely popular effort to restrict internet distribution of pornography to minors. Tech companies couldn’t stop the legislation but feared being held liable for what their users did online, so they agreed not to fight the CDA in exchange for Section 230. The next year, the law’s measures to protect children online were ruled unconstitutional, leaving the industry protections in Section 230 as more or less the only operative provision in the entire CDA.

Both Section 230 and the content monitoring it protects have become increasingly controversial in recent years. Conservatives believe they have been unfairly targeted for arbitrary deplatforming and demonetization; liberals complain that there are still too many right-wing proselytizers and rabble-rousers on the internet. Fewer and fewer people are happy that Section 230 gives tech companies broad discretion to do what they please about user content.

But the most recent attacks on Section 230 have been more focused. Congress first took aim at sites, like Backpage, that were accused of knowingly accepting ads that fostered prostitution and sex trafficking. In 2018, Congress enacted the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which provided that internet companies could be held liable for assisting in sex trafficking if they “knew or should have known” what their customers were doing. 18 U.S.C. § 1595; 47 U.S.C. § 230(e)(5).

Silicon Valley’s advocates hated FOSTA, but their resistance didn’t matter. Times had changed. It was hard to argue that the big platforms couldn’t afford to monitor their users’ content. Much of the industry caved rather than look soft on sex trafficking. The bill passed with overwhelming bipartisan support.

This brings us to the EARN IT Act, which would treat child pornography more or less the way FOSTA treated sex trafficking content—by lifting the immunity of companies that don’t take reasonable measures to prevent its distribution. It’s a surprise that EARN IT came second to FOSTA. Sex work has defenders on the left and the libertarian right, after all, but no one defends child pornography. What’s more, ads touching on sex and companionship have a good deal more First Amendment appeal than child sexual abuse images, which are after all direct evidence of crime. But now that FOSTA has shown the way, it’s no surprise that a similar attack on child pornography has gained traction.

In seeking to counter this proposal, Silicon Valley and internet freedom advocates have moved away from the First Amendment paeans and “don’t break the internet” memes they used in their unsuccessful fight against FOSTA. Rather, they’re arguing that EARN IT will hurt end-to-end encryption.

How so? The short answer is that EARN IT imposes civil liability on companies that distribute child pornography “recklessly.” Victims—presumably the individuals whose images are being circulated—could sue online platforms that recklessly ignored the exchange of child pornography on their services. By itself, that rule is hard to quarrel with. Recklessness requires more than simple negligence. A party is reckless if he deliberately ignores a harm that he can and should prevent.

For anyone who has defended a tort case, being reassured that the jury has to find your client acted recklessly is cold comfort. You don’t know what facts you’ll be accused of ignoring. You don’t know what emails may have been sent to even low-level employees in your company. You don’t know what precautions you’ll be accused of failing to take. And you don’t know when a trawl through the company’s email servers will show that someone somewhere treated the problem with insensitivity.

To address that fear, EARN IT offers a safe harbor to companies that follow best practices in addressing child pornography. Notably, this is more protective of social media defendants than FOSTA and not strictly necessary for those willing to trust the good sense of American juries. Nonetheless, the bill creates a commission to spell out the safe harbor best practices. To further protect industry, 10 of the 15 members of the commission must agree on the best practices, and six of the 15 must have worked at a computer service or in computer science, giving the commission members with a technical background a blocking minority. To protect the government’s interests, the attorney general is given authority to review and modify, with reasons, the best practices endorsed by the group. Companies that certify compliance with the best practices cannot be sued or prosecuted under federal child pornography laws. Companies that disagree with the best practices that emerge from this process can still claim the safe harbor if they implement “reasonable measures … to prevent the use of the interactive computer service for the exploitation of minors.”

To see what this has to do with encryption, just imagine that you are the CEO of a large internet service thinking of rolling out end-to-end encryption to your users. This feature provides additional security for users, and it makes your product more competitive in the market. But you know it can also be used to hide child pornography distribution networks. After the change, your company will no longer be able to thwart the use of your service to trade in child pornography, because it will no longer have visibility into the material users share with one another. So if you implement end-to-end encryption, there’s a risk that, in future litigation, a jury will find that you deliberately ignored the risk to exploited children—that you acted recklessly about the harm, to use the language of the law.

In other words, EARN IT will require companies that offer end-to-end encryption to weigh the consequences of that decision for the victims of child sexual abuse. And it may require them to pay for the suffering their new feature enables.

I don’t doubt that this will make the decision to offer end-to-end encryption harder. But how is that different from imposing liability on automakers whose gas tanks explode in rear-end collisions? If the gas tank makes its cars cheaper or more popular, the company will get the benefit; but now it will also have to pay damages to the victims of the explosions. That makes the decision to offer risky gas tanks harder, but no one weeps for the automaker. Imposing liability puts the costs and benefits of the product in the same place, making it more likely that companies will act responsibly when they introduce new features.

There is nothing radical about EARN IT’s proposal, except perhaps for the protections that the law still offers to internet companies. Most tort defendants don’t get judged on a recklessness standard. Juries can award damages if the defendant was negligent, a much lower bar. Indeed, in many contexts, the standard is strict liability: If your company is best able to minimize the harm caused by your product, or to bear its cost, the courts will make you the insurer of last resort for all the harm your product causes, whether you are negligent or not. Compared to these commonplace rules, Section 230 remains a remarkably good deal, with or without EARN IT.

But critics of EARN IT have focused on the role the bill provides for the attorney general. Because Attorney General William Barr could modify the best practices recommended by the commission, critics say, Barr might unilaterally declare that end-to-end encryption is never a best practice when dealing with the scourge of child pornography. That would effectively prohibit end-to-end encryption, or so the argument goes.

There’s one problem with this: EARN IT doesn’t impose liability on companies that fail to follow the commission’s best practices. They are free to ignore the recommendations of the commission and the attorney general, which are after all just a safe harbor, and they will still have two ways to avoid liability. First, they can still claim a safe harbor as long as they adopt “reasonable measures” to prevent child sex exploitation. Second, they can persuade a jury that their product design didn’t recklessly allow the spread of child pornography.

The risk of liability isn’t likely to kill encryption or end internet security. More likely, it will encourage companies to choose designs that minimize the harm that encryption can cause to exploited kids. Instead of making loud public arguments about the impossibility of squaring strong encryption with public safety, their executives will have to quietly ask their engineers to minimize the harm to children while still providing good security to customers—because harm to children will in the long run be a cost the company bears. That, of course, is exactly how a modern compensatory tort system is supposed to work. Such systems have produced safer designs for cars and lawn mowers and airplanes without bankrupting their makers.

Maybe it’s time to apply the same rules to internet products and services.

The views expressed here do not reflect those of my firm or clients.


Stewart A. Baker is a partner in the Washington office of Steptoe & Johnson LLP. He returned to the firm following 3½ years at the Department of Homeland Security as its first Assistant Secretary for Policy. He earlier served as general counsel of the National Security Agency.
