
Congress Should Reform Section 230 in Light of the Oral Argument in Gonzalez

Mark MacCarthy
Tuesday, March 21, 2023, 12:09 PM

The Supreme Court appears unlikely to significantly alter immunity for digital platforms.

The Supreme Court heard oral arguments on Gonzalez v. Google in February of 2023. (Jarek Tuszynski, https://tinyurl.com/3dsbnc3h; CC BY-SA 3.0, https://creativecommons.org/licenses/by-sa/3.0/deed.en)

Published by The Lawfare Institute
in Cooperation With
Brookings

The plaintiff in Gonzalez v. Google is seeking a ruling that Section 230’s grant of immunity to internet companies does not bar suits for aiding and abetting terrorism when a social media company recommends terrorist material. Based on the Supreme Court’s recent oral argument in this case, the prevailing consensus is that the Court is unlikely to significantly alter the current broad interpretation of Section 230. In short, as argued in these pages back in February, it appears that the Court may have gotten cold feet about “breaking the internet.”

But not everyone is in full agreement on the matter. Stewart Baker’s review of the argument, for example, suggests that five to seven of the justices—a clear bipartisan majority—were skeptical of Google’s argument in favor of retaining Section 230 immunity as currently understood. Gonzalez won’t win his case, Baker speculates, but Google won’t prevail either. He thinks the Court will find a way to send the case back to the lower courts with implicit or explicit instructions to fix the mess they have created. 

During oral argument, the justices expressed concern that the current interpretation of Section 230 is implausibly broad, as it seems to allow an online company to participate in wrongdoing knowingly and intentionally, provided only that it uses its users’ content to do it. Under this broad interpretation, the provision would give companies immunity for promoting their users’ illegal content, even if the underlying law assigns them a duty to keep their systems free of this illegal material. The justices raised numerous hypotheticals showing that this could not have been the congressional intention. 

Justice Ketanji Brown Jackson, for instance, seemed to be channeling an amicus brief filed by law scholars Mary Anne Franks and Danielle Citron on behalf of the Cyber Civil Rights Initiative: Throughout the argument, Jackson questioned how a statute aimed at providing liability relief for companies that go out of their way to remove harmful and illegal material should somehow come to mean that internet companies are fully protected from liability when they actively promote illegal material.

But the justices also seemed reluctant to adopt the position espoused in the Department of Justice’s amicus brief, which argued that Section 230 immunity applies only when a company hosts or passively distributes user content. As an interpretation of the text of Section 230, this might be plausible. The clear intent in 1996 was to insulate online content distributors from being treated as the publisher of online content in the same way that the Supreme Court’s 1959 Smith v. California decision protected bookstores from publisher liability for selling obscene books. 

But this narrow interpretation would overturn decades of interpretation of Section 230, which—starting with the U.S. Court of Appeals for the Fourth Circuit’s decision in Zeran v. America Online in 1997—has held that Section 230 immunizes even knowing distribution of illegal material and applies beyond hosting or passively distributing user material to include any further activity that might resemble traditional publishing. The restriction of Section 230 to hosting would also render it a nullity, since almost all online content is ranked, sorted, personalized, amplified, recommended, or downgraded. The result would be a flood of new litigation and excessive removal of protected speech from online platforms.

So, the Supreme Court seemed stuck between Scylla and Charybdis. The more the justices steered away from the whirlpool of unlimited litigation, the closer they approached the ravenous beast of absolute social media immunity. Nor was there any indication that a middle ground between the two extremes would be acceptable to at least five justices—a reality giving rise to the common wisdom that the Court will probably reject the Gonzalez claim in some fashion and the strong likelihood that it will send a message to the lower courts or Congress to find a way out of this dilemma.

A Notice Liability System

When the Court reaches its likely decision to leave the current system largely intact, what will Congress do? Would it be reluctant to act when industry is satisfied with the status quo? 

The odds are always against Congress acting, especially when it faces determined resistance from the affected industry. But Congress has already focused on reform of Section 230. On March 8, the Subcommittee on Privacy, Technology, and the Law of the Senate Judiciary Committee held a hearing on platform accountability, and all but one witness urged fundamental reform of Section 230. Lawmakers from both political parties seem convinced that there is too much illegal activity online and that tech companies are partially to blame for not doing enough to curtail it. They would be especially concerned now that the breadth of the immunity claimed by industry under Section 230 has been so clearly revealed in the Gonzalez oral argument. Even before this, Congress acted overwhelmingly in 2018 to hold online companies liable for promoting sex trafficking and to amend Section 230 to make sure they could not use it as a shield against this new liability. The problem is not the will to act but finding a reasonable way toward comprehensive rather than piecemeal reform. 

In the face of court inaction, it seems likely that Congress will turn its attention to reforming Section 230 in order to rearrange legal incentives such that online companies are more effectively encouraged to take reasonable steps to stop illegal activity or speech on their systems. How might that objective best be accomplished?

During the Gonzalez oral argument, Justice Samuel Alito wondered what would go wrong if “Google were potentially liable for posting and refusing to take down videos that it knows are defamatory and false.” This question put the issue of notice liability or distributor liability before the Court. As set out in the Smith case cited above, distributors such as bookstores and libraries can be held liable for the illegal material they disseminate only if they knew or should have known that the material was illegal. But nothing in the Gonzalez argument raised that issue specifically. So, the Court is unlikely to revisit the Zeran court’s judgment on distributor liability. In that case, the Fourth Circuit rejected distributor liability as an interpretation of Section 230, arguing that it would create an incentive to remove material upon notification regardless of the merits of the complaint. This has become a consensus view that the lower courts are unlikely to reconsider without specific instructions in the Supreme Court’s decision. 

In any case, a notice liability system requires careful design to ensure it does not become a mechanism for censorship, and this is beyond the purview and expertise of the courts. As I argued in a Lawfare article two years ago, Congress should step in to devise a balanced notice liability system.

A good model for Congress is the knowledge liability system established under the European Union’s Electronic Commerce Directive of 2000, which was recently reaffirmed and updated in the Digital Services Act (DSA). 

Article 14 of the Electronic Commerce Directive says that an online hosting company is not liable for the information it stores if it “does not have actual knowledge of illegal activity or information” and “is not aware of facts or circumstances from which the illegal activity or information is apparent.” If the company obtains knowledge of illegality, it can still be immune from any liability if it “acts expeditiously to remove or to disable access to the information.” This knowledge standard applies to both copyright violations and other violations of law. The directive preserves the ability of governments to require online companies to terminate or prevent an infringement. 

Europe’s liability system has been in place for over 20 years and has created a workable balance between the needs of users and the public for protection from illegal online activity and the interest of online companies in being free of unreasonable burdens. It has allowed European courts and regulators to enforce the privacy and reputation rights European citizens enjoy. And it has allowed European courts to wrestle with difficult questions concerning whether removal orders targeting specific privacy-invasive or defamatory online material should be worldwide or extended to equivalent material. In the United States, by contrast, such vital questions could not be raised before U.S. courts because of the broad immunity from liability that Section 230 grants. In general, the European Court of Justice is correct when it says that the Electronic Commerce Directive “strikes a balance between the different interests at stake and establishes principles upon which industry agreements and standards can be based.”

After a process of consultation to review the operation and evaluate the effectiveness of this knowledge liability system, the European Union largely reaffirmed the system in its 2022 Digital Services Act. Article 6 of the DSA, for instance, repeats Article 14 of the Electronic Commerce Directive word for word. 

Recital 22 of the DSA provides some updated guidance on how this knowledge standard should be interpreted. It clarifies that social media companies cannot be considered to have obtained actual knowledge of specific illegality “solely on the ground that that provider is aware, in a general sense, of the fact that its service is also used to store illegal content.” Moreover—relevant to the current discussion of recommendations in the Gonzalez case—Recital 22 holds that the DSA knowledge standard is not met merely through a digital platform’s use of an algorithm to recommend specific user content. Recital 22 specifies that the fact that an online company “recommends information on the basis of the profiles or preferences of the recipients of the service is not a sufficient ground for considering that provider to have ‘specific’ knowledge of illegal activities carried out on that platform or of illegal content stored on it.”

The major advance of the DSA is that it defines a notice system in connection with illegal online material. Under Article 16 of the DSA, online companies must develop and maintain “easy to access and user-friendly” mechanisms whereby users can notify the companies that their systems contain “specific items of information” that the users consider to be illegal content. A properly constituted notice must be “sufficiently precise and adequately substantiated.” It must contain a “sufficiently substantiated explanation” for thinking the information in question is illegal, “a clear indication” of the exact electronic location of that information, the name and address of the user who is complaining, except in cases of complaints about sexual abuse of children and child pornography, and a statement confirming the user’s “bona fide belief” in the accuracy and completeness of the notification. 

Such properly constituted notices “give rise” to the “actual knowledge or awareness” that triggers the obligation for a social media company to take action or to lose immunity. This loss of immunity occurs only with respect to “the specific item of information concerned.” Recital 50 clarifies that users should be able to notify companies of multiple specific items of allegedly illegal content through a single notice. The loss of immunity applies only where these notices “allow a diligent provider of hosting services to identify the illegality of the relevant activity or information without a detailed legal examination.”

The European Union did not change underlying liability standards when it adopted its Electronic Commerce Directive and its new notification regime under the DSA. As Recital 17 of the DSA says, the new notification rules “should not be understood to provide a positive basis for establishing when a provider can be held liable, which is for the applicable rules of Union or national law to determine.”

Europe’s notice liability system might be a good place for Congress to start in its search for a replacement for Section 230’s overly broad protections. 

When Justice Alito raised the possibility of distributor liability during the Gonzalez oral argument, the Google attorney acknowledged that such a “situation” was present in Europe and seemed to accept that it had not produced intolerable legal burdens on online companies there. But she also noted that Europe did not have “class actions … plaintiffs’ lawyers … [and] … the tort system.” She stated that in the U.S., notice liability would lead to a “deluge” of complaints and lawsuits, which might be tolerable for Google but would cause the “collapse” of other online companies, such as Yelp. Although she did not mention it, it is likely that the European legal rule requiring the losing party to pay the winner’s court costs also helped to control European litigation against online companies allowed by the Electronic Commerce Directive. In contrast, in the U.S., each party pays its own legal expenses, thus allowing greater access to the courts. 

This possibility of a “deluge” of litigation should lead Congress to look at another model for notice liability: the notice and takedown system for online copyright infringement established in the United States in the Digital Millennium Copyright Act (DMCA) of 1998. This statute holds that an online company shall not be liable for copyright infringement for information stored on its system by a user if it “does not have actual knowledge” that the material is infringing, “is not aware of facts or circumstances from which infringing activity is apparent,” or “upon obtaining such knowledge or awareness, acts expeditiously to remove, or disable access to, the material.” In order to benefit from this immunity, the online service provider must not “receive a financial benefit directly attributable to the infringing activity” and “upon notification of claimed infringement” must respond “expeditiously to remove, or disable access to” the claimed infringing material. 

The DMCA imposes strict conditions on adequate notices, such as identifying the specific work infringed, a statement of good faith, and a statement under penalty of perjury that the complainant is authorized to act for the copyright owner. It further requires the platform to provide an opportunity for the alleged infringer to file a counter-notice. If the alleged infringer does so, the platform must restore the removed material unless the complaining party goes to court within 10 days alleging copyright infringement. 

No one pretends that this is a perfect solution to the problem of online copyright infringement, but a consensus has developed that after almost 25 years of operation, this law has led neither to a deluge of litigation nor to excessive removal of material because of fear of litigation. As the Center for Democracy and Technology put it recently, “[A]lthough individual aspects of the law may offer incomplete solutions against infringement or insufficient protections against abuse of the notice-and-takedown system, it has generally achieved its objective of balancing the burdens and protections for internet users, intermediaries, and rightsholders.” 

To control the risks of excessive litigation or removals in devising a new notice liability system for the United States, Congress should consider how to adapt the protections embodied in the DMCA to a new immunity system that would replace Section 230. Other safeguards could be considered, including liability for wrongful removal, which might allow recourse to courts when perfectly legal material is removed in error. The law could also require platforms to take action against groups that abuse the complaint process, or allow courts to impose costs for damages, including court costs, if they conclude that a plaintiff’s claim is frivolous.

No New Liability

This proposed reform of online immunity would eliminate the language of Section 230 in its entirety and replace it with new language establishing immunity except where companies know about the illegality on their systems. This would have implications for how online cases would be addressed by the courts, the most important of which is that courts would no longer have to wrestle with whether an online company was involved in the publication of other people’s material in considering whether a complaint could go forward to an assessment of its merits. The screen would be knowledge. Did the company know that the user material it distributed or organized was illegal? If it did—either because it was evident from facts or circumstances, or because it had received an adequate notice alleging illegality—then the case proceeds. At that point, nothing would block a motion to dismiss for failure to state a claim, but the special immunity granted to online companies would be gone. 

A notice liability standard does not impose a requirement for companies to take reasonable measures against the illegal material on their systems, even when the companies know about the illegality. It does not even impose a duty to take reasonable steps against illegality as a condition of obtaining immunity, as legal scholars Danielle Citron and Benjamin Wittes have suggested. The companies’ liability depends on the underlying law, and notice liability changes nothing in that regard. The underlying law might forbid third parties from aiding and abetting certain types of illegal conduct on their systems or prevent them from promoting or facilitating that illegal conduct. Or it might be completely silent on the responsibilities of third parties.

In the Gonzalez oral argument, the lawyer for the government argued that such indirect liability laws were few and far between in an attempt to allay concerns that a change in Section 230 would unleash a torrent of litigation. Be that as it may, a notice liability system should change none of the preexisting liability. It would leave it to the courts to determine where and whether it exists. 

After the Gonzalez oral argument, Blake Reid of the University of Colorado Law School noted that Section 230 has created an “interpretive debt.” There has been a deluge of litigation regarding online content over the past 27 years, but it has been diverted into interpreting whether such and such online platform activity is publication of third-party material rather than whether it amounts to a violation of law. 

A notice liability system would allow a healthy evolution of law under existing standards. Rather than a bug, this new round of litigation should be welcomed as a feature. An aggrieved user such as Matthew Herrick—who repeatedly asked Grindr to do something about harassment apparently enabled by the dating app’s communications system, to no avail—would be allowed his moment in court to seek redress. The company would be unable to have the case dismissed on the grounds that someone else was using its system for harassment and physical endangerment. Instead, the company would have to show that its system was not responsible for the problem or argue that no reasonable measures available to it could have prevented the harm.

Conclusion

A new notice liability system inevitably comes with risks. Such a regime would incentivize companies to be more careful about the content they leave up or amplify, and this might lead them to be overly cautious. But right now, the bigger problem is the proliferation of illegal material online and the vast expansion of immunity claimed by online companies to do nothing whatsoever about it, even when the underlying law assigns them a duty to keep their systems free of this material. 

The risks of curtailing worthwhile activities and speech on the internet can be addressed through careful construction of safeguards and protections. A properly designed notice liability system can provide a balanced way to incentivize companies to stop the spread of illegal speech and conduct on their platforms. And it can allow for a healthy evolution of liability law to ascertain the scope and dimension of online responsibility for illegal conduct on their systems. 

In light of the Supreme Court’s likely decision in the Gonzalez case to leave the current Section 230 immunity system intact, Congress should begin the process of examining notice liability as a replacement now.


Mark MacCarthy is nonresident senior fellow at the Brookings Institution, senior fellow at the Institute for Technology Law and Policy at Georgetown Law and adjunct professor in Georgetown’s Communication, Culture & Technology Program.
