Supreme Court Oral Argument Preview: Social Media First Amendment Cases
On Feb. 26, the U.S. Supreme Court will hear two cases that could reshape how social media companies conduct content moderation and alter First Amendment jurisprudence. The consolidated cases, NetChoice v. Paxton and Moody v. NetChoice, assess the constitutionality of two state laws, one in Texas and the other in Florida, that limit social media companies’ ability to conduct content moderation. The states claim that social media companies are discriminating against conservatives, violating those users’ First Amendment rights. The companies claim such restrictions violate their own First Amendment rights because they amount to compelled speech and state censorship.
The Cases
The Florida Law: Moody v. NetChoice
In 2021, the Florida legislature passed Senate Bill 7072 (SB 7072), which addresses the content moderation practices of prominent social media platforms. The law prohibits platforms with more than 100 million users per month or an annual gross revenue of more than $100 million from removing or moderating the content of “journalistic enterprise[s]” based on the content of their posts, and it also prohibits platforms from applying “post-prioritization” or “shadow banning” algorithms to content posted “by or about” a known political candidate. Censorship is defined broadly to include, among other things, appending notes to posts, content removal, and “shadow banning”—described in the bill as an “action by a social media platform, through any means, whether the action is determined by a natural person or an algorithm, to limit or eliminate the exposure of a user or content or material posted by a user to other users of the social media platform.”
In addition, the law requires social media platforms to apply their “censorship, deplatforming, and shadow banning” standards consistently across all users and to explain to each affected user why their content was moderated. Under the statute, the companies must publish their moderation standards and explain how those standards are applied. Platforms would also be required to allow users to opt out of post-prioritization and shadow ban algorithms so that the platform displays content sequentially or chronologically for that user.
NetChoice—a trade group acting on behalf of social media platforms and other online entities—filed a facial challenge to the statute and asked for a preliminary injunction while the case was being litigated. The U.S. District Court for the Northern District of Florida granted the injunction, and a U.S. Court of Appeals for the Eleventh Circuit panel largely affirmed it, holding that the statute’s content moderation restrictions likely violate the First Amendment.
The Eleventh Circuit determined that the First Amendment right of a private platform to control how and what type of third-party-generated content it hosts applied in this context. The court relied—for this First Amendment protection—on the Supreme Court’s ruling in Miami Herald Publishing Company v. Tornillo, which held unconstitutional a statute requiring newspapers to publish replies to editorials because the law interfered with the newspapers’ First Amendment right to editorial discretion. The Court reasoned that newspapers are unlike broadcasters, which function as “common carriers” and in whose neutrality the government therefore has a special interest. The “common carrier” label is a common law legal status for businesses, such as internet service providers, that serve a critical public function.
Distinguishing social media platforms from common carriers, the Eleventh Circuit held unconstitutional the provisions restricting the platforms’ content moderation practices. Common law courts have traditionally held that governments maintain a significant interest in anti-discrimination regulation of common carriers, and governments have therefore historically regulated carriers substantially—especially to ensure equal access to a carrier’s services. The Eleventh Circuit panel rejected Florida’s argument that social media companies should be considered common carriers. The judges found that, unlike other public services where users do not expect discrimination in the provision of services, platforms explicitly inform users from the outset that they will, exercising their First Amendment rights, discriminate against certain types of content. The panel also concluded that labeling the platforms as common carriers would not necessarily override the relevant First Amendment rights.
Finally, when analyzing the law’s individual disclosure requirements mandating that platforms thoroughly explain moderation decisions to users who had their content removed, the Eleventh Circuit based its reasoning on a test constructed by the Supreme Court in Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio. Zauderer held that governments could require advertisers to include disclosures in their advertisements to prevent advertisers from misleading consumers, and the Court created a test for determining when the government may compel such disclosure. The test permits states to compel commercial speech that is “purely factual and uncontroversial information” so long as the requirement is not “unduly burdensome” such that it would “chill[] protected commercial speech.” In previous cases, the Supreme Court has recognized commercial speech as low-value speech that receives less protection under the First Amendment. Although the panel noted that “it is not substantially likely that the [law’s other] disclosure provisions are unconstitutional,” the panel agreed with NetChoice that the individualized explanation requirements were “‘practically impossible to satisfy’” and were, therefore, “unduly burdensome and likely to chill platforms’ protected speech.”
The Texas Law: NetChoice v. Paxton
In September 2021, Gov. Greg Abbott signed Texas House Bill 20 (HB 20) into law. The bill prohibits social media platforms with more than 50 million active users in the U.S. per month from taking down, demonetizing, or otherwise moderating content “based on … the viewpoint of the user or another person,” “the viewpoint represented in the user’s expression,” or the user’s geographic location. Under the law, the state attorney general or Texas social media users whose posts have been impermissibly moderated can file suit for injunctive or declaratory relief and compensation for attorney’s fees. Like the Florida law, the Texas law requires that companies disclose their content moderation and promotion policies and provide individualized explanations for content removal. Additionally, the Texas statute requires platforms to publish a biannual transparency report and maintain a complaint and appeal system—two obligations not mandated by the Florida law.
NetChoice filed suit in the U.S. District Court for the Western District of Texas to block the law’s implementation and sought a preliminary injunction, just as the trade group did in Florida. The district court agreed with the trade group and enjoined the law. However, a divided Fifth Circuit Court of Appeals stayed the injunction without explanation, allowing the law to take effect while the appeal proceeded. In a 5-4 decision, the Supreme Court vacated the Fifth Circuit’s stay of the district court’s injunction. Ultimately, the Fifth Circuit panel ruled that the Texas law did not violate the First Amendment, determining that the law “does not chill speech; instead, it chills censorship.” After the Fifth Circuit’s ruling, the trade group petitioned the Supreme Court for review.
The Fifth Circuit asserted that censorship by social media companies is not a form of speech and does not fall under the First Amendment. The court cited several cases in support of this position, including Pruneyard Shopping Center v. Robins. In that case, the Supreme Court held that California could permissibly require a shopping mall to allow individuals to distribute pamphlets inside the premises—reasoning that a business establishment that holds itself open to the public has no “expressive” interest in who is allowed to speak in the establishment. The Fifth Circuit determined that platforms are establishments like malls that “h[old] themselves out to serve the public equally” and, therefore, have no expressive interest in censoring speech.
The Fifth Circuit noted in its decision that the First Amendment protects the “‘right to refrain from speaking,’” and publishers cannot be compelled to publish specific articles or viewpoints. Yet the Fifth Circuit did not recognize “editorial discretion”—the right of private organizations to control the dissemination of third-party content—as an independent right. Instead, the Fifth Circuit explained that such discretion arises only where a law compels or restricts the speech of the private party itself—whereas the Texas law concerns not platforms’ own speech, but how platforms treat users' speech.
The panel concluded that social media companies are common carriers and should be treated as such. The judges reasoned that since the platforms are ubiquitous communications providers that hold themselves out to the public, state governments can pass laws to ensure nondiscrimination in the provision of services.
Also applying Zauderer, the Fifth Circuit came to the opposite conclusion from the Eleventh Circuit regarding the state’s ability to compel speech. The Fifth Circuit argued that Zauderer prohibits only regulation that would ultimately chill speech; because the court considered censorship to be conduct and not speech, it saw no threat that the disclosure requirements would chill speech. The Fifth Circuit also noted that many companies already comply with this requirement, demonstrating that compliance would not be a “heavy burden.” Finally, the Fifth Circuit addressed the platforms’ arguments that Zauderer should not apply because content moderation decisions were not “‘purely factual and uncontroversial information’” but editorial decisions. Once again, the Fifth Circuit refused to equate moderation with editorial decisions and applied Zauderer.
Notably, in an apparent attempt to partially harmonize the two rulings, the Fifth Circuit distinguished HB 20 from the Florida law, stating that the Florida law “prohibits all censorship of some speakers, while HB 20 prohibits some censorship of all speakers.” Florida’s SB 7072 prohibits any censorship of posts made by journalistic enterprises, including censorship that involves explicit speech from the platform, such as appending warnings, which would be permitted under the Texas law. Yet the Fifth Circuit explicitly departed from the Eleventh Circuit on the questions of whether editorial discretion is an independent right, whether the compulsion of individualized explanations passes the Zauderer test and is therefore permissible, and whether the common carrier doctrine applies to social media companies.
The Briefs
The Justice Department’s Amicus Brief
Before granting certiorari in both cases, the Supreme Court requested a brief from the solicitor general stating the government’s position. In that amicus brief, the Justice Department recommended the Supreme Court take the cases but cabin its review to the constitutionality of the provisions that (a) limit platforms’ ability to moderate content and (b) require platforms to provide individualized explanations when restricting specific content. The technology companies had also challenged other disclosure requirements mandated by both laws and asserted that the state legislatures specifically targeted the companies for their political viewpoints. However, the solicitor general noted that both courts of appeals had rejected those arguments, and it was therefore unnecessary for the Supreme Court to review them.
The solicitor general also encouraged the Supreme Court to affirm the Eleventh Circuit and reverse the Fifth Circuit, siding with the social media companies. The Justice Department argued that blocking and removing content is protected First Amendment activity, that this moderation is part of the platforms’ editorial discretion, and that governments do not have a legitimate interest in securing public access to private platforms. Additionally, the solicitor general contended that because Zauderer permits the government to compel only speech that does not impose a heavy burden, and the requirement for individualized explanations creates such a burden, that obligation amounts to impermissible compelled speech. The explanations are a “heavy burden,” the solicitor general asserted, because “[t]he targeted platforms remove millions of posts per day” and, despite their existing voluntary disclosures, would need to expand their appeals processes “100-fold” to comply with the statutes’ disclosure requirements.
Following the government’s filing, the Supreme Court granted certiorari, accepting the solicitor general’s request to limit the questions presented. On Jan. 22, the solicitor general filed a motion to participate in oral argument as amicus curiae supporting the platforms, which the Court granted.
The States’ Briefs
Florida’s and Texas’s briefs overlap significantly and present similar themes. Both states argued that editorial discretion does not apply to social media platforms, and that the laws do not regulate inherently expressive conduct because platforms act (or at least portray themselves) as a digital public square. On this view, any moderation is akin not to speech or expression, but only to maintaining that public online space.
The states relied on two Supreme Court decisions to support this characterization of moderation. First, citing Pruneyard Shopping Center, the states equated social media platforms to malls: Just as a mall has no expressive interest in determining what kind of flyers are distributed on its premises, they argued, platforms have no expressive interest in censoring certain types of speech. The second case the states relied on, Rumsfeld v. Forum for Academic and Institutional Rights (FAIR), upheld a law requiring schools that accept federal funding to permit military recruiters on campus. The states reasoned that since FAIR held that hosting speech was non-expressive conduct (which meant it could be regulated), the government could similarly compel social media companies to host speech on their platforms. The states contended, quoting Florida’s brief, that compelling the platforms to host content maintains the core First Amendment value of an “‘uninhibited marketplace of ideas.’”
The states asserted that social media platforms should be characterized as common carriers, which, Florida and Texas wrote, would allow the states to impose on the platforms the requirements contained in SB 7072 and HB 20. The states argued that courts and Congress have generally classified as common carriers entities that disseminate or facilitate others’ speech, including telephone companies and internet service providers. And courts have recognized that governments have a particular interest in regulating common carriers and enforcing anti-discrimination laws—as carriers play a unique and essential role in society by providing travel, communication, and other services to the general public, usually with a license from the government.
Addressing the individual disclosure provisions last, the states contended that the provisions are constitutional because the laws compel disclosure of only factual and uncontroversial information about a platform’s terms of service. The states argued that the laws’ disclosure requirements are not unduly burdensome, as Zauderer requires. The states asserted that the companies already voluntarily provide explanations for their moderation decisions, so any platform not already compliant would need to alter its current system only slightly. Relying on Zauderer, Florida argued that the platforms have only minimal First Amendment interests in withholding moderation information, and the state has a legitimate governmental interest in requiring the disclosure. Florida also encouraged the Supreme Court to reconsider the Eleventh Circuit’s decision regarding the individual disclosure provisions given the minimal lower court record on this point. And Texas asserted that Zauderer’s “heavy burden” standard refers only to the burden on speech, not to operational or administrative burdens.
With respect to the level of “scrutiny” the Court should apply to these laws, Florida argued that even if the Supreme Court decides to apply intermediate scrutiny when conducting its analysis, its law passes muster because it is content-neutral and serves the important interest of maintaining public access to the modern public square. Texas made a similar argument regarding intermediate scrutiny and added that its law could even survive strict scrutiny, stating that it was the least restrictive way of accomplishing the compelling government interest of ensuring public access to information.
Briefs From Social Media Companies
In their briefs responding to Texas and Florida, the platforms argued that both laws infringe on their First Amendment rights by preventing them from exercising editorial discretion—which they view as protected expressive conduct—in deciding when and how to disseminate speech. The platforms asserted that the laws would prevent them from removing problematic content, such as terrorist propaganda.
The companies highlighted several Supreme Court rulings in defense of their position. The companies relied most substantially on Tornillo and compared the platforms to newspaper editors deciding what content to publish or host. The platforms argued that just as the Supreme Court determined that the government cannot compel newspapers to publish replies authored by third parties, the governments cannot compel platforms to host third-party-generated content. The companies also cited other cases—PG&E v. Public Utilities Commission and Hurley v. Irish-American Gay, Lesbian, and Bisexual Group of Boston—in which the Supreme Court struck down laws that required private organizations, including ones that provide public services, to host the speech of third parties. The platforms asserted that forcing companies to host or associate with speech they believe is harmful would violate a fundamental First Amendment protection.
The companies also disagreed with the states’ argument that the platforms are common carriers, noting that the platforms, as stated in the brief regarding the Texas law, “do not ‘hold themselves out as affording neutral, indiscriminate access to their platform without any editorial filtering.’” They also argued that even if the platforms were common carriers, the laws would be unconstitutional—quoting Justice Clarence Thomas, who wrote in an earlier case that labeling a regulation “a common carrier scheme has no real First Amendment consequences.”
Regarding the individualized explanations to users who had their content removed, the companies distinguished Zauderer by noting that the compelled speech permitted by that ruling is limited to advertisements. In the social media context, by contrast, the companies argued that such explanations would be heavily burdensome, as social media companies would need to expand their processes significantly to comply with the law.
Although the platforms argued in their briefs that the Supreme Court should apply strict scrutiny, they also contended that the Court should strike down the laws even under intermediate scrutiny, as the statutes would impose a weighty burden with many unintended consequences and are therefore not narrowly drawn to serve a substantial state interest.
Amicus Briefs
Dozens of individuals and organizations filed amicus briefs in these cases, with some advising the Supreme Court on specific facets of the cases and others attempting to provide a comprehensive review of the issues and recommendations for how the Court should rule. Some of the most salient themes across the briefs include the balance between the necessity of content moderation for the operational and financial success of the platforms and the unjustified censorship of viewpoints, ideas, and people; the societal role of social media platforms and their similarities to and differences from common carriers; and the threat of hateful and violent online content.
Necessity of Content Moderation Versus Unjustified Censorship
The amici presented a range of arguments on whether the content moderation conducted by the social media platforms is necessary editorial discretion or a form of unjustified censorship. Several potentially affected platforms argued in their briefs that the ability to moderate content is critical to their companies’ ability to function. Yelp, for example, wrote that it relies on content moderation to maintain the reliability of the reviews posted to its platform. Reddit and subreddit moderators stated that their ability to moderate content supports the First Amendment rights of all their users, as well as the platform’s own rights, by allowing individuals to freely build communities and associate online as they see fit, which includes removing unwanted content. In its brief, the Internet Society, a nonprofit organization, contended that maintaining the ability to moderate can encourage speech, given that moderation and guidelines make users feel safe. Many other amici also pointed out that allowing content moderation supports freedom of association: Platforms like Reddit and Discord encourage smaller communities to form their own standards and guidelines, and moderation is critical for fostering those associations. Together, these amici asserted that a ruling in favor of Texas and Florida could undermine not only the First Amendment rights and operations of the platforms but also the rights of the platforms’ users.
Amici in support of the states’ position contended that the laws are intended to protect users’ First Amendment rights and that the content moderation conducted by the platforms is not expression or speech, but conduct unprotected by the First Amendment. These briefs backed the Fifth Circuit’s position that prohibiting censorship will cultivate the marketplace of ideas rather than stifle it. The conservative satirical news website the Babylon Bee and its affiliate Not the Bee argued that the major platforms targeted their websites by taking down their humorous or newsworthy articles and posts while leaving up equivalent content from other, liberal sources. This selective moderation and censorship of its content, the Babylon Bee asserted, was conducted in bad faith and caused serious harm to online discourse.
Platforms as Common Carriers, Malls, or Newspapers
Many amici supporting the states, including a brief signed by almost two dozen Republican state attorneys general and the speaker of the Arizona House of Representatives, focused on applying common carrier doctrine to large social media platforms. These briefs maintained that social media platforms hold themselves out to all comers, which makes them common carriers subject to nondiscrimination laws; states can therefore regulate these platforms as they do other common carriers. Some amici argued that because these laws maintain the neutrality of institutions vital to the public that act like modern broadcasters, laws prohibiting censorship on social media are not only compatible with the First Amendment but also support its function. Mandated nondiscrimination, according to these briefs, would ensure that no user’s First Amendment rights are infringed.
Other amicus briefs, authored by think tanks, nonprofit organizations, and others, challenged whether platforms should be designated as common carriers subject to common carrier rules. According to these amici, social media platforms are not merely conduits of information; their function is fundamentally expressive and distinct from that of other common carriers. These amici further argued that even if the common carrier designation were applied, it would not change the First Amendment implications of the laws at issue, as the laws should still face at least intermediate First Amendment scrutiny. As one brief put it, invoking common carrier rules cannot be used as an end-run around the First Amendment.
When it comes to applying Pruneyard or Tornillo, some amici recommended that the Supreme Court define platforms as publishers similar to newspapers and apply Tornillo, giving the platforms a First Amendment right to moderate content as part of their editorial discretion. Other amici noted that platforms generally review content only after a user generates and attempts to publish it, an editorial process meaningfully different from a newspaper’s; these amici contended that the Supreme Court should instead apply Pruneyard and treat platforms like malls, areas that are held open for the public to congregate. Still other amici encouraged the Supreme Court to overrule Pruneyard altogether, arguing that even if platforms are more similar to malls, the Supreme Court’s ruling in Pruneyard is not consistent with conventional and historical understandings of the First Amendment.
Online Speech to Real-World Violence
Several amicus briefs expressed concern about the dangerous practical consequences of significant restrictions on moderation. These briefs asserted that content moderation conducted by the platforms helps curb real-world violence that flows from hateful or violent content online. One amicus brief filed by a group of national security experts (including former government officials) argued that because government actors are more limited than private actors by the First Amendment in their ability to counter disinformation and propaganda online, “[i]t is vitally important to our national security that social media platforms continue to make efforts to moderate extremist content, disinformation, and propaganda.” The First Amendment, according to this brief, is intended as a limitation on the government’s ability to infringe on the speech of private actors; allowing private actors to conduct content moderation is therefore critical to combating malign actors operating in privately owned online spaces. These amici contended that the platforms should be given sufficient latitude to use their technical expertise to fulfill this role effectively.
The Impacts
If the Court fully accepts the states’ arguments, the federal government and state governments would gain much broader authority to regulate platforms’ moderation. Laws similar to the two at issue here would either not implicate the First Amendment or be subject only to intermediate scrutiny by courts. Meanwhile, if the Court fully accepts NetChoice’s arguments, it would affirmatively establish that platforms’ moderation is expressive conduct that qualifies as editorial discretion—like that of newspapers—protected by the First Amendment.
The Court may also accept some combination of the states’ and NetChoice’s arguments. For example, the Court may hold that the government can regulate a platform’s content moderation practices but that the laws in question are not sufficiently tailored to accomplish the government's interests. The Court may also hold that social media platforms should be considered common carriers, but that defining platforms as such does not strip platforms of First Amendment rights. Any combination of these outcomes is still very likely to shift how social media platforms can and choose to moderate content and what kinds of regulations governments may enact.