
Privacy Protections of the Stored Communications Act Gutted by California Court

Stephanie Pell, Richard Salgado
Wednesday, August 21, 2024, 2:42 PM
A California court of appeal has eviscerated statutory privacy protections that prevent providers from disclosing the content of user communications.
Data privacy. Dec. 17, 2014. (Blue Coat Photos, https://commons.wikimedia.org/wiki/File:DataPrivacy.jpg, CC BY-SA 2.0)


On July 23, the California Court of Appeal for the Fourth District issued a whopper of a decision that looks to upset decades-long understandings of how users’ data is protected from disclosure by providers under the Stored Communications Act (SCA). It eviscerates the SCA’s prohibitions that generally prevent communication platforms from disclosing user communications and other content, including by selling the content or, in some circumstances, providing it to governmental entities without a search warrant. It also overrides the considered, specific exceptions to the prohibitions that Congress crafted, a change that will have ripple effects globally. While the appellate court takes solace in its belief that there are other mechanisms that might help fill the privacy-protection void it created, the decision diminishes the original comprehensive coverage of the SCA to a shadow of what Congress intended.

The companies directly involved in the case, Snap and Meta, are seeking review by the California Supreme Court of the sweeping decision. The California Supreme Court should exercise its discretion to review the case and reverse the court of appeal. The decision should also serve as a clarion call to Congress to update the SCA, a foresighted statute when first enacted in 1986, to account for the modern communications services on which the world relies, address any gaps or deficiencies, and buttress the statute for durability over the next 40 years.

The Case

The underlying case involves the criminal prosecution of Adrian Pina, who was indicted by the San Diego County district attorney (DA) for murdering his brother, Samuel. Pina is also charged with the attempted murder of another man, as well as possession of a firearm by a felon. He is currently awaiting trial. Defense counsel for Pina sent a subpoena to Snap, the corporation that operates Snapchat, and Meta, the corporation that operates Facebook and Instagram, “seeking social media posts and other communications made by Samuel on those platforms in the two years prior to his death.” Pina’s counsel asserted that “Samuel’s social media accounts might contain relevant evidence to support Pina’s defense.” The court agreed and indeed found that there was “probable cause.” (Presumably the court meant there was probable cause to believe there was exculpatory evidence in Samuel’s accounts.)

Both Snap and Meta objected and pointed out that the SCA prohibited them from disclosing the content of the victim’s communications to the defendant unless one of the statutory exceptions applied. None did. They also indicated that the government may be able to acquire the information through a search warrant.

Seemingly hellbent on trying out a novel theory first assembled in California courts years earlier, the court of appeal had other ideas. Before diving into that, it’s important to understand the relevant parts of the SCA.

A (Mostly) Painless SCA Primer

The Purposes of the SCA

The SCA was passed in 1986 as part of the Electronic Communications Privacy Act, with the goal of establishing privacy protections for electronic communications held by providers, the disclosure of which was “largely unregulated and unrestricted” by law that was “hopelessly out of date” at the time. Congress recognized that this void presented many serious problems: (1) it unnecessarily discouraged potential customers from using innovative communications systems; (2) it encouraged unauthorized users to obtain access to communications to which they were not a party; (3) it could disincentivize companies from developing new and innovative forms of telecommunications and computer technology; and, “most importantly,” (4) it would promote the gradual erosion of the “precious right” to privacy.

To achieve these goals, this remarkably forward-looking 1986 statute did its best to define broadly the types of providers and data covered with the laudable intent of being future-proof.

Providers Covered by the SCA

The SCA recognized two categories of service providers: electronic communications service (ECS) providers and remote computing service (RCS) providers. ECS providers are those that provide services for users to send or receive electronic communications. RCS providers are those that provide to the public computer storage or processing services by means of an electronic communications system. These distinctions probably made more sense in 1986, when the world didn’t have nearly the range of rich services it does today. Those of a certain age will likely remember the relatively harsh days of email services built on POP3 (Post Office Protocol 3), which required users to log in and download the email they wanted to read and keep within, say, 30 days of delivery to the account, or it would be deleted. Unlike now, when users expect to store and access email for time and eternity, leaving anything on a provider’s server meant almost certain destruction as the clock ticked; it was not a place to rely on for safekeeping or later access.
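For readers who want a concrete picture of that older model, here is a minimal sketch of the classic POP3 “download, then delete” workflow, written in Python with its standard-library poplib client. The server name and credentials are placeholders, not real accounts; this is an illustration of the pattern described above, under which the provider’s server was a waypoint rather than a long-term archive.

    import poplib

    # Placeholder host and credentials, for illustration only.
    server = poplib.POP3_SSL("pop.example.com")
    server.user("user@example.com")
    server.pass_("not-a-real-password")

    count, _ = server.stat()  # number of messages waiting on the server
    for i in range(1, count + 1):
        _, lines, _ = server.retr(i)       # download the message to the client
        raw_message = b"\r\n".join(lines)  # reassemble it locally
        # ... write raw_message to local storage on the user's machine ...
        server.dele(i)                     # mark it for deletion on the server

    server.quit()  # deletions take effect; the provider keeps no copy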

Data Protected by the SCA

The statute categorizes data into three main types. This case is about the first category, but we describe each for completeness.

  • First, there’s stored electronic communications content. “Content” can include a lot: the body, subject lines, and vanity names in email, a text message, a photo, a video, a document, search queries, prompts for artificial intelligence (AI) text or image generation, and more. For a company, university, or governmental entity that uses cloud service providers, content would include all of that information for its users, plus the data uploaded to a cloud instance to, for example, run its operations. In today’s world, there is a lot of content in the hands of service providers, and as organizations look to rely on AI and use their data to train models, that amount is only going to increase.
  • Second, there’s subscriber identifying information, which is broken into six types such as name, addresses, credit card number, and so forth. SCA practitioners call this basic subscriber information (or “BSI” for those wanting instant cred by dropping an acronym).
  • Third, there’s a catch-all category for any other records that are neither content nor one of the six user-identifying information types. Practitioners of the SCA call this “transactional data,” while the rest of the world calls it “metadata.”

What the SCA Does

As relevant here, the SCA does three things, all to protect user privacy. This case is about the second of these, but it has serious knock-on effects for the other two.

  • First, it criminalizes unauthorized access to communications systems to obtain communications content that is in electronic storage.
  • Second, and critically here, it restricts when ECS and RCS providers are permitted to disclose the types of data covered by the SCA.
  • Third, the SCA sets out the rules that governmental entities in the United States (such as a state police department, sheriff, or the FBI) must follow to compel an ECS or RCS provider to disclose information.

Although this case arises directly out of the second SCA function, protecting user data from disclosure by providers, it has implications for the others as well. We’ll address all three, but let’s start with the ruling as it relates to the prohibition on providers disclosing the content of user communications.

The Prohibition on Content Disclosure and the Trial Court’s “Defense Search Warrant”

At its heart, the SCA is a federal privacy statute. Simply stated, the SCA generally prohibits providers from disclosing the content of user communications. There are some specific statutory exceptions, none of which is at issue in this decision. Congress’s intention for the prohibition to stand strong, absent the application of one of the explicitly listed exceptions, is made clear, however, by two exceptions that one might have thought were inherent: one that allows the provider to disclose communications to an addressee or intended recipient, and another allowing the provider to disclose communications when doing so is necessary to provide the service. By being so comprehensive, Congress clearly intended its list to be exhaustive with no room for implied or manufactured exceptions.

The statute’s prohibition on disclosure of stored content applies to ECS and RCS providers in slightly different ways. A user’s content held by an ECS provider is protected from disclosure by the provider if it is in “electronic storage.” One would be forgiven for thinking that content is in electronic storage if it is stored electronically. As with so many surveillance statutes, however, a lot of important detail is packed into the definitions of terms, stripping them of their colloquial meaning. “Electronic storage” means “any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof” and “any storage of such communication by an [ECS] for purposes of backup protection.” The “backup protection” clause is the most important here, as we discuss later.

An RCS provider is forbidden from disclosing the contents “carried or maintained on that service” “solely for the purpose of providing storage or computer processing services to such subscriber or customer, if the provider is not authorized to access the contents of any such communications for purposes of providing any services other than storage or computer processing.”

These prohibitions became relevant in Pina’s prosecution when the defendant, through counsel, issued subpoenas to Snap and Meta seeking two years of the alleged murder victim’s communications. The defense intends to argue that the defendant acted in self-defense and would seek to prove his brother’s violent propensity through content in his brother’s stored communications. When presented with the subpoenas, both Snap and Meta pointed to these prohibitions on disclosure of content. Both providers asserted that the services at issue fell within the statutory definitions of ECS and RCS, and the service of a defense subpoena triggered no exception to the user’s content privacy protections.

The trial court took a creative, if messy, approach to resolve the issue. It held that the public defender’s office was a “governmental entity” under the SCA. Then, in what must have been the court’s attempt to give defense counsel the equivalent of a search warrant, the court found there was “probable cause” that the content sought was relevant. Based on its conclusion that this meant all the SCA and other issues “fall by the wayside,” the court then ordered Snap and Meta to produce the content to the trial court, which would undertake a relevance review and share with the defense only the content the judge deemed relevant. Both providers asked the court of appeal to review the order, and the court agreed to hear the case.

Before we go on to talk about the appellate ruling and reasoning, we should quickly note that, for its part, the DA could have ended the entire matter, including the due process and fair trial constitutional issues raised by the defense, very quickly and very early on. The DA had the power to seek a search warrant, the issuance of which would have triggered an exception to the SCA disclosure prohibition. Indeed, the court’s finding that there was “probable cause” leaves no doubt that the judge would have issued a search warrant had the DA filed an application. The DA chose not to do so. It seems that the trial court, unwilling or unable to force the government to apply for a warrant to seek potentially helpful or exculpatory evidence, created its own version of a probable cause-based order for the defense, believing this addressed the statutory and constitutional issues.

The Court of Appeal’s “Business Purpose Theory”

The court of appeal acknowledged the prohibition in the SCA and held that the “probable cause” approach taken by the trial court was incorrect. The court of appeal instead took a very different approach, focused on what a user allows a provider to do with their content.

Using the shorthand label “business purpose theory” for its approach, the court concluded that if a provider has permission to access content (for example, to scan for malware or child sexual abuse material, to fund the service through advertising, or to improve the product) and its motive for obtaining that permission includes considerations of profit, the user loses the statutory nondisclosure protections for that content. The appellate court’s logic is only slightly different for content held by an ECS than for content held by an RCS.

ECS, Electronic Storage, and Backup Protection

With respect to ECS, the court recognized that Meta and Snap “store the content of their user’s communications incidentally to transmission and for purposes of backup” in accordance with the SCA’s definition of “electronic storage” but then indicated that the two companies “also maintain that content for their own business purposes.” The court concluded that this “dual purpose brings the content outside the SCA’s plain definition of ECS provider because the content is held and used by Snap and Meta for their own profit-driven purposes.”

No other court has held anything like this. Indeed, both the Ninth Circuit and the Fourth Circuit have held that a provider’s storage of data so the user can access it later is a form of “backup protection,” meaning the content remains in “electronic storage” and the provider continues to operate as an ECS with regard to that content. Further, nowhere in its definition of “electronic storage” or in the disclosure prohibition language (or anywhere else, for that matter) does the statute say that the “purposes of backup protection” clause excludes any other purpose, or that the backup must be for the provider’s purposes rather than the user’s.

Perhaps most important, why the court saw a “dual purpose” at all is perplexing. Providing an electronic communications service can include efforts to make it more secure, to make it safer, and to improve it over time. There is also no reason to think that users of a service supported by advertising, rather than by a subscription fee or some other revenue model, should lose statutory privacy protections. It’s all part of the service, not some separate endeavor.

Similarly, the appellate court failed to explain how any business that offers a modern online platform can avoid triggering the court’s newly crafted “dual profit-driven purpose” exception to the definition of ECS and the disclosure prohibition. Nearly every act taken by a for-profit business could be described as having in part a “profit-driven purpose.” The decision simmers with implicit disdain for the business models employed by modern communications platforms and services, but that does not comport with the statute or the legislative intent. When drafting the SCA, Congress clearly understood that the services would be offered by the private sector, and indeed part of the rationale for extending the protections was to make sure users could trust the providers, thereby building the commercial market for services and encouraging private-sector innovation. Congress never provided or indicated that content is to fall outside the definition of “electronic storage” or the disclosure prohibition based on the commercial versus nonprofit status of the provider, nor did Congress dictate what a commercial model should look like.

RCS, Content Storage, and Processing

Having thus dispatched the prohibition based on the ECS provisions, the court of appeal next took on the disclosure prohibition applicable to content held by an RCS (that is, computer storage and processing services offered to the public). The SCA provides that the prohibition applies to content carried or maintained by the provider “solely for the purpose of providing storage or computer processing services to such subscriber or customer, if the provider is not authorized to access the contents of any such communications for purposes of providing any services other than storage or computer processing.”

The court of appeal held that the permissions that users gave Snap and Meta to access the content were not “solely” for providing the RCS services but included profit-motivated purposes. Thus, the court of appeal concluded, the RCS prohibitions no longer apply to Meta or Snap (or arguably any other commercial platform that is authorized by users to access content). The decision doesn’t explain why these profit-driven purposes cannot also be part of the storage and processing services, or why having a commercial motive disqualifies a service from being a storage or processing service.

The court of appeal also did not explain why a provider “provid[es a] service other than storage or computer processing” when it analyzes content stored and maintained on the system in order to present relevant ads, look for illegal content, or identify security risks. These aren’t services “other than storage or processing”; they are just other forms of “processing” under the statute, which itself is agnostic about the purpose of processing. 

The court also struggled a bit with the restrictive conditional clause. That clause provides that the “solely for the purposes of” condition applies only “if the provider is not authorized to access the contents of any such communications for purposes of providing any services other than storage or computer processing.” In other words, if the user has given access permission to the provider allowing the provider to use the data for non-RCS purposes, the content retains the disclosure protection.

What Hath the Court of Appeal Wrought

If adopted broadly, this novel “business purpose theory” has monumental implications for the privacy of the data held by providers and for the vitality of the SCA.

First, the ruling could strip the protection of the SCA’s nondisclosure provision from enormous swaths of data that billions of users have entrusted to providers, and it could render nearly meaningless the carefully defined exceptions that Congress set out. Providers, unshackled from the SCA’s prohibitions on disclosure of content, would be able to share the data without fear of running afoul of the statute. This includes selling the content of communications to data brokers, or handing it over to governments, other companies, or anyone, really. There may well be other sources of restriction under a patchwork of other (sometimes conflicting) laws, but the decision undoes the comprehensive approach Congress set out nearly four decades ago.

Under the sweeping rationale of the decision, this newfound freedom to act promiscuously without SCA liability would not be bounded by a responsibly scoped set of access permissions. For example, consider a commercial email service provider that secures the permission of its users to scan stored email content and attachments for malware or for instances of known child sexual abuse material. Under this decision, the provider is not acting as an ECS with regard to the scanned data, since the content is not stored exclusively for backup purposes (and thus is not in “electronic storage”). And although the data is held by an RCS, the RCS disclosure prohibition would not apply, since the content is not maintained “solely” for storage and processing. Thus, the data escapes the protective walls of the SCA disclosure prohibition entirely, not just with respect to the reasonable permissions granted by the user.

Second, the ruling exposes data to compelled disclosure, not just to criminal defendants in cases such as this, but to any litigant. It’s not hard to imagine the floodgates opening as litigants seek private communications content from providers in divorce, breach of contract, defamation, and any other conceivable dispute. And this could include the content of users who are not parties to the litigation at all. A litigant could circumvent the current approach, under which they must go to the user directly, giving the user an opportunity to assert objections, seek protective orders, and otherwise control discovery of their personal communications as if they were held in their own house. Further, courts could issue gag orders preventing the provider from notifying the users even after the fact, at least for some period of time. In short, providers could be converted into litigation discovery platforms.

Third, the ruling guts another type of SCA protection for communications held in electronic storage: the provisions in Section 2701 that criminalize unauthorized access to content that is in “electronic storage” on an ECS provider’s network. The decision removes a huge percentage of content from the definition of “electronic storage.” A hacker who compromises a system and takes all the email, documents, photos, spreadsheets, and other files stored by the user would no longer need to fear criminal liability under the SCA. In theory, the hacker could still be charged under other, less tailored statutes with different elements to prove, such as the Computer Fraud and Abuse Act, but the specific statute that Congress intended to apply would no longer be available.

Fourth, the ruling could massively reduce the statutory protection the SCA provides users when their data is sought by governmental entities from around the world, including in the United States. The disclosure prohibition acts as a “blocking statute,” through which Congress prohibits disclosure of data to governmental entities except under specific procedures and conditions. The decision contracts the scope of data protected to a far smaller set of communications, opening everything else up to disclosure to all sorts of governmental bodies globally. Of course, the court of appeal ruling does not and could not alter Fourth Amendment protections that users have that could require governmental entities in the United States to secure a search warrant (or similar protections that may exist in state law, like the California Electronic Communications Privacy Act). But the decision undoes the careful work of Congress to require by statute that a warrant be obtained by any governmental entity in the United States (federal, state, or local) to compel a covered provider to disclose communications content, and it creates a Gordian knot of jurisdictional complexity.

Fifth, by reducing the classes of content subject to the disclosure prohibition, the decision undermines the advancement of significant public policy goals that Congress intended to promote through the carefully drafted and specifically enumerated exceptions to the prohibition. This is a big deal, and it impacts the privacy of people worldwide who use U.S. provider services.

For example, in one exception, Congress sought to advance the lofty goals of encouraging the global free flow of information and the adoption by other countries of surveillance laws and practices that are respectful of human rights and the rule of law. Congress did this in the CLOUD Act, by establishing a framework through which the U.S. government could enter into executive agreements with another country that meets minimum standards on human rights and the rule of law. Once a CLOUD Act agreement is reached between the U.S. government and another country, the exception to the SCA would kick in to allow, but not require, U.S. companies to disclose content in response to legal process from government agencies in that country investigating serious crimes. The statute was carefully constructed to continue to prohibit the disclosure of U.S. user data under this exception. After many years of negotiation, the United States has agreements in place with the U.K. and Australia. Hopefully many more are to come.

By excluding an enormous class of user content from the scope of the prohibition, the novel “business purpose” rule makes it far less important for foreign governments to improve their human rights records to qualify for a CLOUD Act agreement. It threatens to stall adoption of a mechanism that, unlike any other on the horizon, has the potential to help address investigation of serious crimes while protecting privacy and liberty interests and reducing the temptation for countries to pass aggressive surveillance regimes and harmful data localization laws.

Congressional Action Needed

While the court of appeal erroneously interpreted the SCA in a manner that guts key privacy protections for billions of users, it did note the calls on Congress to update the law. The SCA has been amended since its original enactment, but little has been done to address constitutional deficiencies, including Fourth Amendment issues that would have been fixed by the Email Privacy Act, First Amendment issues presented by gag order provisions that could be fixed by the NDO Fairness Act, and the due process and fair trial issues that gave rise to this matter.

There are many possible ways to address the issue of criminal defense access to exculpatory information that is currently out of reach, while also ensuring that the privacy of users (some of whom will be victims, and some of those victims minors) is honored. In most cases, counsel for criminal defendants, like any other litigant or prosecutor, will be able to seek information directly from users, leaving the provider out of the mix. This is generally desirable, since the user is in a position to assert objections and privileges and to work out protective orders and other conditions of production, just as if the materials were in the user’s home.

For cases where that path is not available and no other exists, Congress, after careful consideration, could select from a wide range of tools to fill the gap. For example, offered only as a thought experiment, Congress could create a new type of court order available to criminal defendants, with notice to the user and an opportunity to object, and with protections in the form of in camera review of produced material by the court or a special master. Perhaps the prosecution could also be given the obligation, to the extent that ethical or other rules don’t already impose it, to seek exculpatory information using a warrant under the SCA, with the obligation to disclose it to the defense under the familiar Brady standards. No doubt there are many other possible approaches and variants worth deep consideration. The undoing of the SCA is not among them.

As the 40th anniversary of the SCA approaches, it is a good time to think seriously about bringing the statute into modern times, with an eye to what the next 40 years may bring.

Editor’s note: One of the authors, Richard Salgado, is an independent consultant on security and surveillance issues, including the Stored Communications Act, for Meta and others. He has also provided expert witness testimony, including on behalf of Meta. Salgado does not represent Meta or any party in this action and has not offered legal advice regarding this case. Also, Meta provides financial support to Lawfare. This article was submitted and handled independently through Lawfare’s standard submissions process.


Stephanie Pell is a Fellow in Governance Studies at the Brookings Institution and a Senior Editor at Lawfare. Prior to joining Brookings, she was an Associate Professor and Cyber Ethics Fellow at West Point’s Army Cyber Institute, with a joint appointment to the Department of English and Philosophy. Prior to joining West Point’s faculty, Stephanie served as a Majority Counsel to the House Judiciary Committee. She was also a federal prosecutor for over fourteen years, working as a Senior Counsel to the Deputy Attorney General, as a Counsel to the Assistant Attorney General of the National Security Division, and as an Assistant U.S. Attorney in the U.S. Attorney’s Office for the Southern District of Florida.
Richard Salgado was Google’s Director of Law Enforcement and Information Security for 13 years, and a federal prosecutor before that. He teaches at Stanford Law School and Harvard Law School, and provides consulting services on national security, surveillance, and cybersecurity through Salgado Strategies LLC.
