
What Happened to TikTok’s Project Texas?

Matt Perault
Wednesday, March 20, 2024, 1:24 PM

TikTok developed a plan to address U.S. government national security concerns, but it was dismissed without serious consideration. Why?

TikTok App (Solen Feyissa, https://www.flickr.com/photos/solen-feyissa/; CC BY-SA 2.0 DEED, https://creativecommons.org/licenses/by-sa/2.0/)

Published by The Lawfare Institute
in Cooperation With
Brookings

Fourteen months ago, in a conference room in TikTok’s Washington, D.C., office, TikTok CEO Shou Chew briefed a small group of think tank scholars, nongovernmental organization officials, and academics on a plan the company had developed to address the U.S. government’s concerns about national security risks associated with the product. They called the plan Project Texas. 

I was one of the academics in the room (the Center on Technology Policy at UNC-Chapel Hill, which I direct, receives funding from foundations and tech companies, including TikTok). As Chew and other TikTok senior executives walked through slides outlining the details of Project Texas, I typed notes as fast as I could. A few days later, Samm Sacks, a scholar at Yale Law School, and I turned those notes into a detailed overview of Project Texas, which we published in Lawfare.

At the time, neither TikTok nor the U.S. government had offered a public description of the plan. The result was a debate rooted in rumors. Our aim was to substitute detail for rumor but to leave an evaluation of the sufficiency of TikTok’s plans to others. We wrote: 

The debate about the risks posed by TikTok’s operations in the United States should be grounded in the realities of its current and planned operations, rather than in speculation about what those operations might be. Our hope is that with a more concrete understanding of the details of Project Texas, experts and policymakers will be able to debate whether TikTok’s plans adequately mitigate the risks. If not, we hope that they will offer alternatives that will enable both TikTok and [the Committee on Foreign Investment in the United States] to consider specific modifications that will protect Americans’ safety and security, protect users’ ability to use tech products they enjoy, and protect platforms’ ability to innovate.

How naive we were.

Since that briefing in January 2023, Project Texas has barely registered in the policy debate about TikTok’s future. When Chew testified in Congress in March 2023, the most significant discussion of Project Texas was a Texas representative’s criticism that TikTok used his home state in the plan’s name. In an attempt to signal that its promises are not empty, TikTok has implemented many of Project Texas’s features, including transferring U.S. user data to the cloud infrastructure of Oracle, a U.S. company. But the few lawmakers who have raised Project Texas have dismissed it out of hand, while typically neglecting to mention any specific feature that is problematic or identifying possible remedies to address deficiencies. 

Last week, when the House passed the Protecting Americans From Foreign Adversary Controlled Applications Act, a bill that bans TikTok unless it is sold so that it is “no longer being controlled by a foreign adversary,” it was clear that Project Texas has not had the impact on the policy debate that TikTok hoped it would. In retrospect, Project Texas was stillborn. 

Why? 

One view is that Project Texas did not sufficiently mitigate the risks posed by TikTok. Project Texas focused on addressing three main risks: first, that U.S. user data could be accessed by the Chinese government; second, that the Chinese government could influence content distribution on the platform; and third, that there would be insufficient transparency to understand and identify future risks as they arose.

To address risks of data access, Project Texas created a U.S. subsidiary called U.S. Data Security (USDS) to manage U.S. user data. USDS would store data on Oracle Cloud infrastructure, and the new subsidiary would be staffed by U.S.-based employees. Data stored in USDS could flow out of the United States in a limited set of circumstances, such as when a U.S. user messaged someone based outside the United States or posted a video globally. 

To address risks that the Chinese government could influence content, TikTok would house the key content moderation functions within USDS, including the Trust and Safety and User Operations teams. Oracle would inspect source code within USDS, and TikTok’s recommendations algorithm would be subject to review by third-party auditors. 

To address concerns that Project Texas would be insufficiently transparent, the plan included oversight and auditing features to help surface potential risks. In the briefing, we were told that seven entities would conduct oversight of various components of Project Texas, including the Committee on Foreign Investment in the United States (CFIUS), which is responsible for investigating national security risks of foreign investment in the United States; Oracle, the trusted technology provider; a source code inspector nominated by Oracle and approved by CFIUS to conduct an independent inspection of the source code; and a data deletion auditor to verify that all U.S. person data held on TikTok servers prior to the creation of USDS had been successfully deleted. In addition, CFIUS would have broad authority to review the employees of USDS and to approve auditors. 

Lawmakers quickly dismissed the idea that these features could meaningfully combat potential national security risks. Rep. Cathy McMorris Rodgers (R-Wash.), chair of the House Energy and Commerce Committee, dismissed Project Texas as a “marketing scheme.” Rep. Frank Pallone (D-N.J.) called it “simply not acceptable.” Rep. Jay Obernolte (R-Calif.) said that it would not be “technically possible” for TikTok to do what it said it would do.

To some extent, these concerns are justified. A number of press revelations about TikTok’s conduct have fueled the perception of risk, such as reports that the company surveilled journalists. And the reality of the tech sector is that eliminating risk is an impossible fantasy. Companies that devote billions of dollars to security can still be hacked by foreign adversaries. Companies that carefully vet each employee can still be victimized by rogue employees. And companies that store sensitive data still suffer security breaches.

Even if companies cannot eliminate risk, they can minimize it, and Project Texas was designed to do just that. Establishing USDS and staffing it with U.S. employees provided some insulation from foreign influence. Putting responsibility for storing and protecting U.S. user data in the hands of Oracle, a U.S. company, reduced the chances that data could surreptitiously flow to China. Allowing source code review and giving the U.S. government a significant monitoring role reduced the chances that TikTok could threaten U.S. national security without authorities ever becoming aware of the threat. According to TikTok, these features were developed in the process of negotiations with CFIUS, as a means of addressing U.S. government concerns.

Perhaps most importantly, U.S. law would bind TikTok to do what it said it would do: Once TikTok made public representations of its plans for Project Texas, it could be held accountable under Section 5 of the Federal Trade Commission Act in the event that it behaved contrary to its public statements. 

Whether or not these assertions about Project Texas’s potential impact on risk are accurate, perhaps the central promise of Project Texas was that it created a public policy canvas for a debate about the specific tools that could be used to combat national security risk. 

Were seven auditors insufficient? The U.S. government could demand that TikTok add three more. 

Was it problematic to permit U.S. user data to leave USDS in certain circumstances? The U.S. government could require USDS to create a version of TikTok that walled off U.S. users from the rest of the world. 

Did it seem unlikely that a single auditor could detect content moderation manipulation? Empower a broader and more diverse group of auditors—perhaps including researchers who have already demonstrated an ability to review TikTok’s code for foreign influence—to access and review the company’s code and content moderation practices.

Lawmakers never seemed to wrestle with these possibilities, with the House instead choosing to pass legislation that presumes both that TikTok is guilty and that no remedy is possible, aside from a sale or a ban. There is limited public evidence that lawmakers reached those conclusions after considering the alternatives discussed above. And despite repeated calls for national security officials to provide more public evidence supporting their assertions of the risk that TikTok poses, few specifics have been brought to light. 

There are clear downsides to abandoning an existing, deliberative policy process rooted in debating mitigation options in favor of creating a new governing authority that presumes the impossibility of any remedy aside from ban or divestment. First, existing authorities have promise, and pushing them to the side will weaken them in the long run. CFIUS is a process designed explicitly to investigate and remedy national security risks. Similarly, the Commerce Department’s Information and Communications Technology and Services Supply Chain Rule permits the government to review foreign transactions that might pose “undue or unacceptable risks” and to take action to address this risk. Skirting these established processes sends a signal that they are not capable of addressing the risks they were designed to address. 

Second, by creating broad new authority to restrict technology products, Congress is allocating significant power to the president that can be used at any point in the future on issues that have nothing to do with TikTok. Some lawmakers who support the Biden administration’s use of authority when it is applied to TikTok may be less supportive of future presidents using the same authority to force a ban or sale of other companies. 

Leaving Project Texas behind also has other costs. TikTok will inevitably challenge a ban or divestment order, as it has in the past. And courts may strike down a ban or divestment order as a violation of the First Amendment, as they have in the past. Some commentary has suggested that the U.S. government will easily be able to survive judicial review, since the government will be able to show that the national security risks justify the restrictions on expression. But the First Amendment equities here are significant. If that’s hard to see when TikTok is the target, it’s far more obvious if you consider a similar remedy applied to a different company. Could the U.S. government ban the New York Times unless its owner sells the company to a person based in Texas?

Even if a sale goes through, recent history suggests that it is challenging to use a sale process to eliminate Chinese ownership entirely. When a Chinese entity was required by CFIUS to divest its ownership stake in Grindr, subsequent reporting showed that some of the same Chinese investors were able to obtain an ownership stake in the new company. In TikTok’s case, the current debate seems to presume that one Chinese citizen owns the company, and divestment simply means transferring that ownership stake to one U.S. citizen. But of course, the corporate structure of any company, including TikTok, is far more complicated, with a vast number of shareholders owning a stake in the company. Several of TikTok’s current investors are American firms, such as General Atlantic. 

The new legislation also ignores the privacy elephant in the room: Congress has failed to pass a federal privacy law despite pressure from consumer groups, civil liberties organizations, users, trade associations, and even companies. Even if the new legislation helps in some way to produce stronger privacy practices on TikTok, it will fail to protect users’ online privacy when they use any other tech product. As Rep. Maxwell Frost (D-Fla.) said, “We need to regulate social media—not just TikTok—but all of it.”

The absence of privacy legislation may also be a factor that will weigh in TikTok’s favor in the First Amendment analysis. One component of judicial review could be whether the restriction on speech “burden[ed] substantially more speech than is necessary to further the government's legitimate interests.” To the extent the government’s concern is how a company collects, processes, and transfers data, a federal privacy law could advance that interest without imposing as significant a restriction on users’ speech.

Finally, China may perceive the legislation as a threat, and typically, when China views U.S. government action as a threat, it retaliates. China could frustrate the purposes of the legislation by using existing export and antitrust authorities to block TikTok’s sale to a U.S. entity, or retaliate by imposing restrictions on other businesses operating in China. U.S. chip companies have already been targeted for retaliation in response to U.S. export restrictions, and China might take a similar approach in response to steps taken against TikTok. At a time when U.S.-China tensions are high and U.S. tech companies are actively competing against Chinese companies, retaliation could undermine U.S. geopolitical and economic interests.

TikTok’s future is uncertain. The new bill roared out of the House and now awaits consideration in the Senate. It is not yet clear if the Senate will pass it, or if Majority Leader Chuck Schumer (D-N.Y.) will even bring it to the floor for a vote. If the bill passes, the president has indicated he would sign it, but even if he does, it will get tied up in litigation, likely for years. It is unlikely TikTok influencers will lose their influence at the hands of the U.S. government any time soon.

What is far more certain is that Project Texas will not fulfill its promise as a vehicle for an active debate on the policy mechanisms that could mitigate the national security risks associated with foreign-owned technology. And that, to borrow Rep. Pallone’s phrasing, is simply not acceptable.


Matt Perault is a contributing editor at Lawfare, the head of AI policy at Andreessen Horowitz, and a senior fellow at the Center on Technology Policy at New York University.
