Lawfare Daily: Understanding the DC Circuit Court's Decision on TikTok
Published by The Lawfare Institute in Cooperation With Brookings
At a virtual panel conversation co-hosted by Lawfare and NYU's Center on Technology Policy, Center Director Scott Brennen moderated a conversation between Lawfare Senior Editor and University of Minnesota law professor Alan Rozenshtein, University of North Carolina law professor Mary-Rose Papandrea, and Georgetown law professor Anupam Chander about the recent D.C. Circuit decision upholding the TikTok divestment-or-ban law and what it means for the future of both TikTok and the First Amendment.
To receive ad-free podcasts, become a Lawfare Material Supporter at www.patreon.com/lawfare. You can also support Lawfare by making a one-time donation at https://givebutter.com/
Please note that the transcript was auto-generated and may contain errors.
Transcript
[Ask Us Anything Ad]
Anna Hickey: Every news alert in 2024 seemed to bring new questions, but fear not, because Lawfare has answers. It's time for our annual Ask Us Anything podcast, an opportunity for you to ask Lawfare this year's most burning questions. You can submit your question by leaving a voicemail at (202) 743-5831, or by sending a recording of yourself asking your question to askusanythinglawfare@gmail.com.
[Intro]
Mary-Rose Papandrea: I think the government here did at least a convincing job to this panel of emphasizing the danger of TikTok, China, social media, privacy. You know, we've got to act. And the court said, okay, you got it.
Alan Z. Rozenshtein: It's the Lawfare Podcast. I'm Alan Rozenshtein, associate professor at the University of Minnesota Law School and senior editor and research director at Lawfare.
Anupam Chander: But it's also true that covert manipulation by China is occurring on all the apps. Okay, so covert manipulation is occurring everywhere. So now the question is, is this place uniquely subject to this? And then the question is, what is our response?
Alan Z. Rozenshtein: Today we're bringing you a recording of a virtual panel, co-hosted by Lawfare and NYU's Center on Technology Policy, about the recent D.C. Circuit decision upholding the TikTok divestment-or-ban legislation.
Scott Brennen, the Center's Director, led a conversation between me, UNC Law Professor Mary-Rose Papandrea, and Georgetown Law Professor Anupam Chander, about what the decision means for TikTok and the First Amendment.
[Main Podcast]
Scott Brennen: And, last Friday, the D.C. Circuit Court of Appeals upheld a law that will ban TikTok in the United States on January 19th, unless ByteDance, its Chinese-based owner, sells the company to a qualified buyer. How did the court make the decision? How did they approach the First Amendment issues? But more importantly, what happens now? What might TikTok do?
So I want us to really dive into the decision, but before we do that, you know, I want to just sort of set the scene a little bit on how did we get here? So, Anupam, maybe give us just like two minutes, brief history of the government's efforts to ban TikTok.
Anupam Chander: So the government's efforts to ban TikTok, of course, go back to 2020 when Donald Trump, in the middle of the summer, just a couple months before the election, announced that he was going to ban TikTok.
And so, he gave them 45 days, that was the first executive order, and declared it was a national security emergency, and it should be banned in 45 days. And then there was an accompanying order that came shortly thereafter, on a different basis from the Committee on Foreign Investment in the United States, where the president declared that it would be required to be essentially sold, divested, through that investment review process.
Of course, what happened then is that TikTok and TikTok users sued. They sued in Washington, D.C., and they sued in Pennsylvania. The Pennsylvania suit is funny because TikTok influencers were seeking to vindicate their First Amendment rights to influence. The trial court there said, yes, indeed, you have First Amendment rights to influence that are being abridged by this. So, in that case, they actually didn't review the First Amendment issues directly, but they recognized that there were some First Amendment concerns there.
But that case, like the D.C. case, fell based on the statutory exception preventing the president from controlling the flow of informational materials across borders. And so the courts in those cases said, hey, look, you don't have this power under what Congress has done.
So that brings us to 2024, when again a new administration, the Biden administration, now seeks to ban TikTok, working in cahoots of all things with the House Republicans. And so the House Republicans also want to ban TikTok, or at least the Select Committee on the Chinese Communist Party in the House wants to ban TikTok. And so now, if courts found that there was not a statutory power, well, maybe we can give the executive statutory power to do so. And that's why, in March, the House passes a bill; the Senate is resistant. Then the House repasses that bill to ban TikTok, to require a sale or a ban, as part of a general aid package to Israel, to Ukraine, to Taiwan, and also folds in a data broker law at the last minute that bans a very limited amount of data brokerage activity vis-a-vis the foreign adversary countries that are named in the bill.
So, that's the TikTok divest-or-ban law in April. And of course, TikTok then files a lawsuit, and TikTok users file lawsuits. They are confined to the D.C. Circuit Court of Appeals as the court of original jurisdiction in the case; the law says challenges can only be brought there. That's what gets us to the D.C. Circuit ruling last Friday.
Scott Brennen: Amazing. Alright, so, as I said, I want us to really, like, dive into the decision. And, in particular, on the sort of First Amendment side of things. So, Mary Rose, I want to start with you. The decision weighed in, the court weighed in on TikTok's entitlement to First Amendment protections.
Why don't you tell us a little bit about what the court said there?
Mary-Rose Papandrea: Yeah, sure. So, let me first mention at the outset the way the opinion was structured. We had three judges on the panel, of course, and two of the judges were part of the majority. The majority opinion was written by Senior Judge Ginsburg, and the concurring opinion was written by Chief Judge Sri Srinivasan. And so I just want to highlight that because I suspect later we'll come back to talk a little bit more about some of the different arguments that Judge Srinivasan made in his concurrence.
But in the majority opinion, the first thing to know is that the majority rejected the government's argument that TikTok was not entitled to any First Amendment protection at all. Judge Ginsburg said that TikTok was a domestic corporation operating domestically, and that the government had failed to present any evidence that would justify veil-piercing. And so that was fascinating right there. I will say that the whole opinion thereafter is infected with concerns about China.
The majority, and we'll talk more about this, very much bought into the government's concerns that, notwithstanding all the protections and structures that TikTok had arranged, TikTok nevertheless could not be trusted, because of China. And we'll see in Judge Srinivasan's concurrence, he is not quite as sympathetic to TikTok's argument that it is entitled to First Amendment protection.
But, I just wanted to say at the outset: this was a losing opinion for TikTok, but it did win certain things. And the first thing that it won was that it was entitled to some First Amendment protection at all. Because, just to state the obvious, if it weren't, that would be the end of the First Amendment challenge. Maybe we would get to Bill of Attainder or something. I'm not sure what would happen then.
Okay, so that was an important threshold requirement. Then, the next threshold requirement was what tier of scrutiny or what kind of scrutiny. The users, my understanding is they had argued for a prior restraint standard. The majority, unless I missed it, does not give any attention to that argument. Instead, the majority talks about whether it should apply strict scrutiny or intermediate scrutiny.
So strict scrutiny, of course, is the highest level of scrutiny, where you need a compelling government interest and the law has to be narrowly tailored to serve that interest. And with intermediate scrutiny, the government has to have an important government interest, and the government interest has to be unrelated to the suppression of speech.
Judge Ginsburg says that deciding which tier should apply required the application of unclear Supreme Court precedent to a novel set of facts. And he ultimately says you just can't decide which one it should be. On the one hand, the law is content neutral in that it doesn't specifically call out certain types of content to be proscribed. On the other hand, it does, of course, target TikTok, a particular speaker, and so that would be a good argument for strict scrutiny. Ginsburg concludes that, regardless of which level of scrutiny you apply, TikTok loses because the government has satisfied even the heightened strict scrutiny standard.
So I'll just stop there so I'm not talking for an hour. But those were two of the most important threshold issues in this case.
Scott Brennen: Great. As far as the threshold, or the level of scrutiny, Mary-Rose, as you mentioned, in Judge Srinivasan's concurrence he actually came to a different conclusion on the appropriate level of scrutiny. And my understanding, I should have said at the beginning, I'm not a lawyer, so feel free to correct me when I'm grossly wrong on any of these points, is that he felt that intermediate scrutiny applied here.
My question here is, like, does it matter? Why does it matter? Why does this matter enough to spell it out in a separate concurring opinion?
Mary-Rose Papandrea: Okay, well, first, I'm going to attack your premise that you are not a lawyer. You do a good job playing one. So anyway, but, but I would say well, the concurring opinion is full of very interesting arguments that could end up playing an important role before the court.
Just to flesh out how Judge Srinivasan reached a different conclusion: first, he takes the two government interests that are asserted, one to protect data and another to prevent China from covertly manipulating the content that users see. The second one, about content moderation, I should have mentioned them in the reverse order, he places that activity in China and says that there is no First Amendment protection for that activity. So that is, you know, fascinating right there, whether, if the Supreme Court does take this case, or the en banc court I suppose, a subsequent court would take that view of bifurcating the government interests, placing one activity in China and the other, the data protection.
So, Judge Ginsburg was not willing to put the content moderation activity in China. And so he says, in the majority, that even though the data protection part does appear to be content neutral under any real reading, the law is infected with this content moderation justification, and so you cannot so readily disentangle them.
For Judge Srinivasan, because he is able to focus solely on the data protection side of things, he is more readily able to apply an intermediate level of scrutiny. And for, you know, why he writes that opinion, I mean, this is something, first of all, he just disagrees. He has very good reasons. I'm not weighing in one way or the other on whether I think he was right, but it's a very thoughtful opinion. This is a very difficult case. And I think he probably guesses this isn't the last we'll hear of this decision, whether it goes en banc or to the Supreme Court, so he wanted to get his thoughts out.
Alan Z. Rozenshtein: Yeah, just to quickly follow up on that. Yeah, I agree with Mary-Rose. It's hard to know sort of why any particular judge writes any particular opinion. It's a little ironic that Judge Srinivasan wrote this opinion because in some ways it's worse for TikTok, because it actually applies a lower level of scrutiny. But he was, I think, by far really the only judge who was even remotely sympathetic to TikTok's position at oral argument.
You know, I think what may be happening here is kind of an intra-judge or inter-judge debate about how to apply tiers of scrutiny generally. I suspect that maybe what Judge Srinivasan is worried about is that the majority's application of strict scrutiny is a little watered down. And he does not want that to become how strict scrutiny is used in the future. He's not particularly sympathetic to TikTok, so he has no problem if TikTok loses in this case. But he would like it to lose under a lower standard, so that in the future, when he actually wants to rule for a particular plaintiff and he wants to invoke strict scrutiny, he can say strict scrutiny matters a lot. It's a very demanding standard.
And so I think what might be happening, and this is why he concurs in the judgment, is that he's happy for TikTok to lose, but he'd rather it lose under a lower standard to preserve the potency of the higher standard for future cases. Now, if you're a constitutional law scholar, this is actually very important. If you're TikTok, it's totally irrelevant, because at the end of the day, you're losing no matter what. So this is why I think we should just view this opinion as a 3-0 demolishing of TikTok. That's kind of, I think, the between-the-lines overview of it.
Scott Brennen: Right. Cool. Well, I want to dive more into their actual analysis and their strict scrutiny. So, my understanding is that in order to pass strict scrutiny, you have to show both that the government has a compelling interest and that the provision is narrowly tailored. So I'm going to break these apart and look specifically at both sides of this.
So, on the compelling government interest, whether the law furthers a compelling government interest: the decision observes that there are really two different compelling government interests. One, Mary-Rose sort of said this already, relates to data collection; the other to covert content manipulation. So I wanna start with the data collection. I wanna really try to identify what exactly is the concern here, and what exactly did the court say?
Anupam Chander: Sure. So, the court followed the government, which said essentially that there were a lot of concerns that Chinese government might be able to access data of Americans via the app. It cited a variety of sources that said, hey, data can still arrive in China, even under the National Security Arrangement that TikTok has tried to enter into with the U.S. government, and therefore, the app just can't be made safe in this way.
Now, I'll leave my criticisms to the side. Mind you, much of this criticism could be said of Facebook, Twitter, et cetera. There's no way to secure all the data from arriving in China. So, there's open questions about all that. And there's a huge number of protections built into the National Security Arrangement that TikTok tried to enter into that aren't available with any other service existing in the United States. And so, I think there was a lot of concern.
The examples used here were, the government said at oral argument, hey, you might be able to identify possible spies, so the Chinese government will be able to target some people as particularly, kind of, useful assets in a future, you know, spying operation. Now, I might say you might be able to use TikTok to do that even if you don't own TikTok, because you can probably see what people are doing on the app in a variety of ways, because a lot of what people do on the app is public. You just can't see their scrolling behavior. So it's a very limited dataset that is not available to any third party who wants to use the app.
So I'll leave it at that, but there's a lot more to be said about that.
Mary-Rose Papandrea: Am I wrong, or was there also some suggestion, it may have been in Srinivasan's opinion, that there are children now who might become government leaders later and that they could be blackmailed or something like that?
Anupam Chander: The blackmail thing is, you know, one of the key questions throughout, the government has been saying this blackmail possibility.
And I'm not even sure what exactly that blackmail stuff is, because if kids are doing things on the app, that could be seen by someone else already. So the app is mostly a public facing app, so it's really just the browsing behavior that you might have on the app that's potentially private information.
The opinion mentions search histories and browsing history. What it really means is searches within the app and browsing within the app not your Google searches, etc. In 2020, that was kind of intentional obfuscation by the Trump Justice Department to talk about search histories and browsing histories.
In 2024, the Justice Department avoided that language, because it really lent itself to this misinterpretation of what exactly that meant. But the court unfortunately cites the 2020 arguments of the Justice Department.
Scott Brennen: So this is sort of a theme that's going to kind of run through the conversation, I think. But my understanding is that we don't actually have any public evidence of this blackmail happening, or TikTok data, user data being used to support intelligence operations in the U.S.
Anupam Chander: We already know that the government says, even in the private data, that it has no evidence that China is manipulating anything on the app or that the Chinese government is getting information from the app.
And so the government's own filing makes that clear. The government says: we have no information that there is this manipulation, either to surveil or to promote propaganda. And so, that means that even in the secret information that we don't have, that's not there.
Scott Brennen: In the classified briefings that the lawmakers had before they passed it. Yeah, Alan, you want to weigh in?
Alan Z. Rozenshtein: Yeah, no, so I think Anupam is right about that. And, you know, if there was this information, we would know about it one way or the other. Though it is notable, I just think, that the court goes out of its way to say that it's ruling entirely on the public information.
Maybe that's because the public information is damning enough, or maybe because the classified information was actually a bit of a dud. We don't know, obviously. You know, I think this gets at another theme of the opinion, and that is, you know, how much evidence, how much concrete evidence does the government have to show to establish its concerns, either about the data privacy concern or about the content manipulation concern?
And the court and I'll put my cards on the table, I think correctly, though obviously others disagree, says, look, we're not going to require the government to show evidence of the specific harm already happening before we're going to let the government take actions here. And I think part of this is sort of a theoretical point about when the government should be able to act. I think part of this also is just, you know, the fact that the judges are people too, and they read the news.
And they are aware that although China may not be doing the very specific thing that the government, like right now, that the government is concerned about, in the past it has done very similar things and it clearly has the capabilities to do the thing. You know, so, you know, I like to joke, it's not a joke actually, I like to point out that I have three or four lifetime, you know, credit monitoring systems in place because of the sheer number of times the Chinese government hacked my data when I was in the government.
And so, you know, the government nicely gave me a lifetime credit monitoring voucher, which is great, I guess, you know, and similarly, you can look at sort of other situations in the past where China has gotten very allergic about how this or that issue is portrayed on the internet and has taken unbelievably heavy handed actions with respect to foreign companies or its own companies.
Which is to say, you know, if you're TikTok, you're going to say, look, there's no smoking gun here, which is true. And if you're the government, you can say, okay, well, there's no smoking gun, but there's a gun. It's on the table. It's loaded. It's pointed at us. And we know that the shooter has shot the gun in other circumstances. So do you want us to wait? This is not, I think, an empirical question at the end of the day. Both sides are correct here.
The question is, as a legal matter, how much do we want the government to have to wait? And you know, in a case like this where the political branches acted, and Anupam is right, you know, the Senate resisted a little bit, but at the end of the day, right, this was quite bipartisan and quite overwhelming, at least as far as bipartisanship goes in D.C. in, you know, 2023, 2024. How much are we going to prevent the political branches from acting, as the national security people say, to the left of boom?
And at least in this case, the court clearly was willing to say, look, there's enough evidence here that we're not going to wait for the actual thing to happen because then it will be too late. And we don't want to take, frankly, we don't take responsibility for that, which is, you know, really what's happening in most of these national security cases.
Scott Brennen: Yeah, Alan, I think that's a really excellent point. I think at one point the decision cites precedent saying that the government need not wait for a risk to materialize before acting on it.
But, if I'm understanding your argument, it seems to hinge on there being a very direct relationship between TikTok, ByteDance, and the PRC. Right? Like, if we have some clear sort of indication that the Chinese government has engaged in problematic surveillance activities through similar data sets, if we're going to claim a need to act on an immediate threat, that presumes that there is a direct connection between TikTok and the Chinese government.
So, I mean, is that right? What do we know about that connection? I guess that does all lead up to, yeah, this question.
Alan Z. Rozenshtein: Yeah, so I think a lot depends on what you mean by direct, right? And again, a lot of this is how you frame, how the advocates frame the story, right? If you're TikTok, you're going to say, first of all, we're TikTok, we're a U.S. company.
Sure, we're owned in some way by ByteDance, but ByteDance is a Cayman Islands company, right? And even if ByteDance, you know, 21 percent of ByteDance is controlled by a Chinese national, that's a person, that's not owned by the Chinese government. And so you have to layer on these many, many levels, right? And the government hasn't shown that. Okay. So that's what TikTok is going to say. And all of that is true.
The government's going to say, are you kidding me, right? There is no such thing as a private company in an authoritarian police state like China, right? That's actually, there isn't even anything like that in Chinese law. Because under Chinese law, every private company is, it's not quite state-owned but it has obligations to the government, of which there is no parallel, right, in a liberal democracy.
And also, who cares what's in Chinese law. I mean, are you kidding me? Does anyone really think that if Xi Jinping wanted to make ByteDance do something, he'd be stopped by that? And as to the relationship between ByteDance and TikTok, there's plenty of public reporting that's shown that there's much more porousness, let's say, between TikTok and ByteDance on the engineering side, on the policy side, on the content moderation side, than TikTok would like to admit.
And so, if you're the government, you're going to say, that's enough of a direct connection or a direct line of influence given the national security interests at stake, right? So again, it's not that either side is wrong here. It's just that there's a value choice here that you just have to decide one way or the other as to, you know, what are you going to, how much directness and imminence are you going to demand of the government? This is true for the data privacy concern. It's true for the content, the content manipulation concern. It's true for the mitigation and Project Texas, which we can get to. It's true of all of that. And it's just at the end of the day, the court rules one way versus the other way.
Anupam Chander: So, a number of things. First, there is a loaded gun. That's the framing, right? And Alan knows as well as anyone else, he just told us, he has been subject to Chinese exfiltration attempts. That is on American computers, on American servers, on American companies. So, no one is saying that China is not engaged in attempts to surveil and propagandize the American population.
By the way, the United States is engaged in that surveillance attempt with respect to China as well, right?
Alan Z. Rozenshtein: I certainly hope so.
Anupam Chander: Yeah. So by the way, it's also engaged in fake propaganda efforts. Look at the Reuters report that I want everyone to Google: Reuters United States Defense Department and Chinese vaccines in the Philippines. So we have been engaged in foreign disinformation campaigns.
Okay, so be that as it may, the U.S. and China are engaged in these massive attempts to infiltrate each other's networks. By the way, the Snowden revelations said that we infiltrated the Tsinghua University networks. Okay, that is, imagine China having infiltrated Harvard University's networks in the United States. So, lots of efforts all over the world.
The question is whether TikTok is the right venue for this national security vector, this threat vector, and it really is such a bizarre space for us to focus on. If you look at Mike Gallagher, what Mike Gallagher said was that he was less concerned about privacy or data exfiltration. He was mostly concerned about propaganda. That's what he told the New York Times, okay? So the whole thing is actually much more focused on the propaganda portion of this. And you hear congressmen again and again, during the course of this, not stray comments as the D.C. Circuit erroneously characterized them, explaining what they were worried about. They explained that they were worried about the propaganda: what TikTok was teaching American kids in particular.
Mike Gallagher, in November 2023, came out and said, hey, look, you know, as an official pronouncement from that Select Committee on the Chinese Communist Party, he said, American youth are siding with the Palestinians, not the Israelis, despite the terror that Hamas has wreaked on Israel. Why is that? It's because they watch TikTok. And why does that matter? Because TikTok is controlled by the Chinese Communist Party and it's pushing the Chinese Communist Party agenda.
That was the main motivation, if you look at the way that this got over the hump in March and April. And so this is a law that is focused on the fear that Americans will receive Chinese Communist propaganda that is already pushing children in one direction. And so, I think we should really recognize that, because if there is anything that is a red flag for the First Amendment, that should be a red flag for the First Amendment.
Mary-Rose Papandrea: Not to pile on Alan because that's not my intent, but I wanted to make two points and just to just build on what's already been said.
So, the first is that the rationale for going after TikTok is just very sloppy, and it's an amalgam of a whole bunch of things. Part is just social media generally. There's a lot of anxiety about what social media is doing to young people. So that's part of it. Part of it is this data concern, which is not entirely thoughtfully expressed: exactly what would China do with the information that isn't already public, or exactly why doesn't the government trust the TikTok U.S. data security entity that was set up to manage data security.
You know, the D.C. Circuit just over and over says the government doesn't trust China. And that gets back to my first point: you get social media, you get data privacy, you get China, and it's just like this big ball of concerns. But typically when you see strict scrutiny, the government should be required to articulate the compelling interest with more clarity. Often there is deference given to the government on whether there is some sort of danger, like in the Holder v. Humanitarian Law Project case, but this opinion was somewhat remarkable in the dramatic amount of deference.
So that brings me to my second point, which is that, as soon as you start focusing on national security and China, which is clearly one of our, well, I don't know as a matter of law, but you know, our enemies right now, that we don't trust China, that Congress doesn't trust China, the court defers and says, over and over again, that this is a matter of judgment for the, you know, experts, Congress and the president, notwithstanding that Congress didn't make factual findings as it might usually do, notwithstanding that there's some confusion exactly on why it was passing this law.
Notwithstanding that the dangers are speculative. And I'll just say, for those listeners who are familiar with FOIA, it's similar: the D.C. Circuit hears a ton of FOIA cases, and we have quite often seen the same court, not necessarily these particular judges, I'm not saying that, but just in general, deferring to the government's claims that releasing certain types of information could cause harm, even when that harm is very speculative.
It's just that, and Judge Ginsburg says this specifically, the courts have limited competence to judge the national security risk. So I think the government here did at least a convincing job to this panel of emphasizing the danger of TikTok, China, social media, privacy, you know, we've got to act. And the court said, okay, you got it.
Scott Brennen: Yeah, Mary-Rose, your point about the sort of sloppy justification I think is well taken, but I do want to, like, really try to zero in on exactly what the decision claims and finds as it concerns a compelling government interest. So I want to come back and really focus on what Anupam brought up, which is the second sort of compelling government interest, around covert content manipulation. The concern about covert content manipulation, which, right, would be the propaganda side.
So I think, to me, what really stood out here is the moment when they said they're less concerned about content manipulation as such than about it being covert. So, I don't know, Alan, if you want to, I saw you nodding, so if you want to weigh in here, help me make some sense about what's going on here.
Alan Z. Rozenshtein: Yeah, so I think, to me, the most striking part of this opinion, which is related to the point, is when the court argues that one of the reasons that this law is okay is because it's actually furthering First Amendment interests, which might sound a little Orwellian, but I think it's worth unpacking.
So, this case presents itself most naturally as a case pitting national security versus the First Amendment. And obviously, in a large part, that is what this case is about. But, I've always thought, and I was pleased to see this in the court opinion, that this case is more complicated than that. Because there is a kind of intra-First Amendment issue here, or maybe an intra-free speech, free expression issue, right? Which is that, on the one hand, you have the free expressive interests of 170 million Americans to use the platform of their choosing. And that's about as weighty of a free expression interest as you get.
On the other hand, the concern with respect to content manipulation, and specifically covert content manipulation, is that you are going to have a potential for a foreign government to distort the very free, expressive environment that TikTok is claiming to uphold. Now let's put aside this question of whether that concern is speculative, right? That's a different issue. But let's assume for a moment, that is in fact going on, right? Or that would go on, for example, if the U.S. got into a shooting war with China over Taiwan, let's say, right?
So there, I do think you have two free expressive interests butting heads a little bit. You have the expressive interest of people who want to use the platform they want to use. Then you also have the free expressive interest of a platform and a speech environment not being distorted. Now, this distortion concern used to be a really important part of First Amendment doctrine, right?
Courts used to care. They used to credit arguments from the government that it, that the government can sometimes intervene in sometimes very heavy-handed ways in a speech environment to prevent distortion. This was especially true with campaign finance regulations. In fact, the entire campaign finance infrastructure, or the campaign finance law infrastructure, is, kind of, based on this assumption that sometimes the government should be able to limit the speech of some, let's say, the very powerful, the very rich corporations, in order to create, on the whole, a better speech environment, right?
Now, the court very much turned away from this in cases like Buckley v. Valeo, and then maybe most infamously in Citizens United. But there's actually always remained an anti-distortion rationale running through the court's opinions, especially, perhaps exclusively, when it comes to foreign influence.
And so, I think the very year after Citizens United, there was another case, and I apologize, I forget its name, but my esteemed panelists, who have forgotten more about the First Amendment than I will ever know, should just, you know, correct me, out of the D.C. Circuit, in which then-Judge Brett Kavanaugh upheld a very Citizens United-like restriction on the ability of foreigners to contribute to U.S. campaigns, and the Supreme Court actually summarily affirmed that, right?
And so I tend to think that the anti-distortion rationale for government intrusions into otherwise First Amendment-protected activity is underrated. I think it's a good thing. And so I'm always happy to see a court be willing to resuscitate that, or at least keep it alive.
Now, it's true that it's unfortunate if it is only limited to foreigners, right? Because you can have distortion not just from foreigners, right? You can have distortion within the United States. But I think what's happening in this case is the court is taking very seriously that there's an anti-distortion rationale for what the government is doing, and therefore transforming what would otherwise be a pure national security versus First Amendment case into a kind of battle within the family itself, right, between different elements of the First Amendment. And I think that's a big reason why the court is comfortable. And that's why the fact that the concern is about the content manipulation being covert is so important.
This is why I think the court, correctly, does not think that the Lamont case, which is a case from the 60s in which the court, correctly in my view, struck down a law that would have made it marginally harder to get, coincidentally, Chinese propaganda into the United States, controls here. That's why the court says this case has nothing to do with that precedent.
Because in that case, it wasn't covert that you were reading Chinese propaganda. You were reading literally what was called the Peking Review. The concern here, again, is not Chinese propaganda. The Chinese can continue to propagandize on TikTok after the divestment. They can have a Ministry of Foreign Affairs account. What they cannot do, right, is covertly and opaquely change TikTok to further Chinese geopolitical interests.
Mary-Rose Papandrea: It would be fun to talk a little bit more about this interest, and I think Alan is exactly right to highlight this. It is fascinating, and that part of the court's opinion where Ginsburg says this is actually furthering First Amendment interests is amazing. And so, I would like to think more about the covert part, because that was a big move, to limit it to covert manipulation. So overt manipulation, I suppose, is not as problematic; put that to one side. Why is covert more problematic?
Then we have to start thinking about whether we do not trust listeners and people who are receiving that information in the United States to evaluate that information and act, just as we all have to do on a daily basis as we are besieged with all sorts of false speech and misinformation and disinformation. We are constantly navigating that.
So I think it's worth thinking a little bit more, and I don't have an answer to this, but I don't think it's quite as easy to just leave it at, oh, there's covert manipulation, therefore that is an evil, and think a little bit more about why is that an evil? And are we willing to credit that? What does that say about our views on, well, just say Americans, not that users in the U.S. are all Americans, but let's just say Americans on their ability to deal with this covert manipulation. What exactly is happening?
And then just a little side note is that, and I think it just has to be mentioned at some point in this hour, that so much on TikTok has nothing to do with China or Taiwan or any other geopolitical issue. And has to do more with ridiculous products and cat videos and all sorts of things where there's, I don't see much of any danger, like any real danger, like, oh, I'm going to watch too many cat videos. You know, you'd have to be more specific about what exactly is that danger. So, again, this isn't to disagree so much with Alan, just to say, be worth thinking more. And I appreciate Alan highlighting that very interesting part of the court's opinion.
Anupam Chander: And let me just also extend Alan's argument in a particular way.
He pointed out that even after any divestiture, you can still have Chinese propaganda through, you know, Chinese accounts, et cetera on this app. That's, of course, true on all the apps, etc. But it's also true that covert manipulation by China is occurring on all the apps. Okay, so covert manipulation is occurring everywhere.
So now the question is, is this place uniquely subject to this? And then the question is, what is our response? Okay, so I think Mary-Rose has rightly pointed out, you know, the question of paternalism here. We don't normally tell Americans what they aren't allowed to hear. Maybe we'll require labeling, you know, maybe we tell Americans, hey, this is a, you know, website or an app brought to you by a company that is, you know, ultimately owned by another company that is headquartered in Beijing. Fine.
But we know that what happened in 2016 is that the Russian Internet Research Agency covertly manipulated Facebook. We didn't end up requiring Facebook to then turn over the keys or stop operating, in order to protect the First Amendment rights of Americans from distortion from abroad, right? And furthermore, finally, the court in this case really ignored, really very much neglected, the immense protections of the National Security Arrangement that TikTok and the U.S. government had negotiated for many years, and then simply said, oh, that wasn't safe enough without telling us exactly what they wanted.
So they point to the fact that the National Security Arrangement allowed for some data to flow to China, but that was largely because first, you need to know some information about someone to figure out whether they're American. Okay, so you need to have data flow. Let's say you're opening the TikTok app in China or something. You need to make sure that the person's American or not. That means that data has to flow to that place to identify the person as American.
Two, you might have scams that are global in nature and you need to be able to collect that information for anti-fraud purposes. It was very clearly, specifically hammered out, negotiated with the U.S. government what exactly those, you know, channels of data flow to the foreign world would be. And so, instead, the government and the court simply say, oh, it didn't prevent data from flowing to China.
You know what? Your data is flowing to China on Facebook, on Instagram, on everything. The way to deal with this national security problem, ultimately, on the logic that the court agreed to, is to shut down the internet generally. Everything on the internet is in danger of being covertly manipulated or surveilled. You know, we know that this network might be hacked by the Chinese government. We saw AT&T and the other telecommunications networks, you know, revealed to have been hacked by the Salt Typhoon group last week.
So, the targeting of TikTok in this case: you can spell out a national security threat, that's what the government did, and that's what the court deferred to. But we really have to say that this is a peculiar place to focus, and I think it's just been a huge distraction by the Biden administration, by the Justice Department, by the intelligence services for many years. Instead, hey, why don't we lock down our telecommunications networks so they're not being hacked by the Chinese government?
Scott Brennen: So I do want to make sure that we have time to talk about what happens next for TikTok, what happens next, potentially in the Trump administration, but really quickly, Anupam, I think you're really gesturing towards the second part of their analysis, right? In terms of is the law narrowly tailored?
And that seems, their analysis seems to hinge on how Project Texas, which is the national security arrangement that TikTok had proposed and had apparently started to enact, fails to address the compelling government interests. And you already sort of mentioned how the opinion sort of says that Project Texas fails, but I want to make sure that we, that because this is so sort of central to the argument that the law is in fact narrowly tailored, that we address this. So, is there anything else that we should know about how the court thinks about Project Texas?
Anupam Chander: So let me just explain Project Texas to people. It means that Americans' data and the algorithm are held on Oracle's servers. They're compiled and deployed on Oracle's servers vis-a-vis Americans. And they are held on Oracle's servers but controlled by TikTok U.S. Data Security, a new company, which is run by three people who are subject to Joe Biden's, or in the future President Trump's, personal approval.
And so, that is the whole management system of the app and the data is subject to people who are supposed to have effectively national security clearances running your data. So, in order to exfiltrate that data, as the government worries, you have to now lift it out beyond this incredible array of national security constraints.
So, they talk about social security numbers being leaked to China, etc. They don't mention that it was simply a business operation, because it was a list of the top creators on the app. Which, you can imagine, in a large company you might want to know how this company is making money, and that might have included the social security numbers of those top creators.
But, you know, my social security number is not even known to TikTok, etc. And so, the way that the government kind of generated fear, and the court adopted it, is really problematic. And it's far from strict scrutiny, it's far from kind of narrowly tailored in this case.
Mary-Rose Papandrea: I would agree with Anupam. The court of course says it's applying strict scrutiny, saying the law does meet strict scrutiny even if intermediate scrutiny is ultimately the right level, and it is incredibly deferential. The mechanisms that TikTok has put into place through extensive negotiations with the United States would seem to obviate most of these concerns.
But again, just going back to our earlier discussion, the court says that, the government has said it does not trust TikTok, it does not trust ByteDance, does not trust China. So, notwithstanding these mechanisms, it just doesn't really matter.
Scott Brennen: Alan, what happens next? You wrote a great piece for Lawfare recently about what happens if this decision doesn't go TikTok's way. What might the Trump administration do? What might the Supreme Court do?
Alan Z. Rozenshtein: So, let's talk about the Supreme Court and then we'll talk about Trump. So, TikTok has already said that it's going to appeal to the Supreme Court. The Supreme Court does not have to take the case. I suspect they will, just because this is really important. The Supreme Court seems to care a lot about digital First Amendment issues. I suspect that they would want their word to be the last word on this. So I expect the court will grant cert in this but we will see.
In the meantime, TikTok really needs someone to stay the law, because the law goes into effect on January 19th. I suspect TikTok will find someone, whether it's the D.C. Circuit or the Supreme Court, to do so out of an abundance of caution; given that this law has already been sort of pending for 270 days, what's another six months on this? So I suspect someone will pause the law, and therefore TikTok will be able to continue operating normally at least until the middle of the summer.
So I think most likely the Supreme Court will take the case and pause the law. We'll have to wait until the middle of the summer. We'll have to see how it all goes. I suspect there's not five votes, frankly, to save TikTok from this law. I think the fact that you had a sort of brutal loss for TikTok in the D.C. Circuit, with a very cross-ideological panel of judges, you know, an Obama appointee, a Trump appointee, and a Reagan appointee, right? So that tells us something about how the Supreme Court will rule, but we'll have to wait and see.
In the meantime, what is it, if anything, that the Trump administration can do? It's a bit head-spinning because, of course, Trump, as Anupam described at the beginning of our talk, was the first person who tried to ban TikTok. Then, during the campaign, he changed his tune. He came out against the ban. Not clear why. Maybe because he was trying to play to the youth vote. Maybe because one of his big funders and supporters is a ByteDance investor and therefore does not want this law to go into effect. It's unclear.
It's also unclear whether Trump still wants to help TikTok. He had an interview over the weekend, I forget with which network, in which this issue came up, and he did not say terribly optimistic things about TikTok. You know, Trump is hard to know, right? It's a black box and it all depends on the last person he talked to. So if you talk to, you know, Jeff Yass, he might want to save TikTok. If you talk to Marco Rubio, his new secretary of state, who's like a notorious China hawk, he's not going to want to help TikTok. But let's assume he wants to.
Anupam Chander: If you talk to Mark Zuckerberg also.
Alan Z. Rozenshtein: Yeah, exactly, right? It's all very unclear.
But let's assume he does want to help TikTok. What can he do? The thing he has to do, and this is, I think, really important, he has to help TikTok, but the people he has to convince are actually Apple and Google and Oracle. These are the people that the law goes after to a potential tune of $5,000 per user and with 170 million users, that adds up very quickly.
These are private companies. They can generally choose not to do business with whomever they want. So he has to convince them that they'll be safe. And he doesn't have a lot of great options. He could get Congress to repeal the law. I don't think he'll be able to do that. He could tell future Attorney General Pam Bondi not to enforce the law, but again, if you're the general counsel of Apple, does that really give you a lot of comfort given how mercurial Trump is?
The third thing he can do, and I suspect this is what he will try to do if he ultimately wants to help TikTok, which again, I want to emphasize, is a total unknown, is simply declare under the law that there has been a qualified divestiture, even if there hasn't. And the reason he might be able to do this is because, in defining what a qualified divestiture is, the law says the president shall determine, after an interagency process, that the factual predicates are met, you know, that TikTok is no longer controlled by ByteDance or a Chinese entity or some such.
Now, that language is somewhat ambiguous. Does it simply require the president to make that determination before TikTok is okay? Or does it give the president a lot of discretion, or something in between? We don't know. And it's actually not even clear who exactly would have standing to challenge that.
So it's possible that, if Trump wants to help TikTok, he will, sort of, get ByteDance to shuffle some papers around, you know, sell some assets from here to there, and then say that gives them enough legal cover to just announce that a divestiture has happened. And then hope that gives the general counsel of Apple—whom I do not envy, though that's why they get paid the big bucks, presumably—enough comfort to go to Tim Cook and say, okay, Tim, I think we can do this. I can't guarantee it.
But, you know, especially if you want to help Trump, which these tech companies might want to do, that might just be enough. But I cannot emphasize enough how dicey all of this is, because I am very doubtful that the Supreme Court will come and save TikTok.
And I'm, I know that Anupam and Mary-Rose think very differently on legal issues than I do, but I'm curious if they disagree with my sort of descriptive prediction as to the Supreme Court here because I suspect the courts are not going to save TikTok here. I think that only Trump can. But I'm not even sure he can, even if he wants to, which I'm also not sure. So, generally, I'm not sure.
Anupam Chander: I'm plus one on everything Alan just said.
Mary-Rose Papandrea: Wow, that's, I don't know where I am. I haven't…
Alan Z. Rozenshtein: Anupam, can I get a certificate suitable for framing on that, please? To hang in my office.
Mary-Rose Papandrea: I think it could be interesting before the court, I mean, just to come back to Judge Srinivasan's concurrence, you know, the one way that TikTok could lose is if the court goes that way.
I think if the court buys that TikTok is a domestic company, you know, that the regulated entity here is a domestic company operating domestically, I think it becomes much trickier, notwithstanding the courts' traditional deference to the executive and, you know, congressional branches on national security assessments.
That is, of course, the one caveat, a huge one, that I would place, but I'm not exactly sure. I think it could depend on how well the case gets briefed, what kinds of amici support come to bear to explain the potential ramifications of the court's decision, because I think there are some real problems with trying to cabin this decision. Particularly if we do accept that the regulation is impacting a domestic corporation, then what does that mean going forward? I think the court might be concerned about those ramifications.
Anupam Chander: Just want to say there's one additional thing that we haven't discussed, which is there is a focus by the D.C. Circuit on the divestiture portion of this law and not the ban portion of this law.
The law begins with a ban and it says you don't have to ban if there's a divestiture. That's the way the law is structured. By the way, TikTok said this is a ban law. And the bill's sponsors literally, and I'm not kidding you, referred TikTok to the FTC for falsely describing this as a ban law, saying, no, this is a divestiture law.
They were in fact themselves engaged in lying to the public because they kept on saying this is not a ban, this is a divestiture. Well, we may well come to see that it becomes a de facto ban because this company can't really divest itself for lots of different reasons. And the court deals with a lot of the questions by simply saying, oh, there'll be a new owner.
Okay, that's how the court repeatedly deals with the First Amendment problems: by saying, no, no, no, everything will be fine, because someone else will run this app, and Americans won't actually have to move. And, it says, if they have to move, it'll only be temporary. So that's how the court repeatedly insists upon this.
It's very clever, both by the government and then by the majority, in kind of minimizing the actual harm, which is, this app's going to shut down on January 19th, and Americans are going to be left trying to find some other way to speak with each other.
Scott Brennen: Why can't TikTok divest? Or ByteDance divest?
Anupam Chander: Okay, so ByteDance says there are business reasons, there are legal reasons, and technical reasons. So, the technical reason is that it is extremely complicated to create an American TikTok that is somehow unrooted from everything else. Okay.
Second, business reasons. Well, it is also very hard to sell an app which is only speaking to America. It can't really speak to the rest of the world. You know, how are you supposed to get information from the rest of the world? You have two apps, one in Canada and Mexico, and one in the United States, and they can't talk to each other. The law actually prohibits ByteDance from having an operational relationship with a successor company.
Third, there are legal reasons, which is that China has an export control on recommendation algorithms. And everyone expects that to bite here, so ByteDance can't sell the algorithm. Steve Mnuchin, the former Treasury Secretary, who was once for a ban and now wants to buy TikTok, has said, oh, he'll buy it without the algorithm and recreate the algorithm.
Boy, if that isn't a kind of, you know, direct impact on speech, I don't know what is. You know, now Steve Mnuchin owns it and he's gonna write the algorithm. That changes things for TikTok's users and American speakers.
Scott Brennen: Well, we're going to have to end it there. Our time is up. I really want to thank all three of you for the fascinating discussion, and I'll say that at the Center on Technology Policy, we organize briefings or conversations like this regularly, and maybe we'll do a part two when we have a Supreme Court decision or a miraculous Trump action down the line.
Alan Z. Rozenshtein: The Lawfare Podcast is produced in cooperation with the Brookings Institution. You can get ad-free versions of this and other Lawfare podcasts by becoming a Lawfare material supporter through our website, lawfaremedia.org/support. You'll also get access to special events and other content available only to our supporters.
Please rate and review us wherever you get your podcasts. Look out for our other podcasts including Rational Security, Chatter, Allies, and the Aftermath, our latest Lawfare Presents podcast series on the government's response to January 6th. Check out our written work at lawfaremedia.org. The podcast is edited by Jen Patja. Our theme song is from Alibi Music. As always, thank you for listening.