
What the Oversight Board Should Do With Trump

Benjamin Wittes
Tuesday, May 4, 2021, 6:42 PM

Can Facebook’s independent overseers make an easy case hard?

 

Scott Beale/Laughing Squid, laughingsquid.com (https://flic.kr/p/2vbhJP)

Published by The Lawfare Institute in cooperation with Brookings

Tomorrow morning, May 5, the Oversight Board will issue its opinion in the case of Facebook’s indefinite suspension of Donald Trump in the wake of the Jan. 6 insurrection. 

The opinion will attract enormous attention, whatever the board ends up doing. If it overturns Facebook’s action, it will provoke grave concern among those who have been demanding more rigorous content moderation on the platform. If it upholds the suspension, it will provoke howls of rage from Trump supporters and those who believe that social media is biased against conservatives. Let Trump back on, and the comparatively peaceful social media ecosystem that has prevailed over the past several months could suddenly grow cacophonous again as he blitzes Facebook with election lies and attacks on his opponents. Keeping the former president in Facebook detention, by contrast, means the board accepts that social media companies have the power to silence elected world leaders, at least on their platforms.

And then, of course, the Oversight Board could punt—thus irritating everyone. There is some ambiguity as to the finality of Facebook’s decision concerning Trump. And there is ambiguity as well about Facebook’s rationale: precisely which rules Trump violated and how. So one could imagine the Oversight Board donning an administrative law hat and kicking things back to Facebook for further review.

It’s easy to see why Facebook would want an Oversight Board decision on this matter. Assuming it gets a clean decision, it gets to comply with someone else’s edict, rather than making and being accountable for a decision of its own. If people are mad that Trump is back, they can be mad at the Oversight Board, not at Facebook. If they’re enraged that some group of law professors, politicians and pointy heads are silencing Trump, they can direct their rage at the board too. Either way, Facebook is just following the rules it set up, and hey, that is what constraint on power looks like. As Facebook put it in a statement positively dripping with enthusiasm for the top cover the Oversight Board offers it, 

Whether you believe the decision was justified or not, many people are understandably uncomfortable with the idea that tech companies have the power to ban elected leaders. Many argue private companies like Facebook shouldn’t be making these big decisions on their own. We agree.

For Facebook, the problem arises if the Oversight Board kicks the matter back to the company, in which case the ball is once again in its court and the eyes once again on its leaders. There’s also the thorny question of how Facebook might respond to the Oversight Board’s nonbinding policy recommendations.

But how should the Oversight Board handle the case? Quite apart from the inevitable political anger the board’s decision will provoke, what’s the right answer here?

The case actually presents two questions, one easy and the other impossible—and therefore also easy. Here are some thoughts about the proper resolution of this matter. 

According to the Oversight Board’s public account of the case, Facebook referred the following questions: 

  • Considering Facebook’s values, specifically its commitment to “Voice” and “Safety,” did it correctly decide on January 7, 2021, to prohibit Donald J. Trump’s access to posting content on Facebook and Instagram for an indefinite amount of time? 
  • Facebook also requested the Board’s observations or recommendations on suspensions when the user is a political leader.

Let’s start by considering the second question—which is the impossible one. Note first that this is not a request for an adjudication, but for an advisory opinion. Per the Oversight Board’s bylaws, it has the power to offer nonbinding policy recommendations to Facebook—and Facebook has to issue a response to the recommendations. No federal court in this country would entertain such a question, as it lacks a live case or controversy, though other countries’ judiciaries do entertain these kinds of hypotheticals. This is basically Facebook asking for a disquisition from the board on the hypothetical question of what to do if some political leader at some time in the future violates some unspecified rule in some equally unspecified set of factual circumstances. And while addressing such policy questions is theoretically part of the Oversight Board’s mandate, the question here is too hopelessly broad to be answerable. The request covers situations as variable as, say, Kim Jong Un posting pornography, on the one hand, or some candidate for office in Nebraska calling for violence against her opponent, on the other—not to mention those serial violations of the Digital Millennium Copyright Act by U.K. Prime Minister Boris Johnson.

There’s not much that the board can say in the abstract about such situations that would be useful. Yes, the board has given prospective guidance in the past, but not on an issue like this. The board can opine that the underlying policies need to be reasonable and clear. And it can intone sagely that the policies’ application to political engagement by leaders needs to be both cautious—so as not to unduly interfere with the political system of the jurisdiction in question—and firm and even-handed, so as not to enable the misconduct of political leaders by dint of their status. Beyond such pablum, though, things get really vague, really fast, and the first decision-maker regarding Facebook policies should be, well, Facebook.

So in my judgment, for whatever that’s worth, the proper response to the second question is minimal—either to kick the matter back to Facebook for the development of rules or to say something broad and general like this: Suspensions of political leaders should be undertaken with great care and only after due deliberation, and should depend on (a) the clarity of the violation, (b) the seriousness of the violation, (c) the harms caused or threatened by the violation, and (d) the potential impact of suspension on the political functioning of the jurisdiction in question. The former course would probably be more useful.

This brings us to the more tangible and conventionally justiciable question before the board: Was Facebook’s decision to suspend Trump proper under its policies and other relevant sources of legal guidance?

There will be a temptation to over-lawyer this question. The Oversight Board is made up disproportionately of lawyers and legal scholars, and such people like making rules. The board also has a lot of free speech scholars, journalists, and international human rights lawyers among its ranks—a fact that tends to put a free-expression thumb on the scale. 

Yet the relevant platform rules are actually simple, at least as applied to the facts of what Trump did, and they give rise to a pretty simple answer if one takes them seriously. Facebook’s Community Standards, as a preliminary matter, make clear that they “apply to everyone, all around the world, and to all types of content.” There is no carve-out for presidents trying to stay in office after losing an election or for narcissistic political leaders incredulous that they may have lost an election. Facebook also has a rule allowing it to waive other rules when political leaders do things that are newsworthy. It had clearly used this rule in the past to avoid enforcing its more general rules, but that forbearance ended in response to Trump’s posts on Jan. 6 because of the evident public safety concerns.

Facebook is also clear that “[t]he consequences for violating our Community Standards vary depending on the severity of the violation and the person’s history on the platform. For instance, we may warn someone for a first violation, but if they continue to violate our policies, we may restrict their ability to post on Facebook or disable their profile.” 

This is actually consistent with the way Facebook behaved in the period immediately preceding the posts that got Trump banned. As the Oversight Board summarizes the matter, Facebook responded to an initial Trump post on Jan. 6 by removing it “under its policy prohibiting praise, support, and representation of events that Facebook designates as ‘violating.’” It responded to a second post the same day by removing it “under the same Standard, but has not yet clarified the specific aspect of the policy that it applied.” It also “put in place a 24-hour ban on President Trump’s ability to post on Facebook or Instagram when it removed the second post.” The following day,

after further review of President Trump’s posts, his recent communications outside of Facebook, and additional information about the severity of the violence at the Capitol, Facebook “extend[ed] the block” it placed on his accounts “indefinitely and for at least the next two weeks until the peaceful transition of power is complete,” publicly citing President Trump’s “use of our platform to incite violent insurrection against a democratically elected government.”

As the board notes, Facebook was a little vague about what specific community standards one of the posts may have violated, and it was vague as well about what precise action it had taken. Does “indefinitely” mean a permanent ban? Does it mean until circumstances change? One could imagine the board using such ambiguity, as a court might, to decline to address the question—at least for now.

That said, the question is still an easy one: Holding the two posts at issue up against the community standards, it’s very hard not to see the violation. 

Here are Trump’s two posts as the board situates them:

Post 1: As rioters were still present in the Capitol and backup law enforcement personnel were en route, President Trump posted a one-minute video to Facebook and Instagram with the following content: “I know your pain, I know you’re hurt. We had an election that was stolen from us. It was a landslide election and everyone knows it, especially the other side. But you have to go home now. We have to have peace. We have to have law and order. We have to respect our great people in law and order. We don’t want anybody hurt. It’s a very tough period of time. There’s never been a time like this, where such a thing happened, where they could take it away from all of us – from me, from you, from our country. This was a fraudulent election, but we can’t play into the hands of these people. We have to have peace. So go home. We love you. You’re very special. You’ve seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home and go home in peace.”

Post 2: As police were securing the Capitol, President Trump posted a written statement on Facebook: “These are the things and events that happen when a sacred landslide election victory is so unceremoniously viciously stripped away from great patriots who have been badly unfairly treated for so long. Go home with love in peace. Remember this day forever!”

First, let’s consider Facebook’s community standard concerning Dangerous Individuals or Organizations, which reads: “In an effort to prevent and disrupt real-world harm, we do not allow any organisations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook. ... We also remove content that expresses support or praise for groups, leaders or individuals involved in these activities” (emphasis added). 

Previous Oversight Board decisions make clear that this policy covers organizations and individuals designated by Facebook as dangerous, so it is not obvious how precisely it applies here—though the Oversight Board summary makes clear that it formed part of the basis of Facebook’s action against Trump. One can, I suppose, quibble over whether Trump was engaged in violence or merely delighting in it. But there is no question that he “expresse[d] support or praise for groups, leaders or individuals involved in” violence when he posted on Facebook a video of himself saying to people who were at that time violently occupying the Capitol: “We love you. You’re very special. You’ve seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home and go home in peace.” Nor can there be any real question that he expressed support for the rioters when he declared: “These are the things and events that happen when a sacred landslide election victory is so unceremoniously viciously stripped away from great patriots who have been badly unfairly treated for so long.”

While Trump’s actions almost certainly don’t constitute incitement to violence or insurrection for criminal purposes, Facebook’s Community Standards don’t incorporate the Brandenburg test. Rather, the section entitled Violence and Incitement states that “[w]e aim to prevent potential offline harm that may be related to content on Facebook” and declares that “we remove language that incites or facilitates serious violence. We remove content, disable accounts and work with law enforcement when we believe that there is a genuine risk of physical harm or direct threats to public safety.” In this instance, “serious violence” was ongoing and Trump was using his account to justify it and to express support for those who were engaged in it. It just does not seem plausible for the Oversight Board to contend that Trump was not using the platform to “facilitate” violence, even if he was not in a technical legal sense using it to incite violence. An earlier Oversight Board case seemed to take an approach to incitement that incorporated an imminence standard of some sort (and another case also took on the question of “imminence”), so I suppose that could be a basis for skepticism as to Facebook’s action. But here the argument for an imminent relationship between the posts and violence is pretty strong. After all, violence was already ongoing.

Finally, Facebook’s Community Standards also forbid Coordinating Harm and Publicizing Crime: “we prohibit people from facilitating, organising, promoting or admitting to certain criminal or harmful activities targeted at people, businesses, property or animals” (emphasis added). When a political leader makes a video directed at people who are in the middle of committing a variety of crimes and tells them he loves them and that they are very special and justifies their actions by reiterating the lies that drive them, he is certainly “promoting” their criminal activity.

There is no serious question, in other words, that Trump violated the rules or that the rules contemplate actions like the ones Facebook took. For the board to hold otherwise would be to turn Facebook’s generic commitment to giving people a “voice” into some overriding protection that compels Facebook to allow a politician to stoke political violence.

That’s a greater speech protection than either the House of Representatives or the Senate was willing to give Trump. A bipartisan majority of both of those houses, after all, regarded his conduct on Jan. 6 as warranting impeachment and removal from office. Surely, the Oversight Board does not mean to argue that Facebook is somehow obliged to tolerate behavior the American political system regards as warranting a Senate trial. At a minimum, a decision to overrule Facebook’s judgment would require an explanation of why encouraging a mob in a fashion that might merit impeachment does not merit action by Facebook in the interests of public safety. 

There may be technical reasons, discussed above, for the Oversight Board not to affirmatively uphold Facebook’s action here, in which case the matter may drag on. But if the board overturns it, it will effectively be preventing Facebook from taking reasonable content-moderation action in emergency circumstances in which political violence is unfolding. Facebook, after all, did the bare minimum here—and it did it very late. It waited until a violent crisis erupted during the period of the presidential transition of power. It acted only when, quite literally, the functioning of the United States government had been disrupted by a violent incursion into the Capitol and the country’s president used the platform to justify the invasion. If Facebook’s action under these circumstances is overturned, the Oversight Board will be pushing Facebook in exactly the wrong direction.


Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books.
