It's Not Over. The Oversight Board’s Trump Decision Is Just the Start.

Evelyn Douek
Wednesday, May 5, 2021, 3:11 PM

What do the details of today's decision reveal about Facebook’s rules, and the FOB’s role in reviewing them?

Former President Donald Trump talks on the phone while aboard Air Force One in 2017. (Official White House photo by Shealah Craighead)

The long international nightmare is not over. By now, you will have read that on Wednesday the Facebook Oversight Board (FOB) upheld Facebook’s Jan. 7 restriction on former President Donald Trump’s account, largely on the basis of the ongoing violence at the time of the posts that led to the ban. But the FOB did not settle the matter once and for all: It punted back to Facebook the question of what to do with the account now. It dinged Facebook’s “indefinite” ban as a “vague, standardless penalty”—“indefinite,” according to the FOB, is very much not synonymous with “permanent.” Now, Facebook has six months to conduct a review of what to do with Trump’s account.

The decision is meaty and educational. It contains a number of recommendations that, if Facebook follows them, will significantly improve the clarity and mitigate the arbitrariness of Facebook’s decision-making. It is also an attempt to split the baby—not letting Trump back on, but also not demanding a permanent ban of his account—and to avoid the inevitable controversy that would have attended any final decision. A Pew Research Center poll released today found that Americans were essentially evenly split on whether Trump should be allowed back on social media. Any definitive decision would have made lots of people very unhappy. But in trying to preserve its own legitimacy, the FOB’s decision here has just deferred the controversy and doomed the public to more unending Trump-focused news cycles for up to another six months, and potentially longer.

There will be plenty of hot takes about the decision. This post is not another one. I focus on the details of the decision and what they reveal about Facebook’s rules, and the FOB’s role in reviewing them. There’s lots to like in the FOB’s decision, and it demonstrates why the whole experiment holds promise. It also shows that the FOB continues to rely too heavily on a state-based conception of governance and to ignore the significant ways in which its role is different from that of a regular court. Its job was to constrain Facebook’s discretion, and it pretty comprehensively refused.

The Substance of the FOB’s Decision Regarding Trump’s Account

The central part of the FOB’s decision turned on what the FOB views as the completely arbitrary nature of Facebook’s punishment for Trump. Repeatedly throughout the decision, the FOB admonishes Facebook for lack of clarity in its rules. It writes that the case “highlights further deficiencies in Facebook’s policies.” In a sentence tucked right at the end of the decision, the FOB notes that this lack of transparency means that there is not “adequate guidance to regulate Facebook’s exercise of discretion.” That tiny sentence gets to the core of why the FOB is important.

As I wrote at the time, the central problem with Facebook’s decision was not the substance of it, but that it appeared politically and strategically convenient in a moment when platforms all across the internet were taking action against Trump-related accounts. Mark Zuckerberg was the ultimate arbiter of the decision, and he could have made it on the basis of little more than a coin toss given the lack of any “clear, published procedure” guiding him. 

The whole ambition of the FOB exercise is that it will constrain the completely unaccountable power that Facebook currently exercises over what the FOB calls, correctly, “a virtually indispensable medium for political discourse, and especially so in election periods.” The FOB’s decision calls for Facebook to do better, but it’s ultimately too cautious an intervention. 

The FOB’s focus in the opinion was on the “indefinite” nature of the suspension. In this case, the “indefinite” suspension was sui generis. As I wrote on Jan. 11 when I argued that Facebook should refer Trump’s suspension to the FOB, such a suspension is

an unusual remedy—I know of no case where Facebook has taken this approach of banning an account for an unspecified length of time before—and one that creates a great deal of uncertainty. Whether Trump will get his account back, in other words, appears to depend on how Zuckerberg feels after Biden’s inauguration. The Oversight Board might have thoughts about whether this is a legitimate way to resolve the years of sparring between Facebook and its most famous troll.

The FOB did indeed have thoughts! It stated that Facebook must review the matter and come up with a more proportionate response within six months.

The Minority View

FOB decisions will sometimes note when a minority of panel members dissent from any particular part of the decision. That happened here, where a minority of the FOB would have gone further. The minority would have found that Trump had violated more Community Standards than Facebook determined he had. The minority also would have emphasized the importance of “Dignity” and how Trump’s previous posts undermined it by contributing to racial tension and exclusion (including the infamous “when the looting starts, the shooting starts” post and multiple posts referencing the “China Virus”).

This is worth pausing on. One of the dominant criticisms of the FOB’s decisions so far, which have primarily ordered Facebook to reinstate posts that it had removed, is that the FOB seems overly speech protective. Some assumed for this reason that the FOB would order Trump’s reinstatement to the platform. It didn’t, and a minority expressed even more speech-restrictive views than the majority. The sample size is still small, so making generalizations remains unwise, but this does suggest that the FOB is less speech-absolutist than some critics feared.

The Educational Value of the Decision

One of the main promises of the FOB experiment, if it works, is that it helps dig up pieces of information about the otherwise almost entirely opaque Facebook content moderation system. The FOB’s decisions have fulfilled this promise so far, and the Trump decision is no exception. I follow this space closely, and I still learned things from the decision.

Some of the more important things we did not know before but which the FOB surfaced in the decision:

  • Facebook applies a “cross check” system to some “high profile” accounts to “minimize the risk of errors in enforcement.” This means that “decision-making processes are different for some ‘high profile’ users.” All the relevant terms here are left undefined, but these appear to be euphemisms that suggest that Facebook has an escalation process for decisions that might prove controversial. This is not exactly news: Leaks have suggested as much before. Confirming this process, however, gets at one of the most problematic parts of Facebook’s approach to content moderation: Who does the “cross-check”? Do officials in the public policy department intervene for such high-profile decisions? In my view, there should be a wall between rule enforcement and public policy within the company, so that political or business considerations do not unduly determine individual outcomes. The decision—and Facebook’s obligation to respond to the policy recommendations it contains—offers Facebook the opportunity to clarify what this process involves.
  • Facebook “never applied the newsworthiness allowance to content posted by the Trump Facebook page or Instagram account.” This is quite extraordinary. Facebook’s treatment of public figures has been one of its most controversial policies. The platform treats speech from politicians as inherently newsworthy and therefore considers it in the public interest for that speech to be seen, even if it breaks the platform’s community standards. It had regularly gestured at this policy when people called for Trump’s deplatforming or the removal of individual posts, and referenced it in the referral of the Trump case to the FOB. But when pushed to provide details, it whiffed.
  • Despite this newsworthiness policy, Facebook had found that Trump violated its rules five times before Jan. 6, three of them within the past year. There were also 20 pieces of content posted by Trump that were flagged by content reviewers or automation as violating Facebook’s Community Standards but were “ultimately determined,” presumably as part of the “cross-check” process, not to be violations. The exact posts are not specified. Who ultimately overruled the initial decision-makers?

But Facebook wasn’t always so generous in its responses. The FOB asked Facebook 46 questions as part of its consideration process, and Facebook declined to answer seven entirely and two partially—Facebook characterized responses on those questions as “not reasonably required,” “not technically feasible to provide,” or protected for other reasons, like legal obligations. It’s impossible to tell if Facebook was declining the FOB’s requests in good faith or to protect itself. The questions it rebuffed included extremely important matters about how the News Feed impacted the visibility of Trump’s content (was it amplifying it, for example?), and whether Facebook had researched design decisions that contributed to the events of Jan. 6. As the Knight First Amendment Institute explained in its submission to the FOB, these design decisions are far more significant in determining Facebook’s impact and responsibility than its treatment of any one account.

Facebook also declined to answer how other political leaders had been treated on its platform to date, or how much contact it has with governments. The relationship between Facebook and governments remains murky.

Some Lines the FOB Did Draw

Facebook wanted the FOB to suggest what its policy for heads of state should be, but the FOB ducked that big question. It did, however, use its nonbinding policy recommendations to draw some lines and settle some important questions that have been the source of ongoing controversy: 

  • All users should be held to the same content policies. Political leaders should not get special treatment.
  • But there are some unique factors that must be considered in assessing the speech of political leaders. The FOB noted (drawing on the Rabat Plan of Action developed by experts supported by the U.N., which Facebook also referred to) that the reach of the account and the status of the speaker mean that, if anything, political leaders have a high degree of influence and should be held to a higher standard. The FOB also recommended that Facebook prioritize review of posts by highly influential users (not confined to government actors), given their capacity to reach more people more quickly with more impact (although this recommendation is nonbinding). Many experts have been calling for this. The Election Integrity Partnership, a coalition of experts who are the leading voices on social media’s role in the election, found that repeat, influential accounts were especially important in spreading false claims about the election.
  • The FOB endorsed Facebook’s attention to off-platform context in evaluating the meaning of the posts on its platform.
  • The proportionality of Facebook’s response to violations of its rules will continue to be a key guiding principle—indeed, “the crucial question”—in the FOB’s decisions, emphasizing the importance of speech and the need to use the least restrictive means of limiting it.
  • The FOB also firmly rejected the submission made on Trump’s behalf that the FOB should defer to American law in its decisions, stating that the First Amendment does not govern private companies.

Things the FOB Punted

The FOB steadfastly refused to give Facebook any concrete guidance on what it should do going forward, however. It left many, many questions unanswered and ambiguous. The FOB essentially left Facebook entirely on its own to work out the contours of its ongoing policy.

The FOB does not tell Facebook what it should look at or how far it should go when looking at off-platform context to determine the meaning of speech on its platform. Is mainstream media enough? Mainstream social media? Does it need to look at Twitter, 8chan, political leaders’ personal blogs (newly relevant)?

And it also doesn’t clear up the thorny question of what the FOB will do when Facebook’s rules conflict with international human rights law (IHRL). The FOB continues its emphasis on IHRL as one of its guiding principles—in fact, this decision goes farther than any prior one in doing so. The decision’s references to Facebook’s own rules and values are cursory, and the bulk of the analysis concerns IHRL authorities. This suggests, though it does not definitively establish, that when the FOB finds Facebook’s own rules conflict with IHRL, it will apply the latter and overrule Facebook’s rules. This is not entirely surprising, but it’s worth stopping to note how weird it is that a body set up by Facebook is providing the most extensive analysis of how IHRL applies to content moderation and overruling the very company that set it up.

But the FOB is still refusing to do the hard work of explaining what it actually means in this context to apply IHRL to a private company. I have written about how IHRL is too vague and abstract to resolve many, if not most, concrete content moderation cases. Furthermore, IHRL is written for states. Facebook, whatever its power, is not a state. How IHRL needs to be adapted to this entirely different context is the hardest question, and one that the FOB notes in passing here (as it has once before) but doesn’t tackle head-on.

Apart from that, the FOB also does not answer whether it can ever be a “proportionate” remedy to permanently suspend a user’s account. Its emphasis on the need for remedies to be proportionate suggests it would not think so except in the most extreme circumstances, but it shied away from coming out and saying this directly.

The FOB also punted on yet another important issue—indeed, to my mind the most important issue. Facebook asked the FOB two questions. The first, the subject of all the media attention, was what to do with President Trump’s account. The second, which is far more important for the future of politics both on- and off-line around the world, was what to do with political leaders’ accounts more generally. The FOB almost entirely ducked this second question, not once referencing any other world leaders’ accounts or outlining any specific policy guidance.

The minority argued that the FOB should have been more specific in the guidance it provided Facebook. It stated that “it is important to outline some minimum criteria that reflect the Board’s assessment of Facebook’s human rights responsibilities.” The minority is right. Many have praised the FOB’s call to put the onus entirely back on Facebook, but I think it’s a mistake, and will explain why below. But first, what happens next?

What to Watch in 30 Days

While the press circus has focused on today’s decision, the two most important moments in this process have yet to come. The first is in 30 days, on June 4, the date by which Facebook has committed under the FOB’s bylaws to respond to the FOB’s nonbinding policy recommendations. The second is the point, sometime within the next six months, when Facebook announces its new account suspension policy and its final decision on Trump’s account.

When the FOB released its first set of decisions back in January, there was a flurry of attention on the day and then essentially no coverage of Facebook’s responses 30 days later. Ignoring Facebook’s responses is a mistake. How Facebook engages with the FOB’s nonbinding but more sweeping recommendations is far more consequential than what happens to one post or one account.

There are some extremely important and extremely broad policy recommendations in this decision. Beyond requiring more transparency and specificity on policies regarding account suspensions and newsworthiness, the FOB’s decision makes operational and information-forcing demands.

It calls for more prioritization and resources dedicated to moderating influential accounts. It calls for greater explanation of the alluring and mysterious “cross-check review” policy and, to my utter delight, publication of the comparative error rates that result from this process. It calls for transparency reporting to be expanded to include decisions on profiles, pages, and account restrictions, broken down by country, rather than just information on individual content decisions. All of this is long overdue from Facebook; it shouldn’t have required the FOB’s prodding, but that it did vindicates the FOB’s value-add.

The FOB calls for Facebook to make its policies on strikes and penalties transparent. This is also long overdue. Making such policies transparent is important not only for due process; it can also help foster a healthier environment by steering users toward compliance when they know what they are doing wrong.

Facebook should also, the FOB says, make clear “how it collects, preserves and, where appropriate, shares information to assist in investigation and potential prosecution” of violations of the law. Interestingly, especially given the important role Facebook posts have played in the domestic prosecutions of people involved in the events of Jan. 6, the FOB confines this to violations of international law. The FOB suggests that Facebook should also publish a policy on what it will do in crisis events where its regular processes are inadequate.

The FOB also calls for “a comprehensive review of Facebook’s potential contribution to the narrative of electoral fraud and the exacerbated tensions.” BuzzFeed has reported that Facebook has done something along these lines internally, but the FOB calls for this to be an “open reflection.” 

The FOB has called for a comprehensive review like this before in one of its first decisions, asking for a specific transparency report on Facebook’s enforcement practices during the pandemic. Facebook “committed to action” in response. But as I explained at the time, this was misleading. Its commitment was to continue sharing the details it had been releasing, which the FOB had looked at and found inadequate. 

Still, Facebook has embraced many of the FOB’s nonbinding recommendations in other decisions. In 30 days we’ll find out if Facebook will continue to do so, or if it’s beginning to regret setting up a body that is calling for much broader action than its deliberately limited remit would suggest.

The FOB’s Broader Role, and the Ongoing Dance

The FOB was clearly aware of the high-profile nature of this decision and the fact that it probably represented its best opportunity to explain how it views its role in the Facebook governance system. It stated that it is Facebook’s role to create its policies, and that the FOB views its own role as only reviewing whether Facebook’s decisions are consistent with those policies, its values, and its IHRL commitments. Its role, says the FOB, is to call balls and strikes.

This is obviously akin to how a court might apply judicial review in a state-based constitutional system or, even more specifically, the American system. But Facebook is not a state, and the FOB is not a court. The FOB stated that “Facebook seeks to avoid its responsibilities” by applying an indeterminate and standardless penalty and then chucking it to the FOB to deal with. But arguably, setting up the FOB was Facebook’s attempt to fulfill its responsibilities. The FOB was always intended to provide nonbinding policy recommendations. Leaving policy decisions to the legislative and executive branches in a state context makes sense because they are the democratically accountable arms of the government. In theory at least, they will bear the political costs of any decision. There is no such accountability mechanism in the Facebook ecosystem and, despite years of controversy, Facebook continues to experience explosive growth. The FOB is necessary as an external review body precisely because no other accountability mechanism—commercial, regulatory or otherwise—exists or will in the near future.

Sure, the indefinite suspension of Trump’s account was standardless, but the “indefinite” nature of the suspension was in part a direct product of the referral of the decision to the FOB. Facebook was holding the account in abeyance while awaiting the FOB’s decision. As the FOB itself acknowledged, the aftermath of Jan. 6 was effectively an emergency and required Facebook to act quickly. While the FOB is right to call on Facebook to produce rules governing what it will do in similar situations in the future, it’s hard to blame Facebook for not going through a full policy development process at the time. But instead of anyone taking ownership of the issue now that the immediate emergency has passed, the FOB and Facebook keep throwing the content moderation hot potato back and forth.

Lawyers will be tempted (and indeed already have started) to see echoes of Marbury v. Madison, in which the Supreme Court famously disavowed jurisdiction to answer the question in the particular case before it and, in doing so, claimed for itself the much greater power to review and strike down pieces of legislation. They should not succumb to the comparison. And if the FOB was trying to emulate Marbury, it was wrong to do so. The reason Marbury was so momentous was that it was not clear whether the Constitution had given the court the power of judicial review. That is not the case with the FOB. The FOB was set up entirely to decide the hard cases. When I argued that Facebook should refer the decision to suspend Trump’s account to the FOB, I did so because reviewing Facebook’s most difficult decisions is literally the FOB’s job description. The FOB itself agreed until now: When the FOB was first announced, the co-chairs wrote in the New York Times that the FOB will “make final and binding decisions.” This is its entire purpose and the reason it was created.

The formalistic take on this decision is that the FOB truly believes its role is this more limited one. The realist take is that the FOB was trying to avoid controversy and prioritize its own legitimacy. Legitimacy is built over time, and the importance of this one decision should not be overstated, but it’s also obviously true that this decision will attract more attention than any other (except, perhaps, one within the next six months). The temptation for board members to shield the FOB from the worst of the fallout of the ongoing Facebook/Trump debacle is obvious. But the FOB members get paid six figures for a reason, and that reason is in part that they signed up to be very unpopular. Their role is to constrain Facebook’s, and Mark Zuckerberg’s, discretion. The FOB has declined to do that almost entirely and did not even provide meaningful parameters for the policies it calls on Facebook to develop.

Perhaps the approach counseled by the realist take is right, and the young and fragile FOB needed time before subjecting itself to too much scrutiny. If so, is six months enough time? Because that’s likely all it has: Within six months, Facebook will develop a policy, there will be controversy about its application, and—lo and behold—we may well end up at the FOB yet again.


Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
