
Facebook’s Role in Myanmar: New Report Puts the Company on Notice

Evelyn Douek
Wednesday, November 7, 2018, 10:49 AM

Most Americans were likely distracted by the looming midterm election on the evening of Nov. 5, when Facebook published a lengthy report assessing the effect of the company’s presence in Myanmar. The “Human Rights Impact Assessment” (HRIA), funded by Facebook but conducted independently by the nonprofit organization Business for Social Responsibility (BSR), is a relatively measured and even-handed assessment of the company’s role in protecting and upholding human rights in the Southeast Asian nation. It does not dwell on the past or on the company’s slow response to the human rights catastrophe that has developed in the country over the past few years. But by providing concrete recommendations and clear forewarning of upcoming pressure points, the report leaves no room for doubt that Facebook has a lot of work to do.

Facebook’s release of the report has largely been portrayed as an admission of its culpability in the ongoing genocide being carried out against Myanmar’s Rohingya minority. But this is not news—the company has for some time acknowledged that it had been too slow to take steps to help prevent the “horrific” violence in Myanmar. Facebook’s release of the report unredacted and in full—albeit on the eve of the U.S. midterm elections, when media focus was elsewhere—is one step toward realizing what Chief Operating Officer Sheryl Sandberg agreed was the company’s moral and legal obligation to prevent the use of its platform to incite violence.

The relatively high-level report:

  • emphasizes the positive aspects of Facebook’s role in Myanmar, and somewhat downplays Facebook’s responsibility for its adverse human rights impacts;
  • stresses the importance of context in determining how best to respect human rights, especially where rights come into conflict (such as the right to security and the right to freedom of expression);
  • highlights the difficult trade-offs for Facebook in responding to challenges, such as the potential operating difficulties and backlash it might face after its recent decision to remove a number of pages and accounts associated with the Myanmar military; and
  • foreshadows key upcoming challenges for Facebook, including Myanmar’s upcoming 2020 elections and the possibility of increased WhatsApp use in the country.

What is a Human Rights Impact Assessment?

Such assessments are a part of the “human rights due diligence” called for by the U.N. Guiding Principles on Business and Human Rights (UNGPs), and are used to identify effects on human rights in which a company may be involved and the steps it can take to prevent or mitigate harm. The UNGPs are nonbinding and do not impose legal obligations on companies. Nevertheless, they are the only official guidance that the U.N. Human Rights Council has issued regarding the human rights responsibilities of companies and remain the global authoritative standard in the area.

Because Facebook is not bound by international human rights treaties, and because—as the report notes—the local laws in Myanmar do “not reflect universal principles of rule of law” and fail “to fully meet international standards” for human rights, there is no legal mechanism to hold the company accountable for its human rights impact. The UNGPs suggest, however, that companies will be subject to the “court of public opinion” if they fail to uphold human rights—which is exactly what has happened in this case. Facebook commissioned BSR to conduct the assessment after sustained criticism for its role in Myanmar. U.N. investigators stated that Facebook has played a “determining role” in the violence in Myanmar, and that, by spreading hate speech and posts inciting violence, the company had “turned into a beast.”

Ideally, because the goal is to prevent adverse human rights impacts, human rights impact assessments should be undertaken as early as possible in the development of a new business activity—in Facebook’s case, when it enters a country. In this respect, the assessment of Facebook’s activity in Myanmar obviously falls short of that ideal. It is relevant and important, then, that BSR calls on Facebook to undertake similar assessments in other high-risk markets now. Facebook currently faces criticism for its impact in many other countries, including the Philippines, Sri Lanka, and India. BSR notes that Facebook has already initiated several other assessments, although it is unclear whether these will also be made public.

The HRIA’s Sympathetic Tone

The recent report by the U.N.-commissioned independent Fact-Finding Mission on Myanmar found that there is “no doubt that the prevalence of hate speech in Myanmar significantly contributed to increased tension and a climate in which individuals and groups may become more receptive to incitement and calls for violence” and that “[t]he role of social media [was] significant” in that development. In contrast, the BSR assessment is much more measured. The report repeatedly emphasizes the “democratizing” effect of Facebook in Myanmar and the way the platform’s presence has “substantially increased opportunities for freedom of expression.” It suggests that Facebook simply relies on certain “legal, political, and cultural assumptions” that are not appropriate in the legal context of Myanmar, which functionally lacks the rule of law. It even quotes an anonymous stakeholder who says that “Facebook isn’t the problem; the context is the problem.” It highlights that the “actual relationship between content posted on Facebook and offline harm is not fully understood” and that only a “minority” of users seek to use Facebook for malicious ends.

While this may be true, it will no doubt frustrate many observers, who will point to the fact that Facebook’s role in enabling violence was highly foreseeable—and in fact foreseen—many years ago. The assessment also states that Facebook “does not cause or contribute” to the risks caused by hate speech and misinformation being spread on its platform, although it is “directly linked” to them, and that “Facebook’s link to human rights violations in Myanmar should not be overestimated.” This seems to overlook any impact that Facebook’s News Feed algorithm might have on the spread of such content. Adam Mosseri, the former head of the Facebook team that manages the News Feed algorithm, has previously acknowledged the impact that the company’s “incentives” around click-bait and sensational headlines can have in spreading hatred and propaganda.

In this regard, the assessment strikes a much more sympathetic tone than recent coverage of Facebook’s problems in the media and civil society. The report nevertheless makes 29 recommendations for specific steps Facebook can take to better respect human rights in Myanmar—including greater transparency, a stricter interpretation of its credible-violence policy as it relates to misinformation, the dissemination of “counter hate speech,” and the preservation of evidence of human rights abuses.

The Importance of Context

The assessment emphasizes the importance of appreciating local context in assessing how a company should approach its human rights obligations. The right to life, liberty and security of the person may be infringed as a result of hate speech and misinformation intended to incite violence. But the right to freedom of expression is also affected by the overbroad removal of content. The report describes the importance of local consultation in deciding how best to balance these sometimes competing obligations and shows that different communities may weight these priorities differently. In Myanmar, “most (but not all) local stakeholders are more concerned about security risks to rightsholders than they are about overbroad restrictions on content.” One anonymous stakeholder in Myanmar is quoted as saying, “Compared to many in the international community we are less concerned about restrictions to freedom of expression because our proximity to offline harm is much greater.” Another explains that because of the prevalence of bad actors on the platform and the self-censorship of vulnerable populations, Facebook “is no longer a marketplace of ideas.” These quotes vividly illustrate why Facebook might need to make different choices about how best to respect human rights in a context like Myanmar than in one like the U.S., which prides itself on its tradition of robust public discourse.

Difficult Trade-Offs

The assessment describes the challenges of implementing Facebook’s content moderation policies as “of a nature and scale never previously addressed by companies or governments.” In particular, it references Facebook’s decision earlier this year to remove a number of senior officials in the Myanmar military from the platform, which it did in response to reporting from the New York Times that revealed the extent of the military’s covert influence operations. While the report calls this a “bold move” by the company that could be a significant blow to those committing serious human rights violations, it also notes that the decision increased the risk of retaliatory moves against Facebook by Myanmar’s government. The move might also prevent Facebook from setting up a local presence in Myanmar—which many in civil society have called for as necessary for Facebook to understand the local context—due to concerns for staff safety, according to the report. These considerations only compound the fraught question of what Facebook should do when state officials breach its rules in communicating with their own constituents.

Key Upcoming Tests

The assessment states that the WhatsApp messenger app currently has a far smaller user base than other Facebook products in Myanmar, but that Facebook should prepare for the possibility that it will become more widely used. WhatsApp has been blamed for the spread of hate speech and misinformation in countries such as Brazil and India, particularly because the encryption built into the app makes it more difficult to act against harmful content distributed on the platform. The assessment’s warning of potential harms is therefore an urgent call to action and preparedness.

Similarly, the report highlights that Myanmar’s 2020 elections for the 75 percent of parliamentary seats not constitutionally reserved for military appointees are likely to cause an escalation of manipulative social media activity. It calls on Facebook to develop a risk-mitigation plan now and to ensure that, when the time comes, the platform is not distracted by the concurrent U.S. presidential election.

Facebook has already been taking steps to clean up its platform in Myanmar and has acknowledged that it needs to do more. The new assessment may be dissatisfying for those who hoped for a substantive reckoning with Facebook’s role in the past and ongoing human rights atrocities committed in the country. But at the very least, it clearly maps out the imminent challenges that Facebook will face. There will be no question this time around about whether the risks of the platform could have been foreseen.


Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
