Facebook Releases Civil Rights Audit Progress Report

Evelyn Douek
Monday, July 1, 2019, 5:20 PM

Facebook has released an update on its ongoing civil rights audit, illustrating the wide range of effects the company has on civil rights—from facilitating racially discriminatory ads for housing, employment and credit, to concerns about use of the platform to suppress participation in the 2020 U.S. election and census. The report is the second update by auditor Laura Murphy, a former director at the ACLU and a highly respected civil rights advocate, who was hired in May 2018 to lead the audit. With a third and final report due in 2020, this interim report details the steps the company has taken over the past six months to improve its civil rights impact. Murphy’s report says that Facebook has made “meaningful progress,” but it also shows just how much more work there is to do.

The report surveys Facebook’s efforts in four areas:

  • Improving its policies around hate and other harmful content, and the enforcement of those policies.
  • Overhauling its advertising systems to prevent discriminatory targeting of ads for housing, employment and credit, the subject of a settlement of multiple discrimination lawsuits in March 2019.
  • Protecting against misinformation and suppression efforts in the lead-up to the U.S. election and 2020 census.
  • Creating a more effective accountability structure for civil rights issues within the company, including a dedicated civil rights task force led by Chief Operating Officer Sheryl Sandberg.

While some civil rights leaders have called Facebook’s efforts “woefully inadequate,” Murphy strikes a more optimistic tone in her report, stating that the company has shown “a concrete commitment to further improvement.” Overall, the report shows that voluntary initiatives by companies to improve their human rights records can pay off. Broad consultation with the civil rights community highlighted Facebook’s blind spots in the four areas above, and the company’s cooperation meant that the auditors were able to pair those critiques with detailed recommendations grounded in the reality of Facebook’s operations.

For example, auditors were able to investigate a sample of instances where the company had incorrectly taken down content as “hate speech” and identify specific measures Facebook could take to reduce error rates. Auditors note that their investigations show the very real trade-offs between consistency of decision-making and accounting for nuance and context in policies around difficult topics such as hate speech. But because of the auditors’ firsthand experience, the report offers concrete recommendations to help lessen these tensions. In particular, it suggests improving the interface used by content reviewers to examine posts so that more context will be readily apparent; providing specific training to reviewers who moderate hate speech; and requiring reviewers to answer questions about content they are reviewing before they decide whether the post should be taken down so that decisions are more reasoned. (Currently this is done in the reverse order.)

That Facebook had not taken these relatively simple measures before external review suggests fundamental weaknesses in how the company considers its impact on human rights. The creation of a separate team accountable for those impacts is therefore an important corrective.

The report also has broader recommendations for how Facebook can further improve its effect on marginalized communities, including broadening its bans on hate speech to include implicit and not only explicit praise for white nationalism and white separatism; removing its exception for humor (which the report says “is not standard, objective, or clearly defined”); and allowing for bulk reporting of harassment in instances when a user is “flooded” by posts. It also publicly documents Facebook’s commitments to prevent manipulation of the 2020 election and census, which the report explicitly notes must be understood in the context of the “long history in this country” of voter suppression and intimidation. By setting these benchmarks for assessing the company’s performance, this report ensures that 2020 will be, in Sandberg’s words, a “big year” for Facebook—as well as for the people affected by its operations.


Evelyn Douek is an Assistant Professor of Law at Stanford Law School and Senior Research Fellow at the Knight First Amendment Institute at Columbia University. She holds a doctorate from Harvard Law School on the topic of private and public regulation of online speech. Prior to attending HLS, Evelyn was an Associate (clerk) to the Honourable Chief Justice Susan Kiefel of the High Court of Australia. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
