Facebook's Oversight Board: the ambitious experiment in content moderation

8/23/2021
In November 2018, Facebook CEO Mark Zuckerberg announced that he would create an independent body to review decisions made by Facebook and Instagram content moderators.

The Oversight Board, or more colloquially the "Supreme Court of Facebook", took shape in May 2020, when the names of its first members were announced, and began accepting cases later that year.

It is a body of twenty free speech experts drawn from different professions, including academics, journalists and human rights activists. Its main task is to review content moderation cases appealed by users and to uphold or overturn the decisions Facebook made. In addition, the body has the power to make policy recommendations to the company on the issues it handles.

In little more than a year, the Board has handled significant cases concerning this social network's community standards and the way they are applied. Matters as sensitive as the suspension of former President Donald Trump's account in January 2021 have landed in its hands. Following the Board's recommendations in that case, Facebook changed the way it handles the speech of political leaders.

Although the cases reviewed by the Board are dominated by political and social conflicts around the world, some decisions have changed the landscape of what can and cannot be said on Facebook in more everyday contexts.

The level of detail with which the Board examines community standards, user intent and the context of posts has given a new dimension to content moderation on Facebook.

We address some of the decisions below.

A review of Christmas traditions

In December 2020, a user in the Netherlands posted a video featuring two white people with their faces painted black, dressed as Zwarte Piet (Black Pete), a character from that country's Christmas tradition.

Facebook removed the post for violating its hate speech policy. In reviewing the case, the Board took into account that Zwarte Piet is part of a cultural tradition that many Dutch people share without racist intent. However, painting one's face black is a form of caricature linked to racist stereotypes, something Facebook's own rules have explicitly prohibited since August 2020. For that reason, and because it considered that allowing content of that nature could create a discriminatory environment for Black people, the Board upheld the original decision to remove the post.

Covid-19 disinformation vs. criticism of the government

In October 2020, a French user posted a video accompanied by a text criticizing the French health authorities for refusing to recommend hydroxychloroquine and azithromycin as treatments for Covid-19. According to the post, these drugs were saving lives in other countries, while remdesivir, the drug endorsed by the French agency, was ineffective.

Facebook removed the post for violating its policy on violence and criminal behavior, which prohibits sharing misinformation that could provoke violence or physical harm. On reviewing the case, however, the Board determined that the user's intention had been to criticize a government position, not to promote drugs, which in any case are not available over the counter in France. It therefore overturned Facebook's decision and asked for the content to be restored.

When systems see nudity without context

In October 2020, a user in Brazil posted on Instagram a series of images showing exposed nipples as part of a breast cancer awareness campaign. Automated detection systems removed the post for allegedly violating the adult nudity and sexual activity policy.

The platform reinstated the content before the case was decided, but the Board considered it important to review anyway, because the error highlighted the company's failures in human oversight of its moderation processes. Consequently, among other things, it recommended that Facebook improve its automated image detection, ensure users are told why their content was removed, and clarify that female nipples are allowed in the context of such campaigns.

In principle, any user who disagrees with a Facebook decision can appeal to the Board once they exhaust the platform's internal process. In practice, however, the Board selects only a handful of cases for review: in the time it has been operating, it has received more than half a million appeals and has ruled fourteen times.

Facebook can also refer cases to the Board when it faces difficult content moderation decisions, but as with user appeals, the acceptance rate is low. According to the company's first quarterly report on the Oversight Board, of the 26 cases Facebook referred through March 31, 2021, just 3 were taken up.

Still, the pronouncements of this "Supreme Court", like those of the high courts of any country, set precedents: Facebook is expected to resolve similar cases in the future under the guidelines laid out in the Board's decisions.
