Deteriorating public debate and regulation: Inter-American Commission shows its cards for Internet governance

7/22/2024

This week, the Inter-American Commission on Human Rights (IACHR) joined the institutional and civil society actors that have recently proposed ways to regulate internet platforms in the region. At the request of the Organization of American States, the IACHR's Office of the Special Rapporteur for Freedom of Expression prepared a report that addresses the deterioration of public debate, universal access and digital literacy, and the governance of online content.

The Special Rapporteurship's contribution is based on a consultation with various actors and on inter-American human rights doctrine and precedent. It is undoubtedly a relevant input, both because of the institutional weight of the IACHR and because of the need to guide the bills that, from different perspectives, are currently under discussion in Brazil, Colombia, Mexico and Argentina, among other countries.

The report diagnoses the role of platforms in the formation of public opinion and their impact on freedom of expression. According to the Office of the Special Rapporteur, the amplification of extremist speech, discriminatory content and attacks against social leaders, the media and civil society has resulted in inequality and censorship.

The Office of the Special Rapporteur is aware that some moderation practices may end up limiting protected speech, but also that intervention is necessary to prevent hate speech and digital violence from leading to self-censorship and the exclusion of minorities. In these spaces, left at the mercy of commercial logic and technological interventions (through algorithms and Artificial Intelligence systems), public discussion has deteriorated. However, this is a scenario of eroding trust in which the role of platforms is only part of the equation.

The report draws attention to the abusive use of social networks by some political actors and its negative effects, with practices such as coordinated actions and inauthentic influence campaigns. As examples, it cites political violence in Mexico in 2021, the scale of social polarization in Peru after the 2021 elections, and the insurrection in Brasilia in January 2023. "All these events share a common element, and that is that they were preceded by disinformation campaigns and escalated by discourses that condoned the use of violence against people and institutions of the rule of law," the document argues.

The Office of the Special Rapporteur suggests that political parties should encourage quality public debate and take measures to avoid this type of behavior, which affects the legitimacy of institutions: "It is crucial that political parties refrain from promoting, by themselves or through third parties, campaigns of disinformation, discrimination, hatred and intolerance."

Between urgent solutions and byzantine discussions

These concerns have led to regulatory proposals that turn out to be far more harmful than the problems they are meant to solve. This is the case with initiatives that, for example, seek to combat disinformation but end up endorsing censorship mechanisms. For the Office of the Special Rapporteur, rushed solutions of this type overlook the complexity of digital challenges and undermine the possibility of building regional agreements.

Regarding content moderation, the organization's recommendations point to regulation that respects, as far as possible, the freedom of expression guarantees developed in the Inter-American system. A legitimate and well-intentioned premise, but one that quickly runs up against practical questions.

On the one hand, the Rapporteurship rightly emphasizes the need to strengthen transparency and accountability. The report suggests that content moderation should provide adequate notice to users affected by sanctions, offer sufficient appeal mechanisms and provide effective remedies for those who have been unfairly sanctioned. These remedies can range from the reversal of sanctions to public apologies or compensation. In the words of the Rapporteurship, "the responsibility of intermediaries must be aligned with respect for due process of law".

On the same point, the report highlights the need for platforms' community standards to be available in all languages, including indigenous languages, and to be accessible to people with disabilities. It also suggests that platforms publish reports on high-impact events, such as protests, elections or armed conflicts, as well as information on government requests to remove content.

On the other hand, the Rapporteurship insists on making no concessions in protecting freedom of expression under Inter-American standards: "content that is manifestly illegal and violates human rights may be removed by intermediaries under strict conditions that respect the principles of legality, necessity and proportionality".

In attempting to adapt this Inter-American standard, known as the tripartite test, to content moderation, the Rapporteurship fails to address and clarify the practical dilemmas faced by digital platforms. Under this approach, intermediaries would have practically no room to govern their own spaces and combat phenomena such as coordinated operations, unwanted interactions, harassment, spam and even explicit content.

Without dismissing the progress noted above, this position of the Rapporteurship rests on equating certain moderation mechanisms with censorship, even though the global debate has added nuance to that assessment. Early attempts to draw analogies between this kind of intervention in public debate and judicial mechanisms have proven incompatible with the reality of social networks, given the sheer volume of content circulating online.

The academic Evelyn Douek has proposed leaving this analogy behind and conceiving of moderation as a project of free expression at scale, grounded more in systemic decisions and ex ante mechanisms than in individual cases. This vision is similar to that of Catalina Botero, a member of Meta's Oversight Board, who speaks of focusing on the aggregation and amplification of problematic content.

"Content moderation inevitably involves trade-offs between competing rights and interests," argues academic Barrie Sander."

The biggest challenge, however, lies in translating general human rights principles into specific rules, processes and procedures, tailored to the context of platform moderation."

Dylan Moses, a researcher at the Integrity Institute, has described the differences between moderation and censorship in terms of their objectives and their consequences for users. For example, while censorship seeks to limit speech that challenges official discourse, moderation, as the Commission itself puts it, is a tool to minimize risks, especially for marginalized populations.

For Javier Pallero, activist and expert in human rights and technology, beyond these questions the report makes a great contribution and advances the conversation: "Perhaps it would be interesting to study how we come to define these concepts and establish agreements in more precise ways, because so far we are relying on concepts that are institutional, that respond to institutions and historical elements with their own baggage, and that are appropriate in some respects and inappropriate or insufficient in others."

This article originally appeared in Botando Corriente, our newsletter. You can subscribe here.