Digital Services Act: the European Union puts pressure on platforms

7/15/2022

In early July, the European Parliament passed the Digital Services Act (DSA), a package of rules aimed at reining in the power of social media platforms, with strict content moderation and transparency obligations and heavy fines for non-compliance. According to European Commissioner Thierry Breton, this is the first time a jurisdiction anywhere in the world has established comprehensive standards for the digital sphere.

The news was received with enthusiasm by civil society organizations and academics specialized in digital rights and platform regulation. To a large extent, the DSA satisfies demands that these sectors have pressed for years. The law is expected to come into force in early 2023 and will apply to a market of 450 million users, so its dynamics may well resonate in the rest of the world. These are some of the DSA rules that could give a new face to the relationship between platforms, users and governments:

Transparency reports

Several of the large platforms, such as Meta, Twitter, YouTube and TikTok, already publish periodic information about the content they remove. Until now, however, such reports have been produced on the companies' own terms, motivated by the reputational and regulatory risks of withholding information about how they apply their content policies and handle government requests to remove publications.

In addition, the DSA refers to intermediaries in general, beyond social media. Under this rule, internet service providers, domain registrars and hosting services will also be required to submit yearly reports on any content moderation activity they carry out.
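To make the obligation concrete, here is a minimal Python sketch of what a structured yearly transparency record could look like. The schema and field names are hypothetical: the DSA prescribes which categories of information must be reported, not a data format.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical schema for a yearly moderation transparency report.
# The DSA specifies categories of information (government orders,
# own-initiative moderation, complaint handling) but no concrete format.
@dataclass
class TransparencyReport:
    provider: str                     # e.g. an ISP, registrar, or host
    period_start: date
    period_end: date
    government_removal_orders: int    # removal orders from authorities
    own_initiative_removals: int      # content removed under own policies
    user_complaints_received: int
    complaints_reversed: int          # removals overturned on appeal

    def reversal_rate(self) -> float:
        """Share of complaints that led to a decision being undone."""
        if self.user_complaints_received == 0:
            return 0.0
        return self.complaints_reversed / self.user_complaints_received

# Example usage with made-up numbers:
report = TransparencyReport(
    provider="example-hosting.eu",
    period_start=date(2023, 1, 1),
    period_end=date(2023, 12, 31),
    government_removal_orders=120,
    own_initiative_removals=4_500,
    user_complaints_received=900,
    complaints_reversed=90,
)
print(f"{report.provider}: {report.reversal_rate():.0%} of appeals led to a reversal")
```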

Risk assessment

The DSA imposes further obligations on platforms with more than 45 million users, which the law calls "very large online platforms". One of these obligations is to conduct an annual assessment of the risks arising from the use and operation of their services, together with plans to mitigate them. Among the risks to be assessed are the dissemination of illegal content, negative effects on the exercise of fundamental rights such as freedom of expression, and manipulation of the platform through coordinated harmful activities that may affect public health, minors, civic discourse or public safety.

Recommendation systems

Major platforms must set out in their terms of service the parameters used in their recommendation systems, that is, the criteria that determine what information is presented to users, both in news feeds and in other sections. If a platform offers several options, it must allow users to select or modify them according to their preference.
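As a rough illustration of what user-selectable recommender parameters could mean in practice, here is a minimal Python sketch. The option names, signals and feed structure are all invented for the example; the DSA does not mandate any particular implementation.

```python
from typing import Callable, Dict, List

Post = Dict[str, float]  # hypothetical post record: id, timestamp, engagement

# Two illustrative ranking options a platform might disclose in its
# terms of service and let users choose between. Names are invented.
RANKING_OPTIONS: Dict[str, Callable[[List[Post]], List[Post]]] = {
    # Reverse-chronological: newest first, ignores engagement signals.
    "chronological": lambda posts: sorted(posts, key=lambda p: -p["timestamp"]),
    # Interaction-based: the kind of engagement-driven ranking the
    # article notes has been blamed for polarization.
    "engagement": lambda posts: sorted(posts, key=lambda p: -p["engagement"]),
}

def build_feed(posts: List[Post], user_choice: str) -> List[Post]:
    """Rank the feed with whichever option the user selected."""
    ranker = RANKING_OPTIONS.get(user_choice, RANKING_OPTIONS["chronological"])
    return ranker(posts)

posts = [
    {"id": 1.0, "timestamp": 100.0, "engagement": 0.9},
    {"id": 2.0, "timestamp": 200.0, "engagement": 0.1},
]
print([p["id"] for p in build_feed(posts, "chronological")])  # [2.0, 1.0]
print([p["id"] for p in build_feed(posts, "engagement")])     # [1.0, 2.0]
```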

This rule is one of the most striking points of the law, as recommendation systems, especially those based on interaction, have been singled out as a cause of polarization on social media. However, it is not clear how far the DSA can go in regulating the use of artificial intelligence, which plays an increasingly important role in recommendations and content discovery.

Online due process

The DSA mandates that platforms provide internal mechanisms for users to file complaints when their content is removed or their accounts are suspended. Social media companies generally offer the possibility to appeal these kinds of sanctions; however, their systems often fail: users do not understand why they were sanctioned, or their appeals receive no response.
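A minimal sketch of the kind of internal complaint flow the DSA requires might look as follows. The states, field names and decision step are hypothetical; the law demands diligent handling and reasoned responses, not this specific design.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical states for an internal complaint against a moderation decision.
class ComplaintStatus(Enum):
    FILED = auto()
    UNDER_REVIEW = auto()
    UPHELD = auto()      # original sanction stands
    REVERSED = auto()    # content restored / account reinstated

@dataclass
class Complaint:
    user_id: str
    sanctioned_content_id: str
    statement_of_reasons: str          # why the user was sanctioned
    status: ComplaintStatus = ComplaintStatus.FILED
    response: str = ""

    def decide(self, reverse: bool, reasons: str) -> None:
        """Close the complaint with an explained decision, the step the
        article notes often fails today (appeals left unanswered)."""
        self.status = ComplaintStatus.REVERSED if reverse else ComplaintStatus.UPHELD
        self.response = reasons

c = Complaint("user-42", "post-7", "Removed under hate speech policy")
c.status = ComplaintStatus.UNDER_REVIEW
c.decide(reverse=True, reasons="Policy misapplied; content restored.")
print(c.status.name, "-", c.response)
```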

On more than one occasion, Meta's Oversight Board, a body that functions as a kind of supreme court for the company, has asked the platform to be more careful with its processes, so that users are properly notified and can appeal when they consider a decision unfair. Despite external criticism and demands, safeguards in this area remain insufficient. The DSA will elevate to a legal standard issues that platforms have so far tried to resolve through self-regulation.

Trusted flaggers and state agencies

The law will adopt a system of trusted flaggers: entities with sufficient experience and competence to detect and report content for removal. Publications reported by these allies will be prioritized and handled through dedicated channels.
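One plausible way to implement that prioritization is a simple priority queue in which trusted flaggers' notices jump ahead of ordinary reports. The priority values and record fields below are invented for illustration.

```python
import heapq
from dataclasses import dataclass, field
from itertools import count

# Hypothetical priority levels: lower value = reviewed sooner.
TRUSTED_PRIORITY = 0
DEFAULT_PRIORITY = 1

_tiebreak = count()  # preserves FIFO order within a priority level

@dataclass(order=True)
class Report:
    priority: int
    seq: int
    content_id: str = field(compare=False)
    reporter: str = field(compare=False)

queue: list[Report] = []

def file_report(content_id: str, reporter: str, trusted: bool) -> None:
    """Queue a content report, giving trusted flaggers' notices priority."""
    priority = TRUSTED_PRIORITY if trusted else DEFAULT_PRIORITY
    heapq.heappush(queue, Report(priority, next(_tiebreak), content_id, reporter))

file_report("post-1", "ordinary-user", trusted=False)
file_report("post-2", "ngo-hotline", trusted=True)   # trusted flagger

# The trusted flagger's report is reviewed first despite arriving later.
print(heapq.heappop(queue).content_id)  # post-2
```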

The model has previously been applied by some platforms, which work hand in hand with outside organizations in their moderation activities. However, various civil society groups have expressed concern, as the DSA envisages that authorities in European Union countries may acquire this status. The issue is especially relevant in countries with weaker human rights records, such as Poland or Hungary, where police or surveillance agencies designated as trusted flaggers could abuse their powers.

All remains to be seen

Although the approval of the DSA has been celebrated as a milestone for digital rights, many have expressed reservations. The European Commission itself will enforce compliance with the law, and a European Centre for Algorithmic Transparency, staffed with data scientists and other specialists, will be set up to help monitor its implementation. However, it is not clear that these bodies will have sufficient capacity to enforce the provisions approved by Parliament.

The academic Daphne Keller has pointed out that while platform transparency exercises have always had a margin for error, the law expects rational, clear and rigid results, achieved through detailed processes and documentation. It is therefore possible that the law's requirements will soon run up against the constraints of reality. Moreover, although the law distinguishes between large and small platforms, the latter are left with a burden that could prove very difficult to bear.

This problem could aggravate one of the main concerns in any discussion of regulation: the market concentration that some companies have achieved. By imposing demanding conditions on smaller companies, the law could limit their growth and strengthen the power of the larger ones.

Then there is the role that government agencies will play in the near future. For Christoph Schmon, International Policy Director at the Electronic Frontier Foundation, the doors the law opens to them may lead to a highly politicized model and cause new problems. "Respect for the European Union's charter of fundamental rights and the inclusion of civil society groups and researchers will be vital to ensure that the DSA becomes a positive model for legislation outside the European Union," he said in an EFF statement on the law's passage.

There is one last element: the impact this new regulation will have outside Europe. The world is watching this exercise in subjecting platforms to users' rights and institutional oversight. The quality of service that users receive in other territories also depends on how social media companies comply with the DSA. If they standardize the transparency and moderation practices that result from the new law, the effect will be felt in regions these companies have not fully set their sights on, such as Latin America. Otherwise, we can only expect the gaps to keep widening and the region, once again, to be left behind in the global conversation.
