In the final months of 2020, as the U.S. presidential election unfolded amid fraud theories, intimidation of election officials, attempts to delegitimize the process, and renewed questions about social media as a driver of polarization, Facebook (now known as Meta) opened its doors to a group of researchers to experiment with its algorithms and measure the true impact of its platforms on the election.
Three years after the start of this project, titled US 2020, the first results have begun to emerge, and they may contradict ideas that have taken hold in recent years about this phenomenon. Four papers published last week in the journals Science and Nature suggest that Facebook and Instagram play a more limited role than they have been credited with in fostering ideological segregation, the consumption of fake news, and exposure to problematic election-related content.
In one study, the feeds of thousands of Facebook and Instagram users were altered to expose them to information different from what they normally interacted with, in order to assess whether this change could affect their political opinions or mitigate polarization. In another, the visibility of viral content was limited, while in a third, feeds were reordered chronologically, so that users saw all the content available in their networks without the algorithm's intervention.
Although academics and civil society groups have suggested that such measures could combat online polarization and improve the quality of digital debate, the studies, which combined platform data with survey research, do not appear to support these hypotheses. According to the results, the changes did not shift users' political positions; on the contrary, in the absence of ranking systems to classify content, there was greater exposure to problematic posts such as fake news and hate speech. In addition, limiting the visibility of the platforms' most popular content reduced engagement and diminished access to political information.
The findings reinforce the position of social media companies, which have sought to downplay the link between their recommendation systems and the radicalization of some users, arguing that the quality of democracy began to decline years before the rise of social media.
Another study focused on the consumption of political news during the election season. It found ideological segregation in how media coverage is received on social media: far more links to conservative news outlets circulate in these spaces, and those outlets also receive more warning labels from independent fact-checkers.
As journalist Casey Newton has noted, the findings should be read in a broader context: Meta is only one part of a much larger news ecosystem and cannot be treated as the only avenue through which users get information. While Meta could act to remove content claiming fraud in the presidential election, the same ideas kept circulating on Fox News, Newsmax, and other outlets sympathetic to Donald Trump.
For her part, Frances Haugen, the former Facebook employee who in 2021 disclosed a series of internal malpractices in what became known as the Facebook Files, said that by the time the studies were conducted, Meta had already implemented many of its policies to combat electoral disinformation. In addition, the studies focused on algorithms and the ordering of feeds, while the real problem lay in Facebook groups.
For Katie Harbath, a consultant and former public policy director at Facebook, although the results undercut ideas that have gained traction in recent years, they are no excuse for Meta or other social media companies to lower their guard in their efforts to protect electoral integrity.
In the past, other studies have also cast doubt on echo chambers and filter bubbles as causes of polarization, the theories according to which social media isolate people in circles where their own convictions are reinforced. However, this does not mean that polarization is unrelated to the nature of these digital spaces. According to an article published last year by academic Petter Törnberg, polarization is driven by the partisan identities that emerge from interaction on social media and that users end up adopting as they participate online.
While US 2020 offers new insights into the dynamics of interaction during elections, it is still necessary to examine other ways in which the infrastructure and uses of social media can affect democracy. This is the case, for example, with influence operations and the instrumentalization of these spaces by campaigns and political communication agencies, as exposed in an investigation published earlier this week by a regional media alliance led by the Latin American Center for Investigative Journalism (CLIP) into the inauthentic strategies deployed in countries such as Bolivia, Venezuela, Colombia, Brazil, and Chile.
These first US 2020 studies come at a key moment for Meta and other social media companies, as they will soon be required to assess the systemic risks of their activities, including their impact on elections, under the European Union's Digital Services Act. For editor Justin Hendrix and academic Paul M. Barrett, the results could lead companies to underestimate the true impact of their platforms in the reports they must submit to regulators and in the risk-mitigation plans they must put in place.
The research is the result of a partnership between company staff and independent academics, led by Talia Jomini Stroud, founder and director of the Center for Media Engagement at the University of Texas at Austin, and Joshua A. Tucker, co-director of the Center for Social Media and Politics at New York University. According to Nick Clegg, Meta's president of global affairs, the outside researchers worked independently and were not paid by the company. The research was overseen by academic Michael W. Wagner, who served as the project's independent rapporteur and published a piece in Science accompanying the first articles.
The US 2020 project has been presented in part as a gesture of transparency by Meta, since no social media company had ever before allowed this level of access to its systems. However, the project's limitations have also been pointed out. For Wagner, however much independence and access the researchers had to the company's databases, there was a marked information asymmetry. Faced with a complex data and software architecture, researchers could not know what they were missing, where they could dig, or what additional inputs they needed. In Wagner's words, in these cases academics "don't know what they don't know."
Twelve additional articles are expected in the coming months to provide a fuller picture of the impact of Meta's platforms on electoral processes. The next installments will focus on the influence of political advertising, the promotion of ideological segregation, and the effects of inauthentic coordination.