Recommendation systems favoring child sexual exploitation

8 minute read
6/22/2023
"Silhouette of teenagers in front of a bright screen and a dark background", interpreted by Adobe Firefly.

For years, rules and protocols have been designed to combat the dissemination of child sexual exploitation content on the Internet. These mechanisms, grounded in national and international laws, have focused primarily on detecting and containing networks that traffic in images and videos of minors, harass them online, or seek to arrange in-person encounters. However, in recent years a new phenomenon has been added to the equation: content produced and offered for sale directly by minors.

Recent research by Stanford University's Internet Observatory identified a network of more than 400 Instagram accounts, presumably run by minors, offering self-generated child sexual content. The study also found 128 accounts with similar activity on Twitter.

According to the document, these accounts offer a menu of prohibited content that includes not only explicit material but also self-harm, the sale of sexual services, and zoophilia. Given the sellers' age and their lack of access to banking services, transactions are made through gift cards and G2G, a platform for buying and selling digital goods.

The trade and dissemination of this type of material, in addition to constituting crimes, are prohibited by the platforms' policies. However, the research showed that users can easily access these publications. For example, when searching Instagram for keywords and hashtags related to child sexual exploitation content, it is possible to find the material and the accounts that disseminate it by simply dismissing a warning message displayed over the results.

The research also shows how these profiles use ephemeral posts, such as stories, to offer their content.

Beyond moderation failures, which exist both on this topic and on other content prohibited by the platforms' rules, the reason these networks find a more hospitable environment on Instagram and Twitter lies in design decisions.

The researchers found that the account recommendation feature of these two social networks favors those seeking child sexual abuse content: when a user finds a profile that posts such material on Instagram or Twitter, the platforms' recommendation systems suggest similar accounts to follow.

This same reason explains why, despite its popularity among children and teenagers, TikTok is not a particularly useful social network for these activities. According to the research, this platform's recommendation system is designed not to build communities but to distribute content, which makes it difficult for users to intentionally find this material. In addition, searches for related keywords and hashtags did not yield relevant results.

The investigation also covered spaces such as Telegram and Discord, where these distribution networks are suspected to operate, as several accounts identified on Instagram linked to groups on those platforms. However, because these were private communities, the researchers decided not to enter them and instead reported them to NCMEC (the National Center for Missing & Exploited Children), the child abuse reporting center in the United States.

On the other hand, platforms such as Mastodon and Facebook are less attractive for promoting this content. The former because of structural limitations, such as the absence of direct messages, and its decentralized nature, in which servers with lower moderation standards end up isolated and difficult to access. The latter because of its lack of popularity among younger users and its real-name policy, which hinders the use of pseudonyms to commit this type of crime.

Sexual content produced by minors is a growing problem globally. According to the Internet Watch Foundation, the pandemic accelerated this phenomenon: as opportunities for sex offenders to commit abuse offline shrank, demand for images of minors on the Internet grew. In 2020, the year of lockdowns, reports of self-generated material registered a 77% increase over the previous year.

According to figures from Te Protejo, a reporting line for the protection of minors in Colombia developed by the organization Red PaPaz, more than 14,600 reports of child sexual exploitation material were registered in 2021, 61% of which corresponded to content produced by minors.

That same year, research by Viguías, Red PaPaz's safe Internet center, identified symptoms of depression and prolonged, unsupervised Internet access as among the main vulnerability factors leading children and adolescents to produce this kind of content. In turn, a global threat report by the WeProtect Global Alliance suggests that poverty may be contributing to the growth of the trade in sexual content produced by minors.

After the Stanford Internet Observatory research was published, Meta, Instagram's parent company, said it would establish an internal task force to prevent its recommendation systems from fueling the growth of these networks.

According to the researchers, in addition to correcting these harmful incentives in platform design, there is a need for coordinated work among Internet companies to detect signals such as posts offering material through links to other social networks, or adult users who repeatedly contact minors.

Regarding the treatment of accounts belonging to children and adolescents who offer this content, the report draws attention to the need to apply the corresponding suspensions, accompanied by resources that discourage recidivism, such as warnings about legal consequences and the risks of becoming a victim of sexual extortion or harassment, and links to support centers.

This article originally appeared in Botando Corriente, our newsletter.