The sensitivity of talking about suicide and self-injury on social media

5/27/2022

On May 15, the Twitter hashtag #RazonesParaSuicidarse ("reasons to commit suicide") became a trending topic in Colombia. Despite what it might suggest at first glance, the hashtag referred to a column by journalist Daniel Samper Ospina. The text, which bore the same title, was a satire of current political events alluding to the Constitutional Court's recent decision to decriminalize assisted suicide.

After apologizing, Samper stopped promoting the hashtag. Still, the episode revived the conversation about the importance of talking about mental health issues with sensitivity, and it was a reminder of how difficult it is for civil society and the media to handle this topic responsibly, that is, without an approach that could hurt people or put them in danger.

The role of social media in the dramatic increase in suicide rates and suicide attempts has been pointed out for many years. Various studies indicate that users, especially younger ones, have been exposed to problematic online content related to suicide, such as notes, instructions, pacts, and methods.

Platforms have even been sued by parents who allege that their children's suicides were caused by the companies' negligence in designing their features and by the lack of safeguards to keep harmful content from reaching users' screens.

As with other sensitive issues, social media platforms have developed mechanisms to address this problem that go beyond removing posts about suicide. Twitter's policy on the matter, for example, has three purposes: to offer support to those who express suicidal thoughts on the platform, to sanction those who promote this behavior, and to protect other users from the potential harm of being exposed to this kind of content.

Twitter bans posts that incite a person to commit suicide, invite participation in group suicides or suicide games, or share information or instructions for suicide or self-harm. In such cases, the platform asks the user to delete the tweet and temporarily locks the account. Repeated violations can lead to permanent suspension.

While these bans cover specific situations, multimedia content that touches on these topics, such as links, videos, or images, may instead be labeled to warn other users that it is "potentially dangerous." That way, people can decide for themselves whether to view the post.

Because of its huge popularity among younger audiences, TikTok has also been forced to develop policies to curb this type of content. In 2020, it struggled to stop the spread of a viral video in which a man harmed himself.

Like Twitter, TikTok's community guidelines prohibit content that encourages, glorifies, or normalizes participation in suicidal behavior or self-harm, as well as challenges, pacts, or jokes about these topics. TikTok also sanctions those who share false information or myths about self-harm.

Of course, this does not mean that every conversation about suicide will be sanctioned. Both Twitter and TikTok present themselves as places where people can share personal stories or ways of coping with these situations.

Together with mental health experts, TikTok has developed a guide with recommendations for users who want to talk about these topics. Among other things, the company suggests not sharing details about methods, places, or suicide notes. This kind of information, though common in some contexts, such as coverage of a celebrity's suicide, can be dangerous for other users.

When users detect suicidal behavior or thoughts in someone else, the platforms offer channels for getting help. Twitter, for example, has a dedicated form for reporting threats of suicide or self-harm. According to the company, when it receives a report, it contacts the affected person, encourages them to seek support, and refers them to resources and helplines.

TikTok, for its part, says that if it finds that a member of its community is at risk of harm, it may contact local emergency services. The platform also maintains a directory of helplines for different countries.
