Last week, a video posted on TikTok by El Ocurrente Victor, a content creator with more than 5.5 million followers on the platform, went viral. In it, Victor tried an effect, or filter, that playfully shows users the house they will supposedly live in in the future. On his first attempt, the tool assigned him a shack. The influencer, who is Afro-Dominican, remarked: "I think that's because I'm black."
The video then shows Victor covering his face with flour and trying the filter again; this time it predicts he will live in the White House. The caption accompanying the video reads: "These filters are racist."
The post, which now has more than 11 million views, is part of a trend joined by Black users to whom the effect has assigned ruined houses, tribal dwellings, or caves.
TikTok, like other apps, allows users to create their own filters and effects through its Effect House tool. These elements also work within the platform as vehicles for grouping content and creating trends, much like hashtags or sounds.
The effect, which has been used in more than half a million videos on TikTok, is called "Your Future House" and was developed by Skill Sewa, a remodeling and housing solutions company located in Nepal.
TikTok's rules for effects prohibit content that promotes stereotypes based on a person's race, sexual orientation, gender identity, or disability. According to the company, such tools can foster prejudice and discriminatory behavior, so they are rejected or removed from the platform; this includes effects that generalize traits culturally perceived as negative.
The "Your Future House" effect is part of a classic design of TikTok tools: that of a carousel of images - of movie characters or soccer players, for example - that eventually stops to identify the user with just one of those images. This type of effect is usually developed under the 'randomizer' template, which in theory gives random results without taking into account the biometric information of the users, such as physical traits.
Although there are no transparency tools that reveal what criteria user-developed effects actually apply, it is possible that the pattern attributed to "Your Future House" is a coincidence rather than a deliberately discriminatory design.
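The coincidence argument is easy to quantify with a toy simulation. Assuming, purely for illustration, five equally likely outcomes and the roughly half a million uses reported above, a uniform randomizer would still hand a "shack" to about a fifth of all users, tens of thousands of videos. Once users self-select into a trend and reshare the striking cases, that volume can look like a targeted pattern.

```python
import random

# Toy simulation with assumed numbers: five equally likely outcomes,
# 500,000 uses. This is not data from TikTok, just an illustration of
# how many "negative" results pure chance produces at this scale.
random.seed(0)  # fixed seed so the run is reproducible
USES = 500_000
OUTCOMES = ["shack", "cottage", "apartment", "mansion", "White House"]

shack_count = sum(random.choice(OUTCOMES) == "shack" for _ in range(USES))
print(shack_count)  # close to USES / 5, i.e. ~100,000 "shack" results
```

Even under perfectly unbiased selection, on the order of a hundred thousand users would receive the least flattering result, which is more than enough raw material for a viral trend.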
In any case, this is not the first time TikTok's algorithms have been accused of racial bias. In 2020, researcher Marc Faddoul of the School of Information at the University of California, Berkeley observed that the platform tended to recommend accounts according to people's ethnicity. That same year, a group of influencers reported that their visibility had dropped after they posted content related to the Black Lives Matter movement.
Situations like these underline the broad responsibility the company bears in designing algorithms that now reach nearly 1.7 billion users. Added to this are the difficulties of monitoring and arbitrating the actions of third parties, not only through content (audio, video, text) but also through additional layers of filters and effects whose adverse impacts the platform may not foresee.