No one is safe from a copyright claim. No matter how cautiously content creators act to avoid copyright problems, no matter how careful they are to use only a few seconds of a song or a movie, no one is immune. Even when everything seems to be in order, platform systems often apply disproportionate penalties.
This is what happened to the Cine Club of the Universidad Central in Bogotá, which, forced by the 2020 lockdown, had to start broadcasting its film series on Facebook. Although it had the foresight to screen only public domain films precisely to avoid problems of this kind, on two occasions the platform suspended its broadcasts for alleged copyright infringement.
The way the system is structured, and the consequences of its sanctions, pose serious problems for content creators, especially smaller ones, as explained in the report 'Automatic Copyright Protection: A Tool of Inequality', published at the end of June by the Karisma Foundation.
Automatic copyright detection tools appeared in 2007, when Google introduced Content ID, a system that detects copyright-protected content at scale and gives rights holders tools to act against alleged infringers: they can remove the content, claim its monetization, or leave it online. Content ID has since inspired analogous systems at major platforms such as Meta, Twitter, and Twitch.
The mechanism has been criticized over the years for tilting the balance in favor of alleged rights holders or their representatives (big record labels or associations representing the interests of an industry), who can act against creators' content without verifying that it actually infringes copyright.
On the other side are the users, who are automatically at the mercy of the rights holders and therefore exposed to copyright sanctions. The measures apply even when creators act under fair use, a doctrine that allows them to use protected material without asking the rights holder for authorization.
This imbalance also opens the door to sanctions even when no protected material has been used at all. The Karisma Foundation report, prepared by journalist and researcher José Luis Peñarredonda, includes the case of Cuestión Pública, an independent media outlet that was sanctioned during the 2019 National Strike in Colombia without having used any outside material.
It all happened during an interview with Congresswoman Ángela María Robledo via Facebook Live. Suddenly, the broadcast was interrupted, allegedly for violating copyright rules. Apparently, Robledo was wearing the same clothes and appeared in a similar framing as in a previous interview she had given to Blu Radio, so Facebook's systems flagged the stream as a copyright infringement.
In addition to these errors in the moderation systems, affected users face obstacles when trying to appeal a decision. Sometimes they never receive a response. In other cases, they only manage to resolve the problem by contacting the platform directly, an option not everyone has.
But appealing and reversing a sanction does not mean things are fully corrected. Several users interviewed for the report said that after their complaint the algorithms reduced their visibility, which affects their relationship with their audience and, ultimately, their economic viability.
The case of journalistic content is even more problematic. If a publication is taken down over copyright, the time it takes to appeal and obtain a final decision is enough for the news to lose its relevance, especially when other outlets have been able to publish the story without such obstacles.
In their current state, the report concludes, automatic detection tools deter content creators and limit them creatively. Knowing the consequences of using protected material, or even public domain material, users prefer not to risk a sanction and its fallout.
*The full report can be consulted here.
*At Circuito we document unfair moderation cases. Cases can be shared at this link.