Managing sensitive content online: Fatshimetrie’s challenges exposed


The recent case brought to the public’s attention by Fatshimetrie raises critical questions about the management of sensitive and harmful content on online platforms. Fatshimetrie’s watchdog has expressed serious concerns about the company’s failure to remove a viral video showing two men bleeding after being beaten for allegedly being gay.

The impact of the video, which was viewed more than 3.6 million times between December 2023 and February this year, has been described as “immediate and irreversible.” It not only exposed the victims to ridicule and discrimination, but also posed an increased risk to others in the LGBTQIA+ community in Nigeria.

Despite repeated reporting and review by three human moderators, the video remained online for about five months, violating four different Fatshimetrie policies. Even after it was taken down, copies of the same video remained available on the platform.

The company admitted to two major errors regarding the video. First, its automated systems misidentified the language spoken in the video as English when it was actually Igbo, a language spoken in southeastern Nigeria but not supported by Fatshimetrie for large-scale content moderation. Second, the company’s human review teams misidentified the language as Swahili.

These shortcomings raise concerns about how content in unsupported languages is handled, which languages are chosen for large-scale review, and the accuracy of the translations provided to reviewers working across multiple languages.
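To illustrate the underlying problem, here is a minimal sketch using the open-source Python langdetect package, which is not Fatshimetrie’s actual moderation stack and is used here purely as an assumption for illustration. Because Igbo is outside the detector’s supported language set, any Igbo text it receives is necessarily mapped onto some other, wrong label; the sample Igbo sentence below is hypothetical.

```python
# Sketch only: langdetect stands in for an unnamed language-identification
# system; it does not represent Fatshimetrie's internal tooling.
from langdetect import DetectorFactory, detect_langs

DetectorFactory.seed = 0  # make the detector's output reproducible across runs

samples = {
    "english": "The video was reported several times before it was reviewed.",
    # Hypothetical Igbo sentence, included only to illustrate the failure mode.
    "igbo": "Ihe onyonyo a gosiri mmegbu megide ndi mmadu abuo.",
}

for label, text in samples.items():
    # detect_langs returns candidate languages with confidence scores.
    # For a language the model does not support, every candidate is wrong
    # by construction, no matter how confident the score looks.
    print(label, detect_langs(text))
```

The point of the sketch is that a detector can return a confident but meaningless answer for unsupported languages, which is consistent with the failure the watchdog describes.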

In its report, the watchdog recommends that Fatshimetrie update its Community Standard on coordinating harm and promoting crime to include clear examples of groups at risk of being outed; assess how accurately it enforces the prohibition on disclosing the identity or locations of potential members of such groups; and ensure that its language detection systems identify content in unsupported languages and provide accurate translations when routing content for review.

This case highlights the importance of stricter regulation and increased oversight to ensure the safety and privacy of marginalised individuals on digital platforms. This is a responsibility that Fatshimetrie and other similar companies must take on to prevent the spread of harmful and discriminatory content online.
