Kenya-based ex-moderator sues Facebook

Since May 10, Facebook has once again been facing a complaint from one of its former moderators, this time in Kenya. The complaint also targets Sama, a Meta subcontractor that handles moderation of the social network in East and Southern African countries.

A traumatic experience

Daniel Motaung worked for six months at Sama as a content moderator for Facebook. He accuses the two companies of lying about the nature of the job in order to lure poor workers from countries such as Kenya, South Africa, Ethiopia, Somalia, and Uganda.


In an investigation published by Time, he says he was never told that his job would involve watching disturbing videos that could harm his mental health. "The first video I saw was a live beheading," the former moderator recounted during a press conference organized by the Real Facebook Oversight Board, an anti-Facebook advocacy group.

Since then, Daniel Motaung says he suffers from post-traumatic stress disorder. He adds: "Imagine what it can do to a normal person if you then watch other similar videos, images, and content every day."

A kind of “modern slavery”

Daniel Motaung's attorneys argue that Meta and Sama "recruit moderators through fraudulent and deceptive methods, constituting an abuse of power, exploiting the vulnerability of young, poor and desperate candidates." According to them, these methods amount to "a modern form of slavery prohibited by Article 30 of the Constitution [of Kenya]." They also denounce difficult working conditions, insufficient psychological support, and a derisory salary of 40,000 Kenyan shillings, or around €325 per month. The average monthly salary in Kenya is around €400.

A spokesperson for Meta responded to AFP: "We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry-leading salaries, benefits and support. We encourage moderators to speak up about issues when they arise, and we regularly conduct independent audits to ensure that our partners maintain high standards, in line with our expectations."

Content moderation is an extremely difficult job that exposes employees to videos of murder, rape, torture, and suicide. Such traumatic scenes severely affect moderators' mental health, and this is a recurring problem for large digital platforms, which are accused of not doing enough to ensure the well-being of these workers. Two years ago, Facebook was already ordered to pay $52 million to its moderators as compensation for the trauma they suffered. TikTok has also been singled out for the same reasons.
