TikTok targeted by complaint from former moderators traumatized by what they saw

Cruelty to animals, acts of torture, naked minors, corpses: this is what social network moderators are sometimes exposed to on a daily basis. TikTok is no exception. In the United States, two former moderators of the platform filed a lawsuit in federal court on March 24, NPR reveals.

The deplorable working conditions of the web's invisible workers

TikTok has a billion monthly active users for roughly 10,000 moderators around the world. These women and men are responsible for filtering and blocking content that violates the platform's rules of use but has slipped through the net of automated moderation.

Two of them, employed by two TikTok subcontractors, accuse the platform and its parent company ByteDance of not having sufficiently protected them from the "highly toxic and extremely disturbing" videos to which they were exposed.

They argue that the platform failed to protect them from the risks of emotional trauma, anxiety, depression, and post-traumatic stress, in violation of California labor law.

On the contrary, TikTok chose to prioritize the volume of videos processed, imposing extremely demanding quotas: more than 80% accuracy in decisions made after a 25-second review of a video. A high threshold that encourages moderators to replay the same clip several times to avoid errors.

The salary of the two plaintiffs was tied to their efficiency in handling content. Their days lasted twelve hours, with two fifteen-minute breaks and a one-hour lunch break. Moreover, the support on offer was singled out as grossly insufficient. The complaint also notes that a nondisclosure agreement prohibited employees from sharing what they were going through with loved ones.

TikTok is not the only social network accused of mistreating web 2.0 workers, who are essential to the platforms. In 2020, Facebook agreed to a $52 million settlement with 11,000 of its moderators for these same reasons. The same lawyers who handled the class action against Facebook are in charge of this new case.

TikTok claims to do everything to protect its moderators

It turns out that the outsourcing companies the plaintiffs worked for also collaborate with Google and Facebook. However, those two companies reportedly apply the recommendations of the National Center for Missing and Exploited Children.

This non-profit organization has developed visual techniques to spare moderators as much as possible. TikTok, for reasons of efficiency, according to the complaint, does not apply this advice.

Contacted by NPR, TikTok asserts that the company "strives to promote a caring work environment for its employees and contractors". It explained that it has set up "a range of wellness services so moderators feel mentally and emotionally supported". It is now up to the American courts to decide.