Former TikTok moderators sue over mental distress from watching, removing graphic videos

FILE - The TikTok app in the iOS App Store on May 3, 2021, in Bargteheide, Germany. (Photo by Katja Knupper/Die Fotowerft/DeFodi Images via Getty Images)

Two former TikTok moderators have filed a federal class-action lawsuit against the social media company over the emotional toll they say they suffered from watching and removing disturbing videos.

Reece Young and Ashley Velez filed the paperwork last week in the U.S. District Court for the Northern District of California in San Francisco against the video-sharing app and its Beijing-based parent company, ByteDance.

According to court documents, the two women were hired as contractors to monitor, review and remove graphic and objectionable content that appeared on TikTok, such as videos depicting child abuse, rape, torture and brutality.

The lawsuit claimed that the women witnessed numerous disturbing videos, including one of a 13-year-old child being executed by cartel members. Others depicted bestiality and necrophilia, as well as conspiracy theories about the COVID-19 pandemic and the Holocaust.

RELATED: Tech leaders would face prison if they don't follow UK internet safety rules

"We would see death and graphic, graphic pornography. I would see nude underage children every day," Velez said in an interview with National Public Radio. "I would see people get shot in the face, and another video of a kid getting beaten made me cry for two hours straight."

Attorneys for the women claimed their clients sought counseling after suffering "immense stress and psychological harm" while viewing the videos.

The women claimed that TikTok did not provide "adequate prophylactic measures" and care to protect moderators from harm. 

The women are suing for negligence.

FOX Television Stations has reached out to TikTok for comment. 

RELATED: States launch probe into TikTok’s effect on kids’ health

The lawsuit comes after Facebook agreed last year to pay $52 million to settle a class-action suit brought on behalf of content moderators. That suit alleged that people who performed content moderation work for Facebook were denied protection against severe psychological damage and other injuries resulting from repeated exposure to graphic content, such as child sexual abuse, beheadings, terrorism and animal cruelty.

As part of the settlement, Facebook had to make several changes to its work environment, such as providing mental health resources and enhancing review tools to make content moderators' work safer.

"This settlement provides immediate change, and real financial compensation for content moderators. We are very proud that we were able to work with Facebook to reach this result for the content moderators" said Steve Williams of the Joseph Saveri Law Firm, one of the lead counsel for the class.

This story was reported from Los Angeles. 
