Lack of Safety Protocols Exacerbates Psychological Harm to TikTok Content Moderators, Class Action Alleges
Frazier v. ByteDance Inc. et al.
Filed: December 23, 2021 ◆ Case No. 2:21-cv-09913
A TikTok content moderator alleges in a proposed class action that the app and parent company ByteDance have failed to provide a safe work environment to thousands of contractors responsible for ensuring the short-form video platform is as safe as possible for users.
In a 36-page lawsuit, the plaintiff, a Las Vegas resident and Telus International employee, claims to have suffered significant psychological trauma, including anxiety, depression and post-traumatic stress disorder (PTSD), as a result of “constant and unmitigated” exposure to highly toxic, extremely disturbing images in the workplace. The lawsuit alleges ByteDance and TikTok are aware that content moderators experience psychological trauma in the course of their jobs yet have not implemented safety standards found commonly throughout the industry to protect the workers.
Moreover, the complaint charges that the non-disclosure agreements ByteDance and TikTok impose on content moderators, who spend hours each day ensuring that graphic and objectionable content, such as child sexual abuse, murder, suicide, genocide, rape, animal cruelty, conspiracy theories and political disinformation, does not make its way onto the platform, have only exacerbated the harm the workers have experienced. The lawsuit claims that without the court’s intervention, content moderators will continue to be harmed in the course of their work for ByteDance and TikTok.
“ByteDance and TikTok failed to implement workplace safety standards,” the lawsuit alleges. “Instead, they requires [sic] their Content Moderators to work under conditions they know cause and exacerbate psychological trauma.”
Rather than scrutinize content before it is uploaded, ByteDance and TikTok rely on users to report inappropriate videos, the case says. Human moderators are then tasked with combing through millions of reports of possibly objectionable content, sometimes “thousands of videos and images each shift,” and removing the content found to violate the defendants’ terms of use, the lawsuit relays.
According to the suit, each reported TikTok video is sent to two content moderators, who review the video and determine whether it should be removed. Due to the sheer volume of content, moderators are permitted no more than 25 seconds per video and often review three to 10 videos simultaneously, the case says.
ByteDance and TikTok are members of the Technology Coalition, a group that includes Facebook, YouTube, Snap Inc., Google and other companies with content moderation challenges, the lawsuit states. The Technology Coalition Guidebook recommends that internet companies implement a robust, formal “resilience” program to support content moderators’ well-being and mitigate the traumatic effects of exposure to objectionable imagery, according to the case.
The lawsuit contends that ByteDance and TikTok have failed to implement workplace safety measures that meet industry standards and those set by the Technology Coalition. For one, the defendants do not ask content moderators about their previous experience with graphic content, or ensure that individuals are told the content can have a significant negative mental health impact, the case says. Moreover, ByteDance and TikTok do not require content moderators to be trained on how to address their own reactions to graphic images, per the complaint.
Under the defendants’ productivity quotas, content moderators are expected to maintain an accuracy rate of 80 percent when reviewing content reported by TikTok users, the lawsuit says.