Class Action: TikTok Failed to Mitigate Psychological Harm to Content Moderators from Exposure to Graphic Content [UPDATE]
by Erin Shaak
Last Updated on November 17, 2023
Young et al. v. ByteDance Inc. et al.
Filed: March 24, 2022 ◆ Case No. 3:22-cv-01883
A class action alleges ByteDance and TikTok failed to protect content moderators from psychological trauma stemming from persistent exposure to graphic content.
Claims: California Labor Code, California Business and Professions Code, California Unfair Competition Law
State: California
November 17, 2023 – Certain Claims in TikTok Content Moderators Class Action Sent to Arbitration
A federal judge has ordered one plaintiff in the proposed class action detailed on this page to take her claims before an arbitrator, upholding the arbitration agreement the woman signed with her employer, Telus International, which employed her to moderate content for TikTok as an independent contractor.
Want to stay in the loop on class actions that matter to you? Sign up for ClassAction.org’s free weekly newsletter here.
In an order issued on October 26, 2023, United States District Judge Vince Chhabria granted the defendants’ June 2023 motion to send to arbitration the claims of one of the two plaintiffs named in an amended complaint filed in November of last year.
According to the 10-page order, the judge agreed with TikTok’s argument that “it would be unfair for [the plaintiff] to avoid the consequences of an arbitration agreement covering the conditions of her employment by suing a third party, and not her employer.”
The judge deferred to Nevada law, which “prevents [the plaintiff] from avoiding arbitration in this case,” because the woman’s complaint “‘raises allegations of substantially interdependent and concerted misconduct’ by both the defendant, who did not sign the agreement, and a contracting party who did.”
In addition, Judge Chhabria disagreed with the plaintiff’s argument that the arbitration agreement is “procedurally and substantively unconscionable” because it was offered to her on a “take-it-or-leave-it basis.” The judge ruled that the arbitration provisions were binding and enforceable.
Per the judge’s ruling, the plaintiff’s claims were dismissed without prejudice. The claims of the other named plaintiff are still being litigated.
Get class action lawsuit news sent to your inbox – sign up for ClassAction.org’s free weekly newsletter here.
A proposed class action lawsuit alleges ByteDance Inc. and TikTok Inc. have failed to protect content moderators from debilitating psychological trauma and trauma-related disorders stemming from persistent exposure to graphic content posted on the short-form video app each day.
The 30-page lawsuit says that although ByteDance and TikTok are well aware of the negative psychological effects linked to exposure to the graphic and objectionable content flagged for review on the app—including images of abuse, rape, torture, bestiality, beheadings, suicide and murder—the companies have nevertheless failed to provide proper mental health support to prepare moderators for the “horrific” content they’ll see and mitigate the harm from reviewing such images.
“By requiring content moderators to review high volumes of graphic and objectionable content, Defendants require content moderators to engage in abnormally dangerous activities,” the complaint contends. “By failing to implement acknowledged best practices to mitigate risks necessarily caused by such work, TikTok violates California law.”
Moreover, the lawsuit claims that by requiring content moderators to sign non-disclosure agreements that “forc[e] them to keep inside the horrific things they see while reviewing content for Defendants,” the companies have “exacerbate[d] the harm” to workers who, in addition to graphic videos, are regularly exposed to misinformation, hate speech and conspiracy theories.
The lawsuit explains that TikTok hires content moderators to remove videos that violate its terms of use. Per the case, TikTok users report inappropriate content that’s posted on the social media app and the moderators then review the content, sometimes thousands of videos and images per day, and remove the material that violates the app’s policies.
According to the suit, content moderators are expected to review massive volumes of content each day, typically spending no more than 25 seconds on each video and viewing multiple videos at a time to meet the defendants’ “oppressive quotas.” The lawsuit says that in the second quarter of 2021 alone, more than 8.1 million videos were removed from TikTok, the vast majority of which were taken down by human content moderators.
The lawsuit goes on to state that it is “well known” that exposure to graphic images and videos such as the content reviewed by TikTok’s moderators can cause “debilitating injuries,” including post-traumatic stress disorder (PTSD), anxiety and depression. Psychological trauma, the suit adds, can also cause a range of physical symptoms, including extreme fatigue, dissociation, difficulty sleeping, excessive weight gain, anxiety, and nausea and other digestive issues. The case stresses that it is critical for those who develop PTSD or other mental health conditions following exposure to trauma to receive preventative care and treatment.
The lawsuit alleges that although TikTok is well aware of the psychological damage that could be caused by exposure to the content flagged for review on the app, it has failed to meet industry standards for mitigating the harm to content moderators. Per the suit, the defendants are members of the Technology Coalition, which has published a guidebook on how to help employees handle child sexual abuse images. Though the guidebook specifies how to develop a “robust, formal ‘resilience’ program” to support the well-being of content moderators, the defendants have failed to implement any of those standards, the case alleges.
Indeed, though many other companies have taken “notable measures” to minimize harm to their own content moderators, such as using filtering technology to distort images and providing mandatory psychological counseling, ByteDance and TikTok have instead implemented productivity standards and quotas that are “irreconcilable with applicable standards of care,” the lawsuit claims.
“Defendants push content moderators to watch as many videos as possible despite the potential harms to moderators’ psyches,” the complaint reads. “Defendants are aware, or should have been aware, that their harsh requirements create an increased risk that content moderators will develop PTSD and related disorders. Despite this awareness Defendants failed to provide adequate services to content moderators, including Plaintiffs, to cope with the unbearable health concerns that are a result of Defendants’ policies.”
Per the suit, TikTok content moderators have experienced “immense stress and psychological harm” as a result of the “constant and unmitigated exposure to highly toxic and extremely disturbing images” in the workplace.
The case looks to represent anyone in the U.S. who performed content moderation work for or in relation to ByteDance’s TikTok application.