Class Action Says Court Must Intervene to Protect YouTube Content Moderators from Psychological Trauma [UPDATE]
Last Updated on December 5, 2022
Doe v. YouTube, Inc.
Filed: October 24, 2020 ◆ Case No. 4:20-cv-07493
A class action alleges YouTube has negligently failed to protect content moderators from the psychological trauma associated with repeatedly viewing graphic and objectionable images.
November 29, 2022 – YouTube Content Moderator Trauma Settlement Website Is Live
The official website for the $4.3 million YouTube content moderator class action settlement is live and can be found here:
www.contentmoderatorytsettlement.com
The website states that current and former YouTube content moderator employees and subcontractors who received notice of the settlement, i.e., the covered “class members,” do not need to do anything to receive money from the deal. Payments will automatically be sent to class members who do not request to be excluded from the settlement.
To contact the settlement administrator, head to this page. Answers to frequently asked questions about the deal can be found here.
A final approval hearing is scheduled for February 21, 2023.
Don’t miss out on settlement news like this. Sign up for ClassAction.org’s free weekly newsletter here.
August 2, 2022 – YouTube Agrees to $4.3 Million Settlement in Content Moderator Trauma Class Action
YouTube has agreed to pay nearly $4.3 million to settle the proposed class action detailed on this page.
A 23-page motion for preliminary settlement approval submitted on July 12, 2022 states that the deal, which now heads to a judge for review, was reached through extensive arm’s-length negotiations and with the help of experts in the diagnosis and treatment of trauma-related injuries.
If approved, the settlement would cover an estimated 1,300 individuals who performed content moderation work for contractors of YouTube in the United States at any time between January 1, 2016 and the date the deal receives preliminary approval. Court documents state that each “class member” is estimated to receive $2,079 from the settlement.
As part of the settlement, YouTube has also agreed to provide content moderators with on-site access to counseling from a licensed and experienced clinician, consisting of individual biweekly sessions of at least 45 minutes, with additional counseling available on an as-needed basis. YouTube has further agreed to provide access to telephone counseling, critical incident response and monthly peer support groups, and to prohibit adverse employment decisions based on a content moderator’s use of these services.
A preliminary approval hearing is scheduled for August 16, 2022.
A Jane Doe plaintiff has filed a proposed class action against YouTube over what she alleges is the Google-owned streaming giant’s failure to implement workplace safety standards to protect content moderators from psychological trauma resulting from frequent exposure to graphic and objectionable videos.
As a result of content moderators’ unmitigated exposure to “highly toxic and extremely disturbing images” through YouTube’s proprietary “single review tool,” the workers have developed and suffered from “significant psychological trauma,” including anxiety, depression and symptoms associated with post-traumatic stress disorder (PTSD), the 46-page lawsuit, filed September 21, 2020 in San Mateo County Superior Court, alleges.
Echoing a suit filed against Facebook in March 2020, the plaintiff argues that while YouTube has aimed to bolster its public image by drafting workplace safety standards meant to mitigate the psychological harm of viewing graphic material day in and day out, the company has nevertheless failed to put those mental health protections into practice.
“Instead, the multibillion-dollar corporation affirmatively requires its Content Moderators to work under conditions it knows cause and exacerbate psychological trauma,” the lawsuit says, alleging that such conditions violate California law.
The case emphasizes that while the aforementioned safety standards might not altogether eliminate the risk that content moderators will develop job-related psychological disorders, the standards could at least reduce that risk and mitigate any harm.
“By requiring its Content Moderators to review graphic and objectionable content, YouTube requires Content Moderators to engage in an abnormally dangerous activity,” the lawsuit says. “And by failing to implement the workplace safety standards it helped develop, YouTube violates California law.”
Still further, the suit alleges that YouTube’s practice of requiring content moderators to sign non-disclosure agreements “exacerbates” the harm experienced by the workers.
According to the lawsuit, YouTube content moderators are tasked with keeping disturbing content off of the platform, including millions of uploads described in the complaint as “graphic and objectionable content such as child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.” In doing so, the suit says, the workers are exposed every day not only to videos of extreme and graphic violence and assault but also, repeatedly, to “conspiracy theories, fringe beliefs, and political disinformation,” including content related to Holocaust denial, COVID-19 hoaxes and doctored videos of elected officials.
“This type of content has destabilized society and often features objectionable content,” the suit says.
Per the complaint, YouTube content moderators, of whom there are thousands across the country, are required to view “hundreds of thousands if not millions” of potentially rule-breaking videos uploaded to the platform each week. YouTube relies on users to “flag” inappropriate content before it comes before the eyes of a moderator, the suit relays.
The plaintiff goes on to allege that YouTube, during the hiring process, fails to properly inform prospective content moderators about the nature of the work or the effect that reviewing graphic content could have on their mental health. Though prospective moderators are told they might have to review graphic content, they are neither shown examples of such content nor told that they would be required to view it daily, the suit claims. Prospective hires are also never asked about their experience with viewing graphic content or warned of its potential mental health consequences, the lawsuit says, noting that training for content moderators begins only after they sign a non-disclosure agreement.
As the lawsuit tells it, the training process falls short of adequately preparing YouTube content moderators for the hazards of the position and for the job itself. From the complaint:
“During the training process, YouTube failed to train Content Moderators on how to assess their own reaction to the images, and YouTube failed to ease Content Moderators into review of graphic content through controlled exposure with a seasoned team member followed by counseling sessions.
Instead, Content Moderators are provided a two-week training where an instructor presents PowerPoints created by YouTube. The PowerPoints covered various categories of content, including graphic violence, child abuse, dangerous organizations, solicitation, porn, animal abuse, regulated products, fraud, and spam. Each category was covered by 60–80 slides. For each category, the PowerPoint began with a brief description of the applicable Community Guidelines, and then dozens of examples of content, applying the Community Guidelines.”
During training, “little to no time” is spent on wellness and resiliency, the case says. According to the suit, many YouTube content moderators, who are subject to strict quantity and accuracy quotas while on the job, remain in the position for less than a year due to “low wages, short-term contracts, and the trauma associated with the work.”
The plaintiff asks the court for, among other damages, medical monitoring for YouTube content moderators that provides specialized screening, assessment and treatment not generally given to the public at large.
Get class action lawsuit news sent to your inbox – sign up for ClassAction.org’s newsletter here.