Class Action: TikTok, YouTube Reporting Features Ineffective in Shielding Minors from Harmful Content like ‘Choking Challenge’
Bogard et al. v. TikTok Inc. et al.
Filed: February 1, 2023 | Case No. 3:23-cv-00012
A proposed class action claims TikTok and YouTube do not have in place adequate reporting processes for flagging and removing harmful content, thereby exposing children to dangerous, harassing and sexually explicit material.
The 55-page lawsuit summarizes that although TikTok and YouTube purport to offer robust, meaningful tools for reporting harmful content, these procedures are, in fact, “automated, superficial, and inadequate” in protecting the platforms’ users, who effectively “waste their time and resources” voluntarily attempting to flag harmful content. According to the suit, TikTok and YouTube’s reporting features yield “arbitrary and contradictory decisions” and “canned responses” with no chance to appeal or dispute, and often “retraumatize[]” those who proactively sought to report dangerous content.
The complaint was filed on February 1 in Indiana federal court by three parents, two of whom have lost teenage children to the “choking challenge,” a fatal trend they mimicked after seeing it on YouTube, and one who says her child was the subject of a harassing TikTok video that went viral among his peers. The complaint was also filed by the Becca Schmill Foundation, a non-profit formed in memory of Rebecca Mann Schmill, who was 18 years old when she passed away in September 2020 from fentanyl poisoning after taking drugs she allegedly purchased over social media.
As the case tells it, the plaintiffs are among those who have channeled their grief into action by searching for and voluntarily reporting the “immense” number of harmful videos found on social media. The filing claims, however, that the plaintiffs’ reporting efforts have been “unheeded, ignored, and arbitrarily dismissed” by YouTube and TikTok, as well as parent companies Alphabet, XXVI Holdings, Google and ByteDance Inc.
“Plaintiffs soon found out that their reporting efforts were met by an arbitrary system of automated messages that contradict the Defendants’ platforms’ own policies,” the suit states. “In canned automated responses, Defendants informed Plaintiffs that the reported content does not violate the community guidelines of the respective platform and would thus remain on the platform.”
Alarmingly, a study commissioned by the Becca Schmill Foundation found that in fewer than five percent of cases, TikTok and YouTube’s reporting features effectively addressed the reporting user’s concerns, the filing states.
For instance, the complaint says that for the past several years, one plaintiff has been reporting numerous choking challenge videos to YouTube in fear that they will encourage other children to attempt the trend, which has reportedly already contributed to 1,227 deaths and 144 injuries worldwide as of January 26 of this year. Each time the plaintiff has flagged a choking challenge video for featuring “Harmful and Dangerous Acts,” which YouTube claims are against community guidelines, she’s received an automated message promising that the platform will remove the material, “[i]f we find this content to be in violation of our Community Guidelines,” the case relays.
These assurances have proven to be empty, the lawsuit claims, as the reported videos remain watchable on YouTube, including one challenge video that has accrued 16,000 views over the past 11 years.
One of the plaintiffs’ sons has received multiple responses from TikTok that the sexually explicit content he has reported, some of which depicts child pornography, “[d]oes not violate community guidelines,” even though the platform claims to prohibit “Adult Nudity and Sexual Activities,” the filing says.
Although TikTok’s 2021 “#safertogether” safety campaign claimed that “every report that is sent to TikTok is checked,” and although YouTube’s safety policies promise that the platform applies its community guidelines “to everyone equally,” the case argues that the systems the platforms use to review reported content, both of which involve a combination of human moderators and artificial intelligence software, are “rife with inconsistent enforcement.”
The lawsuit says that when the plaintiff whose son was targeted by a harassing TikTok video attempted to report it, an automated response from the platform said the video did not violate community guidelines. However, the suit relays, when a third-party advocate helped get the video reviewed by a TikTok executive, the representative confirmed that the video did, in fact, violate community guidelines.
The complaint notes that, unlike the plaintiff who had outside help, TikTok or YouTube users whose reports are dismissed have no way to appeal or seek clarification as to the reasons for these decisions.
“Upon repeatedly receiving these responses where Defendants’ platforms upheld the harmful contents that Plaintiffs reported, Plaintiffs felt helpless, retraumatized, invalidated, frustrated, insulted, and deceived,” the case reads. “They also suffered anxiety over the possibility that the harmful contents would continue to harm minor users in the way that it had harmed their children.”
The lawsuit looks to represent anyone who used TikTok or YouTube and made safety reports through the process set forth by the defendants from January 2020 until the case is resolved.