Reddit Benefits from Allowing Child Pornography On Website, Class Action Alleges [DISMISSED]
by Erin Shaak
Last Updated on June 23, 2023
Doe v. Reddit, Inc.
Filed: April 22, 2021 · Case No. 8:21-cv-00768
A class action claims Reddit, Inc. has violated a federal law by knowingly profiting from the posting of child pornography on its website.
California Unfair Competition Law (Cal. Bus. & Prof. Code) · Trafficking Victims Protection Reauthorization Act of 2008 · California
June 23, 2023 – Reddit Child Porn Lawsuit Dismissed; Supreme Court Denies Petition to Review Case
The proposed class action detailed on this page was initially dismissed by a federal judge on October 7, 2021 and then dismissed with prejudice on October 28 of that year after the plaintiffs failed to file an amended complaint.
In a 15-page order granting Reddit’s motion to dismiss, U.S. District Judge James V. Selna found that the plaintiff’s claims are barred by Section 230 of the Communications Decency Act, which states that “[n]o provider … of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
In addition, the judge determined that the plaintiffs failed to properly allege that Reddit knowingly assisted, supported or facilitated a sex trafficking venture in violation of federal sex trafficking laws, meaning the exemption to Section 230 immunity for such claims did not apply.
On May 30, 2023, the U.S. Supreme Court denied the plaintiffs’ petition to review the case. The order did not state a reason for the court’s refusal to grant a writ of certiorari.
A proposed class action claims Reddit, Inc. has violated the Trafficking Victims Protection Reauthorization Act (TVPRA) by knowingly profiting from the posting of child pornography on its website.
The 45-page lawsuit alleges that Reddit, rather than address this “horrifying and pervasive trend,” has “taken virtually no action” to stop videos and images of minor victims from being posted on its popular website, which the company has described as “the front page of the internet.”
The complaint, filed in California federal court, says it was only after media reports exposed the existence of child pornography on the defendant’s website in 2011 that Reddit “begrudgingly” instituted a policy banning such content. Nevertheless, the suit alleges, Reddit has failed to enforce the policy and “continues to serve as a safe haven for such conduct.”
The lawsuit charges that the defendant is motivated by greed and continues to “turn[] a blind eye” to portions of its website that the case says are “obviously geared toward child pornography,” given that regulating such content would dampen profit-driving traffic.
“Reddit has chosen to prioritize its profits over the safety and welfare of children across the globe,” the complaint alleges, charging that Reddit’s conduct is “not only upsetting, it is illegal.”
Per the case, the defendant has run afoul of the TVPRA, a law that makes it unlawful to benefit financially from sex trafficking, which includes “any instance where a person under the age of 18 is caused to engage in a commercial sex act.” As the case tells it, “[t]hat is precisely what Reddit has done here—on an incredible scale.”
Reddit, according to the lawsuit, is the fourth most popular website in the U.S. and is made up of communities called “subreddits” where users can post stories, links and media that other users can “upvote” or “downvote.” The case notes that Reddit’s business model “embraces” user-generated uploads and “makes it easy for traffickers to share illegal content” by simply posting images and videos of underage victims.
“The entire process takes less than one minute,” the lawsuit alleges. “A user can post any video or image of any person doing anything without any consequences. The user does not have to demonstrate that he or she owns the copyrights in the content, that those depicted in the content have consented, or that those depicted in the content are of majority age.”
The lawsuit says Reddit has “no robust way” of verifying users’ ages; user age is “entirely self-reported” and is therefore “easily falsified,” the suit says.
While Reddit has access to view the content posted on its site, the defendant, according to the complaint, “famously refuses” to take down content that violates its policies because the company receives advertising revenue for maintaining “controversial yet popular content” on its subreddits. The case relays that some of the site’s most popular subreddits include r/LegalTeens, r/TooCuteForPorn, r/Female18, r/18nsfw, r/YoungNiceGirls, r/youngporn, r/TeenBeauties, and r/teensrdirty.
The plaintiff, who filed the suit pseudonymously, claims Reddit has refused to immediately remove videos depicting her in a sex act when she was 16 despite multiple requests that the company do so. According to the suit, the plaintiff’s then-boyfriend had filmed the two of them having sex without her knowledge or consent and then posted the videos on Reddit accompanied by “crude, disparaging, misogynistic and/or racist remarks.”
Though the plaintiff reported the videos to Reddit moderators, Reddit “would sometimes wait several days” before taking down the content, the lawsuit says. The plaintiff found that if she notified the moderators that there was a copyright issue in addition to the content containing child pornography, she received faster responses, according to the case.
“That is,” the complaint reads, “Reddit’s moderators appeared to care more about and respond more quickly to a message flagging a possible copyright concern, than a message flagging non-consensual child pornography.”
The suit goes on to claim that even after videos of the plaintiff had been flagged, Reddit’s lack of a policy preventing the posting of this type of content allowed her ex-boyfriend to simply post the video again, “often to the exact same subreddit.”
According to the case, it fell to the plaintiff, and not Reddit’s moderators, to search for and flag child pornography on no fewer than 36 subreddits. Even after the plaintiff successfully had her ex-boyfriend’s Reddit account banned, he was able to create a new account from the same IP address and continue posting videos of her, the suit says.
“To be clear, Reddit’s refusal to act has meant that for the past several years Jane Doe has been forced to log on to Reddit and spend hours looking through some of its darkest and most disturbing subreddits so that she can locate the posts of her underage self and then fight with Reddit to have them removed,” the complaint says. “She does this often, and her effort continues to this day. Despite these incredible efforts, without Reddit’s assistance the situation is hopeless.”
The lawsuit looks to represent anyone who has appeared in a video or image when they were under the age of 18 that was uploaded or made available for viewing on any website owned or operated by Reddit within the past 10 years, with a proposed subclass of California residents who fit the same criteria.
A Reddit spokesperson said in a statement to Law360 that child sexual abuse material, or CSAM, “has no place on the Reddit platform” and that its policies go above and beyond the law’s requirements to prohibit suggestive or sexual content involving minors or anyone appearing to be a minor.
“We deploy both automated tools and human intelligence to proactively detect and prevent the dissemination of CSAM material,” the spokesperson stated. “When we find such material, we purge it and permanently ban the user from accessing Reddit. We also take the steps required under law to report the relevant user(s) and preserve any necessary user data.”