Porn Site Operators Hit with Class Action Over Alleged Child Pornography Content
by Erin Shaak
Doe #1 et al. v. MG Freesites, Ltd. et al.
Filed: February 11, 2021 ◆ Case No. 7:21-cv-00220
A lawsuit alleges the operators of a number of pornography websites have unlawfully monetized and profited from images and videos of child sexual abuse.
A proposed class action alleges the operators of a number of popular pornography websites have monetized and profited from the creation, organization and dissemination of images and videos of child sexual abuse in violation of federal law.
The 37-page lawsuit was filed in Alabama by two pseudonymous plaintiffs who say they were victims of a childhood sex trafficking venture in which the defendants participated and from which they benefited financially.
According to the case, defendant MindGeek S.A.R.L. and its network of MindGeek subsidiaries—which operate over 100 pornographic websites, production companies and brands, including PornHub.com, YouPorn.com, RedTube.com and Tube8.com—have made available and monetized videos depicting the plaintiffs, who were under 18 years old at the time of filming, in commercial sex acts and child pornography.
Alleging violations of the Trafficking Victims Protection Reauthorization Act, the lawsuit claims the defendants “worked together” with sex traffickers to earn a profit from commercial sex acts and child pornography involving the plaintiffs and proposed class members.
“Defendants have victimized and exploited this child sex abuse material for profit,” the complaint charges. “Defendants created, organized, and disseminated images and videos on their websites that depict child sexual abuse, often referred to as child pornography. Each of these images and videos are crime scenes Defendants monetized.”
MindGeek boasts that more than 115 million users visit its pornography websites every day, with 42 billion visits to PornHub alone in 2019, the lawsuit begins. As part of the defendants’ network, the suit says, MindGeek company ModelHub allows “amateur” pornographers to upload content to PornHub and generate revenue based on the number of views the material receives, while MindGeek retains 35 percent of the total revenue. According to the case, the only requirement for the creation of a ModelHub account is that a user be over 18 years of age, which the company purportedly verifies through the upload of picture identification.
The suit claims, however, that if a video posted by the user includes other individuals, MindGeek has “no effective process” to verify their ages. Per the case, “moderators” hired by MindGeek “eyeball” the performers “to see if they look young” but may be less likely, or less inclined, to flag videos if the performer is 15, 16 or 17 years of age “due to the ineffective system PornHub implemented.”
“There is no guarantee or check on whether or how any additional persons in the video are at least 18 years of age, and whether they consented or understood that the video would be uploaded for the profit of the ModelHub member,” the complaint states.
Per the suit, the ModelHub program and other paid, subscription and premium content offerings have allowed MindGeek to profit from videos and images of commercial sex acts, including the rape of each of the plaintiffs.
The lawsuit goes on to allege that MindGeek uses data mining and title and tag generation to tailor content to particular users and provide “easy access” to child pornography, sex trafficking, and “any other form of child sexual abuse material.” One tag MindGeek uses to classify videos is “Teen,” which in 2018 was the seventh-most searched term on PornHub, according to the case. MindGeek’s chief operating officer reportedly told members of the Canada House of Commons’ Standing Committee on Access to Information, Privacy and Ethics that the term “teen” in the “adult world” means “18 to 25, 18 to 27.”
The case contends that MindGeek’s tags and suggested search terms “make it easier for pedophiles to find the exact content they want” and allow “traffickers, rapists, or would-be criminals to go undetected” while still collecting the associated compensation.
Although child sexual abuse material has been reported to MindGeek, victims have stated that the content was left on the defendants’ websites for weeks or months after they requested its removal and, in many cases, was reuploaded again and again, the suit says. According to the case, MindGeek “completely fails to control the torrent of videos” on its websites depicting child sexual abuse, rape, and incapacitated or unwilling participants.
Despite the vast number of videos uploaded to the defendants’ sites, MindGeek’s “moderation team” consists of only 10 people with no prior training or experience in identifying underage individuals, the lawsuit avers. According to the suit, given the average length of each upload, moderators are effectively assigned roughly 1,100 minutes of video to review every hour, making it nearly impossible to ensure every child sexual abuse video is flagged and removed.
“This is an impossible task, and MindGeek knows that,” the complaint states.
Further, the case avers that moderators are incentivized not to remove child pornography given that their yearly bonuses are based on the number of videos approved.
“This results in individuals fast-forwarding to the end of videos (or not reviewing them at all) and approving them, even if they depict sexual trafficking of children,” the lawsuit says.
Per the case, a December 2020 New York Times article sparked an investigation by the Canadian government into the Montreal-based MindGeek subsidiary that operates PornHub. While the company said it would increase moderation efforts, remove users’ ability to download videos and implement a better verification process, the lawsuit alleges the plaintiffs and proposed class members remain “at risk of irreparable harm” absent relief from the court.
The suit looks to require the defendants to identify and remove child pornography and implement corporate-wide policies and practices to prevent its continued dissemination.
“Because of the insidious nature of child sex trafficking and child pornography, all of this relief is necessary to protect the present and future interests of Plaintiffs and Class members,” the complaint states.
The lawsuit looks to represent the following proposed class:
“All persons, who were under eighteen years of age at the time they were depicted in any video or image, (1) in any commercial sex act as defined under 18 U.S.C. §§ 1591 and 1595, or (2) in any child pornography as defined under 18 U.S.C. § 2252A, that has been made available for viewing on any website owned or operated by the Defendants.”