Apple Turns a Blind Eye to Child Sexual Abuse Material on iCloud, Class Action Lawsuit Says
Doe v. Apple, Inc.
Filed: August 13, 2024 · Case No. 5:24-cv-05107
A class action alleges Apple has deliberately chosen not to adopt safety measures that would help prevent the storage and distribution of child sexual abuse material on iCloud.
California
A proposed class action lawsuit alleges Apple, Inc. has deliberately chosen not to adopt safety measures that would help prevent the storage and distribution of child sexual abuse material (CSAM) on iCloud.
The 23-page lawsuit claims that Apple, despite its knowledge and admission that it has a “dire CSAM problem,” uses consumer privacy as an excuse for its decision not to employ industry-standard technology that would detect and disrupt the distribution of illicit images stored on iCloud. Instead, Apple’s apparent commitment to data protection offers a “safe haven” for CSAM offenders at the expense of victims of child sexual abuse, the complaint contends.
“Privacy and safety need not be mutually exclusive,” the case argues, noting that Facebook, Google and Snapchat use a CSAM detection tool known as PhotoDNA that is “perfectly compatible” with end-to-end encryption. Apple has not adopted PhotoDNA, and although the company claims in its privacy policy that it screens and scans content uploaded to its products and services for CSAM, the defendant fails to do so in practice, the suit contends.
“Compared to widely-used online storage providers like Google which provide millions of CSAM leads every year to [the National Center for Missing & Exploited Children] and law enforcement, Apple reports just a couple hundred despite having hundreds of millions of iCloud users,” the filing says.
Indeed, text messages from 2020 that were unearthed in an earlier lawsuit show the company’s anti-fraud chief, Eric Friedman, telling a colleague that Apple’s prioritization of privacy over “trust and safety” made it “the greatest platform for distributing child porn,” the suit relays. Friedman went on to say that Apple has “chosen to not know” that such distribution is happening, the complaint adds.
In August 2021, Apple announced its development of NeuralHash, software designed to scan images stored in users’ iCloud accounts for CSAM, the case notes. However, Apple soon abandoned the project, citing concerns about governmental mass surveillance, the complaint says.
“Apple’s incredulous rhetoric hedges one notion of privacy against another: Apple creates the false narrative that by detecting and deterring CSAM, it would run the risk of creating a surveillance regime that violates other users’ privacy,” the Apple CSAM lawsuit says. “By framing privacy and safety as a zero-sum game, Apple made choices to overlook safety, even though other tech companies have deployed technology to disrupt the distribution of CSAM.”
The lawsuit looks to represent anyone in the United States who has been a victim of child sexual abuse resulting from the transmission of CSAM on Apple’s iCloud service within the past three years.