Authors File Class Action Lawsuit Over Anthropic’s Allegedly Illegal Use of Copyrighted Books to Train AI Chatbot Claude
Bartz et al. v. Anthropic PBC
Filed: August 19, 2024 ◆ Case No. 3:24-cv-05417
Three authors have filed a class action against Anthropic, claiming the company used pirated copies of their copyrighted books to train its AI chatbot, Claude.
The 20-page copyright lawsuit alleges that Anthropic, a San Francisco-based startup founded by ex-OpenAI employees, has built its multibillion-dollar business through the “brazen infringement” of the plaintiffs’ intellectual property rights. According to the filing, the defendant intentionally downloaded hundreds of thousands of copyrighted books from illegal websites and fed unlicensed copies of the works to Claude, all without permission from or compensation to the rightful copyright owners.
“The end result is a model built on the work of thousands of authors, meant to mimic the syntax, style, and themes of the copyrighted works on which it was trained,” the suit contends. Books are especially valuable training material for large language models (LLMs), as they help AI programs grasp long-term context and generate coherent narratives of their own, the case says.
The complaint claims Anthropic has admitted to training its AI model using the Pile, a dataset that includes a trove of pirated books. A large subsection of the dataset known as Books3 consisted of 196,640 titles downloaded from an infamous “shadow library” site, the case charges.
“Anthropic, in taking authors’ works without compensation, has deprived authors of books [sic] sales and licensing revenues,” the filing argues. “There has long been an established market for the sale of books and e-books, yet Anthropic ignored it and chose to scrape a massive corpus of copyrighted books from the internet, without even paying for an initial copy.”
The Claude copyright infringement lawsuit follows a series of similar cases filed by authors against AI chatbot developers such as Meta, Google and OpenAI. Artists, too, have taken legal action against makers of AI image creation tools like DreamUp, Midjourney and DreamStudio over the allegedly illegal appropriation of their protected works.
The suit looks to represent all persons, estates or literary trusts that are legal or beneficial owners of copyrighted books that are registered with the United States Copyright Office and were or are used by Anthropic in LLM training, research or development, including but not limited to training the defendant’s Claude family of AI models.