Meta Failed to Disclose Addicting, Damaging Effects of Facebook, Instagram on Children, Lawsuit Says
by Erin Shaak
Last Updated on November 18, 2022
Embattled social media giant Meta Platforms has been hit with a proposed class action from a mother who claims her 12-year-old daughter and other pre-adolescents have been targeted and harmed by the company’s Facebook and Instagram platforms for the sake of profit.
The 47-page lawsuit, initially filed in state court and removed to the U.S. District Court for the Northern District of California on October 27, says Meta has claimed “a staggering number of times” that its social media platforms are safe for children and teenagers and were not designed to be addictive. The suit contends, however, that these representations “could not be further from the truth.”
According to the case, a series of “bombshell revelations” from whistleblowers and government investigations have recently revealed that Meta intentionally designed Facebook (and later Instagram) to play off children’s unique vulnerabilities and ensnare them in an “unending stream of content” to maximize engagement—the bread and butter of Meta’s business model. Meta knew all along that the use of its social media platforms would be addictive and devastating to children’s mental health, yet saw an opportunity to bring in “vast streams of money” through advertising, the lawsuit contends.
Citing a 2017 interview with Sean Parker, Meta’s first president, the lawsuit claims the social media giant has essentially “exploit[ed] a vulnerability in human psychology” in order to make a profit.
“God only knows what it’s doing to our children’s brains,” Parker said. “…The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway.”
The suit argues that although Meta is well aware that vulnerable pre-adolescents use Facebook, it has put no meaningful identity verification methods in place even though federal law prohibits websites from gathering data from users who are under 13 unless specific requirements are met.
The case goes on to claim that Meta has unlawfully designed its products in a way that discriminates against users based on their gender. The suit says the algorithmic tailoring used on both Facebook and Instagram is so heavily influenced by a user’s gender, whether they’ve disclosed that information or not, that the newsfeeds they are shown “are always determined based on [their] gender.” In this way, Meta has reinforced gender stereotypes around appearance and body image to its users’ detriment, the case alleges.
Meta’s conduct, the lawsuit summarizes, has “made it billions” at the expense of its minor users and in violation of the California Unruh Civil Rights Act and various state consumer protection statutes.
Meta “harnessed” children’s vulnerabilities, lawsuit says
The lawsuit explains that because adolescents’ brains are still developing into maturity, they are uniquely vulnerable to social stimuli that can “induce destructive behavior.”
Per the suit, laws and social norms have developed to protect this age group by, for example, restricting their ability to buy alcohol and cigarettes, access movies and video games with violent or sexual content, or engage in gambling and other activities that demand self-control in response to positive stimuli.
The suit claims, however, that in the face of what “the rest of society” has viewed as a danger to children, “Meta saw an opportunity.”
“Meta realized that the features of the pre-adolescent brain that make them vulnerable could be harnessed to make pre-adolescents use Facebook and Instagram more and more—increasing the products’ profitability. And so Meta built features into its Facebook and Instagram products that were designed to provide rewards to users in intermittent and variable ways—similar to the function of a slot machine.”
According to the suit, Meta began targeting this pre-adolescent market as early as 2006, when it eliminated the requirement that users provide a .edu email address and allowed children as young as 13 to use Facebook, which before had been restricted to college students. Even children younger than 13 could easily obtain access to the platform by lying about their age or whether they had parental permission, the case notes.
The lawsuit claims Meta was fully aware that this vulnerable group of pre-adolescents would become hooked on Facebook and Instagram yet took no measures to protect them, even after it became apparent that the platforms were designed in a way that was particularly destructive to minors’ mental health.
“Profit over safety”
Former Facebook data scientist and whistleblower Frances Haugen has stated in interviews that she observed “over and over again” that Facebook consistently chose “profit over safety” and prioritized its own interests over that of the public.
Per the complaint, the introduction of the “Like” button to Facebook in 2009 after “extensive testing” transformed the platform into a competition for social validation that became particularly damaging to pre-adolescents, who have been found to experience emotional distress and depressive symptoms when they don’t receive enough validation on social media.
The lawsuit says Meta’s own internal research “confirmed exactly this” and identified validation in the form of “likes” as something that harms Instagram users’ mental health.
Per the case, Meta was nevertheless “consciously driven by a desire to induce addiction or compulsive use in Facebook users” and continued to develop features that would maximize engagement even at users’ expense.
One such feature, according to the suit, was the unlimited news feed on both Facebook and Instagram that uses algorithms to show content “likely to generate strong reactions.” Per the lawsuit, three U.S. senators tested Instagram’s newsfeed in 2021 by opening accounts for fictional teenage girls and found that the content “went dark fast.” One of the accounts “was flooded with content about diets, plastic surgery and other damaging material for an adolescent girl,” the case says, quoting an NPR article.
According to the lawsuit, Meta’s use of gender as a data point in its news feed algorithms has “perniciously reinforce[d] gender stereotypes” around appearance and body image, with especially damaging effects for young girls.
“These revelations show that just as gender discrimination in the physical world produces devastating consequences, discrimination in the digital world can produce the same (or even worse) harms,” the complaint reads.
The lawsuit alleges that even though Meta has repeatedly touted its algorithms as safety tools meant to protect users, there exists “extensive evidence that this is simply not the case.”
The plaintiff’s experience
The plaintiff in the case is a pseudonymous Washington resident who filed the lawsuit on behalf of her 12-year-old daughter.
The case relays that the plaintiff’s daughter began using Meta’s products around the age of 10 without her parents’ knowledge or consent. Per the case, the plaintiff’s daughter continues to access Meta’s products despite her mother’s efforts to stop her and sometimes “sneaks to the family computer in the late night” to do so, or accesses the platforms while visiting friends.
The plaintiff says she cannot effectively keep her daughter off the social media platforms “without imposing draconian restrictions on her daughter.”
Per the case, the plaintiff’s daughter has already experienced “serious harm” from using Meta’s products and has developed “troubling body image issues.”
“For example, J.P. has said that when she grows up, she want [sic] to become a plastic surgeon, so she can correct the ways J.P. believes her body is flawed,” the lawsuit says.
The plaintiff says that in the “many conversations” she’s had with her daughter about these body image issues, Meta’s products and the stereotypes reinforced in their newsfeeds “have been a consistent theme.”
Who does the lawsuit look to cover?
The case looks to represent anyone under the age of 13 who has a Facebook or Instagram account and used the account for at least 25 hours.
My child has used Meta’s platforms. How do we join the lawsuit?
There’s usually nothing you need to do to join or be considered part of a class action lawsuit when it’s first filed. If the case moves forward and settles, that’s when those covered would receive notice of the settlement and have an opportunity to file a claim for their share.
If you want to stay in the loop, you can check back to this page for updates or get class action lawsuit news sent straight to your inbox by signing up for ClassAction.org’s free weekly newsletter here.