Class Action Alleges Meta, OnlyFans ‘Blacklisted’ Certain Adult Performers from Social Media Advertising [DISMISSED]
Last Updated on October 17, 2024
Dangaard et al. v. Instagram, LLC et al.
Filed: February 22, 2022 ◆ Case No. 3:22-cv-01101
A class action alleges Meta Platforms has colluded with OnlyFans to “blacklist” adult entertainment performers who work with the subscription site’s competitors from social media advertising.
Instagram, LLC; Meta Platforms, Inc.; Fenix Internet LLC; Fenix International Inc.; Facebook Operations, LLC
California
October 15, 2024 – Judge Tosses Adult Performers’ Meta, OnlyFans “Blacklist” Lawsuit
On September 23, 2024, a federal judge ruled in favor of Meta, Instagram and the companies behind OnlyFans in the proposed class action lawsuit detailed on this page.
Get the latest open class action lawsuits sent to your inbox. Sign up for ClassAction.org’s free weekly newsletter.
In a 16-page order issued that day, United States District Judge William Alsup granted Meta’s March 2024 motion for summary judgment on the grounds that the plaintiffs were unable to produce sufficient evidence to substantiate their claims.
Judge Alsup wrote that although the court had provided the plaintiffs “ample opportunity” to present proof that Meta had falsely classified them as terrorists and blacklisted them, the individuals had “failed to find any proof.”
In addition, the judge noted that the plaintiffs presented no evidence that Meta had any knowledge of the adult performers’ contracts or business relationships with competing sites. Likewise, the women provided nothing more than “speculation” regarding potential customers they might have gained absent Meta’s allegedly anticompetitive conduct, the judge wrote.
Lastly, Judge Alsup wrote that “[a]lthough [one plaintiff] vaguely asserts a connection between the alleged actions of Meta and an alleged economic harm, neither she nor have the other plaintiffs provided actual evidence of damages.”
The judge said he had “no choice” but to grant Meta’s motion for summary judgment in spite of the company’s “questionable recordkeeping” of its Dangerous Organizations and Individuals list.
A proposed class action alleges Meta Platforms has colluded with the operator of OnlyFans to effectively “blacklist” adult entertainment performers who work with the subscription site’s competitors from Facebook and Instagram advertising.
The 39-page lawsuit says that Meta and OnlyFans’ blacklisting “scheme” has harmed thousands of small entrepreneurs who rely on social media to promote themselves and earn a living, while adult entertainment performers associated exclusively with OnlyFans have seen no such harm.
At the same time, Meta has helped OnlyFans solidify its status as the dominant online subscription platform in the adult entertainment industry by way of deleting the accounts and/or blocking the visibility of performers affiliated with competing adult entertainment platforms, the complaint alleges.
“The scheme was intended to destroy the [adult entertainment] Platforms’ businesses, and either destroy the [adult entertainment] Providers or force them to work exclusively through OnlyFans,” the case alleges.
According to the complaint, professional adult entertainment performers such as the plaintiffs rely on social media such as Instagram and Twitter to guide customers to their pages on adult entertainment platforms. Social media is so important to the subscription-based adult entertainment industry that, without it, the lawsuit says, “the business model for the industry is dead.”
The lawsuit alleges that the “collusion” between Meta, Facebook, Instagram and OnlyFans operator Fenix International Ltd. has “methodically damaged or destroyed the businesses” of competing adult entertainment platforms and at the same time unfairly harmed the livelihoods of performers who use those platforms.
The suit says that although the online adult entertainment industry was a “vibrant, competitive market” as recently as late 2018 or early 2019, it was around that time that professional adult entertainers experienced “a drop-off in traffic and user engagement on social media platforms.” As the case tells it, the “deletion and hiding of posts” and subsequent drop in social media traffic for certain performers occurred suddenly, and was so “substantial and dramatic” that “only automated processes could be responsible.”
As a result, OnlyFans “began to grow incrementally, and then exponentially,” while its competition “stagnated or saw dramatically reduced traffic and revenue,” the lawsuit claims.
The suit alleges that the “blacklisting process” — i.e., when certain accounts are identified to social media platforms in a way that encourages their deletion or a reduction in their visibility — was accomplished first internally at Instagram and Facebook by way of automated classifiers or filters, the results of which were then submitted to a shared industry database of “hashes,” or unique digital fingerprints. According to the complaint, this database was and is intended to flag and remove content made by terrorists and related “dangerous individuals and organizations (DIO).”
“However, the [adult entertainment] Performers blacklisted, and the [adult entertainment] performers injured by the blacklisting, were not terrorists and had nothing to do with terrorism of any kind,” the suit stresses.
The proposed class action looks to represent:
“All Adult Entertainment Providers, regardless of the label they use, such as performer, influencer or artist, who suffered economic injury because they either (i) used their Instagram or Facebook account to link to, promote, or demonstrate praise, substantive support, or representation of any competitor of OnlyFans at a time when those businesses were falsely designated as a Dangerous Individual or Organization (‘DIO’) under any past or present version of Meta’s DIO policy, or that of Facebook or Instagram or any of their predecessor or subsidiary entities or technologies, or (ii) were themselves falsely designated as a DIO; the class includes anyone who suffered damages from any shift in the scheme beyond the initial DIO tactic, such as suffering continuing effects through other computerized systems.”