Class Action Claims Clarifai ‘Harvested,’ Built Database of OKCupid Users’ Profile Pictures for Facial Recognition Tech [UPDATE]
Last Updated on July 15, 2024
Stein v. Clarifai, Inc.
Filed: February 13, 2020 ◆§ 2020CH1810
A class action alleges Clarifai violated Illinois law by harvesting and storing facial scans of OKCupid users' profile photos.
Case Updates
April 23, 2021 – Lawsuit Dismissed
A judge has dismissed the lawsuit detailed on this page due to lack of personal jurisdiction.
According to U.S. District Judge Sara L. Ellis’s March 16 order, the court does not have jurisdiction over the matter because the plaintiff failed to sufficiently allege that Clarifai directly targeted Illinois.
“Specific jurisdiction exists ‘when the defendant purposefully directs its activities at the forum state and the alleged injury arises out of those activities,’” the judge wrote.
Clarifai contends that, while it has sold products to “a small number of Illinois customers,” it “merely runs an interactive website” available to Illinois residents. In fact, the defendant’s 58 sales to two Illinois customers generated a grand total of just seven cents in revenue, according to the March 16 order.
Judge Ellis further noted that the plaintiff has not rebutted the defendant’s claim that it does not develop or train its facial recognition or artificial intelligence technology in Illinois and “has no knowledge of where the OKCupid users whose information it used reside.”
The plaintiff’s claims were dismissed without prejudice, meaning she could potentially re-file her case.
Clarifai, Inc. finds itself as the defendant in a proposed class action that alleges the artificial intelligence outfit violated Illinois’ biometric privacy law by harvesting unique identifiers from the profile pictures of thousands of OKCupid users to develop and train its facial recognition technology.
At the heart of the complaint is the Illinois Biometric Information Privacy Act (BIPA), the state law enacted in 2008 in order to protect citizens from private entities obtaining, capturing, storing, using and profiting from their unique biometric identifiers—e.g., facial scans—without first fulfilling stringent disclosure requirements. The lawsuit alleges that Clarifai, in direct violation of the BIPA, has secretly accessed OKCupid users’ profile pictures and “continuously and systematically” marketed and sold the facial recognition technology developed from these photos without the authorization of proposed class members.
It first came to light last summer, the suit says, citing a July 2019 New York Times article, that Clarifai had constructed a “massive” database of OKCupid users’ profile pictures obtained through the dating website. The purpose behind Clarifai’s OKCupid photo database, the lawsuit continues, is to develop and train the algorithms used by the company in its facial recognition technology. According to the case, the defendant has created thousands of unique face templates by scanning OKCupid users’ photos and extracting the geometry of each face contained therein. At no point were Illinois OKCupid users notified that their facial geometries, considered a unique biometric identifier under the BIPA, were collected, the suit charges.
Clarifai further violated the Illinois law by failing to outline for OKCupid users the purpose and length of time for which their facial geometries would be collected, stored and used, according to the complaint.
According to the complaint, Clarifai was able to access users’ photos with the help of an investor, a Chicago-based venture capital group launched by OKCupid’s founders. As the suit tells it, Clarifai first contacted OKCupid back in 2014 to ask about collaborating on AI and facial recognition technology, and the company’s “surreptitious harvesting of biometric identifiers” is still ongoing.
The proposed class action looks to cover all Illinois residents who had scans of their face geometry “collected, captured, purchased, received through trade, or otherwise obtained” by Clarifai within the last five years.