Google, Microsoft, Amazon, FaceFirst Hit with Biometric Privacy Class Actions Centered on IBM’s ‘Diversity in Faces’ Dataset
Last Updated on July 27, 2020
Vance et al. v. Microsoft Corporation
Filed: July 14, 2020 | Case No. 2:20-cv-01082 | Washington
A class action claims Microsoft obtained Illinois residents' facial scans in violation of the state's Biometric Information Privacy Act.
Google LLC, Microsoft Corporation, Amazon.com and FaceFirst, Inc. were hit this week with substantially similar proposed class actions alleging the tech heavyweights have violated a novel Illinois privacy law amid an “arms race” to improve error rates for their respective facial recognition technologies.
In the push to improve the accuracy of their facial recognition products, with a particular focus on errors in identifying women and people of color, Google, Microsoft, Amazon and FaceFirst have run afoul of the Illinois Biometric Information Privacy Act (BIPA), which sets stringent rules for a company’s collection, storage, use, possession and sale of unique biometric identifiers such as fingerprints and facial scans, the lawsuits allege.
More specifically, the complaints, filed in Washington and California federal courts, claim the companies utilized IBM’s Diversity in Faces Dataset without complying with Illinois BIPA regulations.
Unlike numerical identifiers, such as a Social Security number, an individual's biometric identifiers cannot be changed and therefore pose a greater risk for identity theft or fraud, the suits relay. The danger posed by facial recognition scanning is that once a picture is taken of a person's face and its biometric measurements are captured, that information can be used to identify the person any time they appear on the internet, the cases explain. Broadly, facial biometrics, to a greater extent than fingerprints, present "a grave and immediate danger to privacy, individual autonomy and liberty," the lawsuits stress.
Under the Illinois BIPA, a private entity is prohibited from collecting or otherwise obtaining an individual’s biometric identifiers without providing written notice and obtaining a written release to do so, the complaints say. Moreover, the law prohibits a company from profiting off someone’s biometric identifiers, and requires the entity to develop a publicly available written policy establishing a retention schedule and guidelines for the permanent destruction of the sensitive data, according to the suits.
Further, the Illinois BIPA allows for a citizen to recover $1,000 or more in liquidated damages for negligent violations of the law and $5,000 or more for intentional or reckless privacy abuses, the cases state.
With regard to the defendants, the complaints say the companies’ facial recognition products marketed to Illinois retailers and businesses utilize facial scans contained within IBM’s Diversity in Faces Dataset. Per the suits, the defendants applied for and obtained the Diversity in Faces Dataset “in order to improve the fairness and accuracy of [their] facial recognition products and technologies,” including Microsoft’s Cognitive Services Face Application Program Interface and Face Artificial Intelligence service, Amazon’s Rekognition, FaceFirst’s self-proclaimed “world’s fastest and most accurate enterprise face recognition platform,” and the facial recognition software described as a “fundamental cornerstone” of many of Google’s largest consumer products and services.
“The Diversity in Faces Dataset contained the biometric identifiers and information of Plaintiffs and Class Members,” the suits read, claiming IBM “did not seek nor receive permission” from Illinois residents to include their images, let alone scan their faces or otherwise collect, obtain, store, use or profit from the data.
According to the lawsuits, the facial data provided to Google, Microsoft, Amazon and FaceFirst via the Diversity in Faces Dataset was extracted by IBM from images contained within a massive 100 million-photo dataset compiled in 2014 by Flickr through its then-parent company Yahoo!. Flickr did so "without informing or receiving the consent of the individuals who uploaded these photographs" or of those who appeared in the pictures, the cases say.
The plaintiffs, who separately sued IBM earlier this year over its Diversity in Faces program, say they uploaded numerous photos to Flickr that were included in the dataset obtained by Google, Microsoft, Amazon.com and FaceFirst. None of the companies informed the Illinois residents that they’d collected, stored and used their biometric identifiers and information or of the specific purpose and length of term for which their facial scans would be collected, stored and used, the lawsuits say.
Ultimately, none of Google, Microsoft, Amazon.com or FaceFirst obtained a written release from the plaintiffs or proposed class members authorizing them to collect, capture, receive, obtain, store, use or profit from their biometric identifiers, according to the cases.
“As a result of [the defendants’] misconduct, Plaintiffs and Class Members have no recourse for the fact that their biologically unique information has been compromised,” the suits allege.
The lawsuits look to represent Illinois residents whose faces appear in the Diversity in Faces Dataset obtained by Google, Microsoft, Amazon.com and FaceFirst.