Class Action Says Paravision Illegally Mined ‘Billions’ of Everalbum Photos to Develop Facial Recognition Tech
Walton v. Everalbum, Inc.
Filed: October 2, 2020 | Case No. 4:20-cv-06895
A privacy class action alleges Paravision mined the faces that appear in photos uploaded to Everalbum without first securing users' consent to do so.
California
Paravision faces a proposed class action that alleges the artificial intelligence software company has employed a “highly deceptive and illegal” solution to the problem of where to get enough photos that feature enough faces to create a robust, functional facial recognition database.
According to the 15-page privacy lawsuit, Paravision mined the faces found in billions of user photos uploaded to the website of cloud photo storage provider Everalbum, as well as Ever, the company’s app, to “fuel [its] AI machine.”
“While users may have thought they were merely ensuring the lasting storage of ‘Weekend with Grandpa’ photos, they were instead unwittingly ushering in a corporate surveillance dystopia,” the case says.
The lawsuit alleges Paravision’s “systemic and covert privacy intrusion” amounts to a plain violation of the Illinois Biometric Information Privacy Act (BIPA), a 2008 law that sets stringent disclosure requirements for private businesses that deal with individuals’ fingerprints, voiceprints, facial and retinal scans, hand geometry and other sensitive, unchangeable data.
Per the suit, the Illinois BIPA was enacted after Pay By Touch, known as “the largest fingerprint scan system” in the state, filed for bankruptcy and left unclear the fate of the biometric information of thousands of customers. After Pay By Touch’s bankruptcy, Illinois legislators realized the very serious need to protect the state’s citizens when it came to their sensitive identifying information, the case stresses.
“BIPA regulates the entire lifecycle of biometric data, from capture and collection to use and disclosure,” the complaint relays.
Despite the Illinois privacy law's clear disclosure requirements, Paravision has "disregarded the BIPA in its entirety" in an apparent effort to advance its facial recognition systems and their applications in point-of-sale processing, facility access control, video identity verification and other areas, the lawsuit claims. According to the suit, in order to get its software to the point that it could identify specific individuals in "a variety of settings, poses, outfits and lighting scenarios" with a high degree of accuracy and across ages, races and genders, Paravision used the photos uploaded to Everalbum as an effective one-stop shop.
From the lawsuit:
“Unlike its competitors, who used datasets provided by external photo-sharing and -storage services, Paravision used its own, in-house solution, Everalbum, later known as the Ever app.
With billions of photos from millions of users, Everalbum offered a comprehensive and exclusive training field for developing its own facial recognition tools that could be sold to law enforcement, intelligence, and various private industries.
Paravision took advantage of the opportunity offered by the Everalbum user data. Paravision fed the billions of photos uploaded to Everalbum into its machine-learning system, scanned and captured the geometry of any faces featured in those photos (known as a faceprint), and then used those faceprints to train its enterprise facial-recognition offerings.”
The problem with this, the lawsuit says, is that Everalbum customers were entirely unaware Paravision was mining their photos, in part because Everalbum and, later, the Ever app and Paravision’s AI were “completely differentiated brands.” According to the case, consumers had no reason to know that their photos were being used without their permission to build facial recognition software.
“Paravision never meaningfully informed Everalbum/Ever app users that their photos were being exploited, and only added a throwaway disclosure to its privacy policy after NBC News reporters contacted the company in 2019 in advance of publishing an exposé detailing the origins of Ever AI’s facial-recognition systems,” the suit says.
The lawsuit looks to represent all individuals who had their faceprints collected, captured, received, or otherwise obtained by Paravision while residing in Illinois.