‘Constantly Listening’: Class Action Claims Amazon’s Alexa Makes ‘Secret Recordings’ of Users’ Conversations
by Erin Shaak
Terpening v. Amazon.com, Inc.
Filed: May 18, 2021 | Case No. 5:21-cv-03739
A lawsuit claims Amazon’s Alexa virtual assistant secretly makes and stores recordings of conversations even when consumers do not intend to activate the device.
California
Amazon.com, Inc. faces a proposed class action lawsuit that claims the tech giant’s Alexa virtual assistant secretly makes and stores recordings of conversations for the company’s benefit, even when users have no intention of interacting with the device at all.
The 21-page lawsuit alleges Amazon has deceived users of Alexa-enabled devices, including the Echo, Echo Dot, Echo Plus, Echo Sub, Echo Show, Echo Input, Echo Flex, and Echo Dot Kids, by falsely representing the extent to which the products record users’ interactions and how the company uses the recordings. Although Amazon represents that Alexa will respond and begin recording only when it “hears” a “wake word,” such as “Alexa” or “Echo,” customers have discovered, the case says, that their devices will sometimes misinterpret other speech as a wake word and activate even when users do not intend to interact with the device.
According to the suit, customers understand based on Amazon’s representations that recordings of their interactions with Alexa will be sent “to the cloud” for the purpose of determining the user’s “intent” and how best to respond to their request. What consumers are not told, however, is that the defendant relies on both human analysts and artificial intelligence to “listen to, interpret, and evaluate” the recordings for business purposes and in violation of consumers’ privacy rights, the case charges.
“Amazon never disclosed its widespread creation, storage, and use of those records for its own business purposes that extend beyond improving or personalizing Alexa’s services,” the complaint alleges. “Amazon’s conduct in surreptitiously recording consumers has violated federal and state wiretapping, privacy, and consumer protection laws.”
When a user addresses an Alexa device using a wake word, the case explains, the device automatically begins recording and captures the question or command that follows, such as “Alexa, turn on the lights,” “Alexa, play ‘I Am the Walrus’ by the Beatles,” or “Alexa, what is the number for the suicide hotline?” The recordings are then sent to Amazon’s Alexa Cloud, a cloud-based storage and processing service, the lawsuit says. The Alexa Cloud transcribes the recording and “translates” the text into an “intent,” that is, a statement the computer can understand. The “intent” is then sent back to the Alexa device with instructions on how to act on it using a functionality known as a “skill,” e.g., playing music, unlocking a smart lock, performing an internet-based search, or turning on the lights through a smart outlet, the case says.
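To make the complaint’s description concrete, the flow it outlines (device hears a wake word, streams the recording to the cloud, receives an “intent” back, and dispatches a “skill”) can be sketched in a few lines of Python. This is a toy model under our own assumptions; every function name, intent, and bit of matching logic here is invented for illustration and is not Amazon’s actual code or API.

```python
# Toy sketch of the wake-word -> cloud -> intent -> skill pipeline the
# complaint describes. All names, intents, and logic are illustrative
# assumptions, not Amazon's actual code or API.

WAKE_WORDS = {"alexa", "echo"}

def heard_wake_word(transcript: str) -> bool:
    """Device-side gate: recording is supposed to start only on a wake word."""
    return any(w in transcript.lower().split(",")[0].split() for w in WAKE_WORDS)

def cloud_to_intent(recording: str) -> dict:
    """Cloud-side step: transcribe the recording and map it to an 'intent'."""
    text = recording.lower()
    if "turn on the lights" in text:
        return {"intent": "SmartHome.TurnOn", "target": "lights"}
    if "play" in text:
        return {"intent": "Music.Play", "target": text.split("play", 1)[1].strip()}
    return {"intent": "Search.Web", "target": text}

def run_skill(intent: dict) -> str:
    """Device-side step: act on the returned intent via a matching 'skill'."""
    skills = {
        "SmartHome.TurnOn": lambda t: f"Turning on the {t}.",
        "Music.Play": lambda t: f"Playing {t}.",
        "Search.Web": lambda t: f"Here is what I found for: {t}",
    }
    return skills[intent["intent"]](intent["target"])

if __name__ == "__main__":
    utterance = "Alexa, turn on the lights"
    if heard_wake_word(utterance):           # wake word detected, recording begins
        intent = cloud_to_intent(utterance)  # recording sent "to the cloud"
        print(run_skill(intent))             # -> Turning on the lights.
```

The suit’s central complaint concerns what this sketch leaves out: what else happens to the recording once it reaches the cloud, and how long it is kept.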
Though Amazon represents to consumers that Alexa will activate and begin recording only upon “hearing” a wake word, and that its recordings are streamed to the Alexa Cloud solely to respond to the user’s command, the defendant has failed to disclose that it keeps recordings of consumers’ interactions for training and improving its artificial intelligence and for other business purposes, the case alleges.
Further, the lawsuit contends that Alexa device users are unaware that Amazon records and stores conversations and other sounds “that users never intended for Alexa to hear.” Per the case, an Alexa-enabled device may activate without any legitimate prompt from a user by “mishearing” speech that it interprets as a wake word. For example, the suit says, Ars Technica has reported that voice assistants like Alexa are incorrectly triggered by thousands of words, including, in Alexa’s case, “unacceptable,” “election,” and “a letter.” When Alexa is incorrectly triggered by mishearing a conversation or even a TV or radio program, the device automatically begins recording and stores “conversations, speech, and other sounds that are private in nature and not intended for Alexa—or Amazon’s—‘ears,’” the lawsuit alleges.
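How a detector can “mishear” a wake word is easy to illustrate with a crude stand-in for an acoustic model. The sketch below is our own assumption, built on simple string similarity rather than any real speech recognition; it shows how a detector tuned sensitively enough to catch “Alexa” in noisy audio can also fire on an unrelated phrase like “a letter.”

```python
# Illustrative false-trigger demo. The fuzzy-matching approach and the
# threshold are assumptions for demonstration only; they are not how
# Alexa's wake-word detector actually works.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.45  # set low so the wake word is still caught in noisy audio

def confidence(phrase: str) -> float:
    """Crude stand-in for an acoustic confidence score."""
    return SequenceMatcher(None, WAKE_WORD, phrase.lower()).ratio()

for phrase in ["alexa", "a letter", "election", "unacceptable"]:
    score = confidence(phrase)
    print(f"{phrase!r}: score={score:.2f}, starts recording: {score >= THRESHOLD}")

# 'alexa':        score=1.00 -> records, as intended
# 'a letter':     score=0.46 -> records, a false trigger capturing private speech
# 'election':     score=0.31 -> no trigger in this toy model, though Ars
#                 Technica reports it does trigger real devices
# 'unacceptable': score=0.35 -> likewise no trigger here, but reported in the wild
```

Any threshold involves the same trade-off: raise it and the device misses real commands; lower it and ordinary conversation can start the recorder.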
“Thus,” the complaint reads, “not only are [sic] Amazon unlawfully and deceptively storing recordings of intentional interactions with Alexa, but also speech and other sounds never intended to be ‘heard’ by Alexa at all.”
Per the case, the data recorded and stored by Amazon can be aggregated with details from other sources to “create fulsome profiles of information that may include deeply personal and private information” about consumers that can then be used for Amazon’s business purposes. The lawsuit claims such clandestine recording of users’ private information presents serious privacy risks:
“Not only does this covert recording, storing, and analyzing of consumer information entail a profound violation of privacy, but given the increasingly all-encompassing scope of Alexa Devices’ ability to interface with aspects of a consumer’s life (e.g., controlling home security features such as locks and lights, and providing access to personal medical or identity-related information) this storage—unnecessary to the functionality of Alexa Devices—creates a risk from hacking or other unauthorized leveraging of consumer data and processes by third parties (or Amazon personnel).”
The case claims the recordings, including conversations and other data produced by children, are indefinitely stored by Amazon regardless of whether the person recorded has a contractual relationship with the company.
The only way to stop Alexa devices from making and storing recordings is to turn off or unplug the device, which defeats the product’s utility, the suit says. Moreover, although Amazon in 2019 introduced the ability for users to delete the information obtained through a smart speaker, a user cannot stop an Alexa device from making recordings in the first place, according to the complaint.
The lawsuit looks to cover all U.S. citizens who owned or used an Alexa device or downloaded and used the Alexa app within the past four years, or the longest applicable statute of limitations period, and until the date of judgment in the case.