Healthcare Workers Claim Amazon Alexa Devices Secretly Record Private Conversations
by Erin Shaak
Last Updated on July 3, 2024
Scott et al. v. Amazon.com, Inc.
Filed: June 30, 2021 | Case No. 2:21-cv-00883
Washington
Four healthcare workers have filed a proposed class action in which they claim Amazon’s Alexa virtual assistant secretly records conversations users never intended Alexa, or anyone else, to hear.
According to the 24-page suit, Amazon has failed to disclose that Alexa-enabled devices such as the Echo, Echo Dot, Echo Plus, Echo Sub, Echo Show, Echo Input, Echo Flex and Echo Dot Kids may activate and begin recording even when users have not spoken a “wake word,” such as “Alexa.” Moreover, the case alleges customers are not told that Amazon stores, analyzes and uses Alexa-enabled device recordings for its own business purposes, and not to “respond to [a user’s] requests and improve [Alexa’s] services” as the company represents.
The plaintiffs, a group that includes a psychiatric worker and substance abuse counselor, claim their Alexa devices may have recorded HIPAA-protected information without their intent and sent the recordings to Amazon, whose artificial intelligence and human employees or contractors may have listened to and analyzed the conversations. Per the case, the healthcare workers would not have purchased the devices had they known Amazon would use them in such a way.
“At the time Plaintiffs and putative class members purchased their Alexa Devices, Amazon failed to disclose its widespread creation, storage, and use of those records for its own business purposes that extend beyond improving or personalizing Alexa’s services,” the complaint alleges. “Instead, Amazon represented that Alexa sent audio to the cloud for the sole purpose of generating an appropriate response.”
The case claims Amazon’s conduct has violated federal and Washington state wiretapping, privacy and consumer protection laws.
According to the lawsuit out of Washington federal court, Amazon represents that its Alexa-enabled devices activate when they hear a “wake word,” such as “Alexa” or “Echo.” When a wake word is detected, the device begins recording audio and then sends the recording to the Amazon Alexa Cloud, a cloud-based data storage and manipulation service, the case explains. The Alexa Cloud then transcribes the user’s voice into text, translates the text into an “intent” and then sends the intent back to the Alexa device, which acts on the statement using a given functionality known as a “skill,” the suit says.
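The pipeline described in the complaint can be sketched roughly as follows. This is a hypothetical illustration only: every name here (detect_wake_word, parse_intent, SKILLS and so on) is invented for clarity and is not Amazon’s actual internal or public API, and real wake-word detection runs on audio signals rather than text.

```python
# Hypothetical sketch of the wake-word pipeline the suit describes.
# All function and variable names are illustrative inventions, not
# Amazon's actual implementation or API.

WAKE_WORDS = {"alexa", "echo"}

def detect_wake_word(audio_text: str) -> bool:
    # On a real device this step analyzes the audio signal; text is
    # used here as a stand-in for the on-device detector.
    return any(w in audio_text.lower().split() for w in WAKE_WORDS)

def transcribe(recording: str) -> str:
    # Stand-in for the cloud's speech-to-text step.
    return recording.lower()

def parse_intent(text: str) -> dict:
    # Stand-in for the cloud's translation of text into an "intent."
    if "weather" in text:
        return {"skill": "weather", "query": text}
    return {"skill": "fallback", "query": text}

SKILLS = {
    "weather": lambda intent: "Here is today's forecast.",
    "fallback": lambda intent: "Sorry, I didn't understand.",
}

def handle_audio(audio_text: str):
    if not detect_wake_word(audio_text):
        # The device is supposed to stay idle here; the suit alleges
        # it frequently activates anyway on non-wake words.
        return None
    recording = audio_text                  # device begins recording
    text = transcribe(recording)            # cloud: speech -> text
    intent = parse_intent(text)             # cloud: text -> "intent"
    return SKILLS[intent["skill"]](intent)  # device acts via a "skill"

print(handle_audio("Alexa what's the weather"))  # -> Here is today's forecast.
print(handle_audio("a letter arrived today"))    # -> None (no wake word spoken)
```

In this idealized model, audio containing no wake word is never recorded or sent anywhere; the lawsuit’s core allegation is that real Alexa devices depart from this model by activating on misheard words and retaining the resulting recordings.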
What customers are not told, the lawsuit alleges, is that the recordings made by Alexa devices are stored permanently and “reviewed freely” by Amazon, its employees and third-party contractors. Although Amazon has represented for years that the recordings were only used to allow Alexa to respond to a user’s command and help personalize its responses, the defendant failed to disclose that it also uses the recordings to train and improve its artificial intelligence projects and “for other business purposes not essential for the functioning of Alexa Devices,” the case argues.
Moreover, Amazon has allegedly failed to disclose that it “stores, retains, and analyzes recordings of conversations that users never intended for Alexa to hear,” including when the device misinterprets a wake word. Per the case, Alexa and other smart devices respond to hundreds of words that are not supposed to activate them, including “unacceptable,” “election” and “a letter.” The suit cites a Northeastern University study in which researchers found that smart speakers, including Alexa devices, “wake up” and start recording when no wake word was spoken up to 19 times per day, and at least 1.5 times per day on average.
The lawsuit alleges Amazon knew at the time the Alexa devices were sold that users’ and their guests’ voices would be recorded without their intent and that the recordings would be analyzed and used by Amazon for its own business purposes, including to assemble “fulsome profiles of information that may include deeply personal and private information.” Nevertheless, the defendant has “taken no remedial action to address the interception” of users’ communications and instead “sought to continue and expand it,” the case charges.
The lawsuit looks to represent all adult U.S. citizens who owned and used an Alexa device or downloaded and used the Alexa app during the past four years (or the length of the longest applicable statute of limitations period) and through the date of judgment.