Alexa, stop recording our kids!
So charge a pair of lawsuits filed against Amazon this week, as first reported by the Seattle Times, which allege that the devices are breaking the law in at least eight states by recording children who use the smart speakers without consent.
“Alexa routinely records and voiceprints millions of children without their consent or the consent of their parents,” states one complaint, filed on behalf of a 10-year-old Massachusetts girl in Seattle federal court on Tuesday. A nearly identical suit was filed the same day on behalf of an 8-year-old boy in California Superior Court.
The federal complaint, which seeks class-action status, notes that Alexa devices record and transmit anything someone says after the “wake word” activates the speaker – usually “Alexa,” though it can also be “Echo” or “Amazon.” Company practice then includes saving “a permanent recording of the user’s voice,” regardless of who is speaking and whether that person purchased the device or installed the Alexa app.
Because the system is able to identify individual speakers by their voices, the suits claim, Amazon could inform users who had not already consented that they were in fact being recorded, and then ask them for permission. Permanent recording for users who had not consented, such as underage children, could then be deactivated – but the suits allege Amazon has declined to do this.
“Alexa does not do this,” the complaint says. “At no point does Amazon warn unregistered users that it is creating persistent voice recordings of their Alexa interactions, let alone obtain their consent to do so.”
The lawsuit also claims this breaks the law in Florida, Illinois, Michigan, Maryland, Massachusetts, New Hampshire, Pennsylvania and Washington – all states that require every party to a recording to consent, regardless of age. The cases seek damages for the two plaintiffs, as well as for other customers invited to join the class actions in those eight states and California.
The California suit adds that apart from general privacy concerns, “it takes no great leap of imagination to be concerned that Amazon is developing voiceprints for millions of children that could allow the company (and potentially governments) to track a child’s use of Alexa-enabled devices in multiple locations and match those uses with a vast level of detail about the child’s life, ranging from private questions they have asked Alexa to the products they have used in their home.”
“Amazon has a longstanding commitment to preserving the trust of our customers and their families, and we have strict measures and protocols in place to protect their security and privacy,” Amazon said in a statement to MarketWatch. “For customers with kids, we offer FreeTime on Alexa, a free service that provides parental controls and ways for families to learn and have fun together.”
The company also notes in its children’s privacy disclosure, however, that “in some cases, we may know a child is using our services. In these situations, children may share and we may collect personal information that requires verifiable parental consent under the Children’s Online Privacy Protection Act.”
More than 100 million Alexa devices have been sold worldwide, according to Amazon data reported by The Verge earlier this year, and this includes more than 150 products with Alexa built in, as well as more than 28,000 smart home devices that work with Alexa.
Technology giants such as Amazon, Google and Facebook have come under increasing scrutiny over how they gather and use consumers’ personal information, and just how much voice-activated smart assistants are listening in on users’ lives. In December, for example, a German man accidentally received 1,700 audio files from a complete stranger when he asked Alexa to play back recordings of his own activities. An Amazon spokesperson said at the time that “this unfortunate case was the result of a human error and an isolated single case.”