Alexa has been eavesdropping on you this whole time
Many smart-speaker owners don't realize it, but Amazon keeps a copy of everything Alexa records after it hears its name. Apple's Siri and, until recently, Google's Assistant also keep recordings by default to help train their artificial intelligence.
...
Saving our voices is not just an Amazon phenomenon. Apple, which is much more privacy-minded in other aspects of the smart home, also keeps copies of conversations with Siri. Apple says voice data is assigned a "random identifier and is not linked to individuals," but exactly how anonymous can a recording of your voice be? I don't understand why Apple doesn't give us the ability to say not to store our recordings.
The unexpected leader on this issue is Google. It also used to record all conversations with its Assistant, but last year it quietly changed its defaults so that it no longer records what it hears after the prompt "Hey Google." But if you're among the people who previously set up Assistant, you probably need to readjust your settings (check here) to "pause" recordings.
...
I'm not the only one who thinks saving recordings is too close to bugging. Last week, the California State Assembly's privacy committee advanced an Anti-Eavesdropping Act that would require makers of smart speakers to get consent from customers before storing recordings. The Illinois Senate recently passed a bill on the same issue. Neither is much of a stretch: requiring permission to record someone in private is enshrined in many state laws.