Fallen Kell
Diamond Member
Ok, so you really think companies whose main line of business is collecting data on people to then sell targeted ads to them are not going to collect data that you authorized them to collect so that they can look at it to better target ads to those people?

But they're not actually recording before the wake word. I would think you'd know that.
These devices wait to hear a waveform that matches their wake word before they spring into action. Until that happens, they're not capturing anything for posterity. You just have to ask the police for proof. There have been a couple of cases where law enforcement demanded Echo voice data from Amazon, assuming they would have a recording of a possible murder... only to be disappointed as it turned out the speakers don't, in fact, record everything all the time.
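To be concrete about how that works, here's a rough sketch (in Python, with made-up names and numbers, not any vendor's actual code) of the on-device gating these speakers are generally understood to use: audio sits in a small rolling buffer that's constantly overwritten, and nothing is stored or sent anywhere until a local keyword-spotting model fires on the wake word.

```python
from collections import deque

# Hypothetical figures purely for illustration
BUFFER_SECONDS = 2        # a couple of seconds of rolling context
FRAMES_PER_SECOND = 50    # frame rate the local detector runs at

def run_assistant(mic_frames, detect_wake_word, stream_to_cloud):
    """mic_frames: iterator of short audio frames from the microphone.
    detect_wake_word: small on-device keyword-spotting model (no network).
    stream_to_cloud: only ever called AFTER the wake word is detected."""
    ring = deque(maxlen=BUFFER_SECONDS * FRAMES_PER_SECOND)
    for frame in mic_frames:
        ring.append(frame)               # oldest frames silently fall off; nothing is kept
        if detect_wake_word(list(ring)): # cheap local check on the buffer
            stream_to_cloud(list(ring))  # ship the wake word plus what follows
            ring.clear()                 # then go back to idle with an empty buffer
```

The point of the rolling buffer is that everything before the wake word is continuously thrown away on the device itself, which is why there was nothing for the police to subpoena.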
The issue, as you noted, is that voice assistants occasionally mistake sounds for the wake word and capture unintended snippets. I'm glad companies have offered more control over what happens with the data they review (Apple seems to be the best about it, since you can opt out entirely during setup), but I'll admit it's a mixed bag even if I know the recordings are anonymized and random. Ditto any visual information sent to improve AI; I don't think it's done in a way that can be seriously abused, but it's still the case that someone else might look at clips from your security cam or video doorbell.
However, I can't find instances where staff accessed specific users' Alexa/Assistant/Siri recordings... do you have links for that? I can see that Google fired someone in 2010 for listening to VoIP data, but that's clearly very different. The other reports I've seen haven't included voice assistant data.
The point is not that there's no potential for abuse; of course there is. It's that there's no evidence to support the dystopian fears some have. Amazon, Apple and Google are not recording everything you say, cackling with glee as they use your living room conversations to target ads and give free rein to stalkers. They aren't that malicious, and doing it would very likely be impractical anyway (it takes massive amounts of storage and computational power to screen 24/7 recordings from hundreds of millions of people). As I like to put it: the truth is often far more boring than we want it to be.
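For a sense of scale, here's a quick back-of-envelope calculation (rough, assumed numbers; nothing here comes from Amazon, Apple or Google) of what "record everything, always" would cost in raw storage alone:

```python
# Rough, assumed figures purely for illustration
users = 300_000_000              # "hundreds of millions" of devices
seconds_per_day = 24 * 60 * 60
bitrate_bps = 16_000             # ~16 kbit/s, a modest compressed-voice bitrate

bytes_per_device_per_day = seconds_per_day * bitrate_bps / 8
petabytes_per_day = users * bytes_per_device_per_day / 1e15

print(f"~{bytes_per_device_per_day / 1e6:.0f} MB per device per day")
print(f"~{petabytes_per_day:.0f} PB per day across the fleet")
# ~173 MB per device per day, on the order of 50 PB of new audio per day,
# before you spend a single CPU cycle transcribing or mining any of it.
```

Even with generous compression, that's tens of petabytes of new audio every single day, which is why "they only keep what comes after the wake word" is the boring, practical answer.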

Amazon Echo’s privacy issues go way beyond voice recordings
Hey Alexa, who are you sharing my data with?


Study Reveals Extent of Privacy Vulnerabilities With Amazon’s Alexa
Issues range from misleading privacy policies to post-approval changes in code.

Amazon fires more employees after breaching customers' privacy
Amazon has had to fire more employees for breaching customers' privacy. This time an employee gave your personal information to a third party.


300 Apple employees fired for listening to Siri’s private recordings
Apple has terminated 300 employees in Ireland for listening to more than 1,000 of its digital assistant Siri's audio recordings of people's conversations, and even intimate moments. The Company recently issued an apology for its digital assistant sharing parts of its recordings with quality...


Apple fires ‘hundreds’ of contractors over eavesdropping | The Week UK
Third-party workers were sent audio files containing ‘confidential information’
www.theweek.co.uk
Google fired 80 employees for abusing user data and spying on people, with some even sharing personal information outside the company, a new report says
Dozens of Google employees were fired between 2018 and 2020 for using internal tools to view personal user data, Vice's Motherboard reports.

Now, these are just examples of people who were fired for accessing the data, or for using it to stalk people, and who happened to be caught by the internal monitors. Some of the links above go into how broadly the data is shared and used by contractors, and some show how easy it is for someone to add a third-party connection to these devices that gives them access to the data. Many of these companies are doing EXACTLY what you say they are not doing: using these recordings to listen in on living room conversations to better target ads. The technical requirements for doing this have been dropping fast. Heck, some of the companies have tried to be more open about what is being kept and give you access to listen to the recordings, and you will be amazed at how much is there, and all of it can be used to target ads and train the algorithms.