Amazon and Apple recently announced they would halt the use of humans to review conversations on their digital voice assistants. This move will give users more control over the privacy of their communications.
Apple announced it would stop using contractors to listen to users through Siri to grade the voice assistant’s accuracy. The decision came after an Apple whistleblower told the Guardian that, while monitoring the digital assistant’s accuracy, graders regularly overheard private conversations about doctors’ appointments, drug deals and even couples having sex. Their job was to determine what had triggered Siri into action – whether the user had actually said “Hey, Siri” or whether something else, such as the sound of a zipper, had activated it.
Apple said it would suspend the global analysis of those voice recordings while it examined the grading system. Users will be able to opt out of reviews in a future software update. Cat Franklin, an Apple spokeswoman, said in an email to The Washington Post:
“We are committed to delivering a great Siri experience while protecting user privacy.”
Amazon also recently updated its privacy policy regarding voice recordings made by its Alexa service. Amazon users will now be able to opt out of human review of those recordings by selecting a new option in the settings of the Alexa smartphone app. Amazon employees review those recordings to help improve the company’s speech-recognition technology.
In May, the company tweaked Alexa privacy features, providing users the ability to delete recordings of their voices. Users could already opt out of letting Amazon develop new features with their voice recordings.
Most owners of “smart-speaker” devices don’t realize that Siri, Alexa and, until recently, Google Assistant retain recordings of everything they hear after their so-called “wake word” activates them. The recordings help develop and train the assistants’ artificial intelligence. (Disclosure: Amazon founder Jeff Bezos owns The Washington Post.)
Google quietly changed its defaults last year without an announcement: Google Assistant no longer automatically records what it hears after the prompt “Hey, Google.” Apple’s terms and conditions are less clear, but the company has explained why it uses the data, saying its purpose is:
“to help Siri and dictation . . . understand you better and recognize what you say.”
The Apple whistleblower told the Guardian the shocking details of what they uncovered:
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”
Apple responded that the recordings accounted for just one percent of Siri activations and lasted only a few seconds. The company also said they were not linked to users’ Apple IDs.
In Ireland, Apple contractors were sent home during the transition and were told only that the global grading system “was not working,” according to the Guardian. Managers remained on site, but employees were unsure how the change would affect their employment.
The Apple whistleblower also said the Apple Watch and the HomePod (a smart speaker) were exceptionally prone to random activation. A 2018 study by investment firm Loup Ventures found that HomePod’s Siri accurately answered standardized questions 52 percent of the time.