Apple and Amazon have announced that they are curtailing the use of human reviewers for conversations on their digital voice assistants, Siri and Alexa. Contractors listen in on users' interactions with these assistants to grade their accuracy and to improve their speech-recognition capabilities.
An Apple whistleblower reported that reviewers often overhear doctors' appointments, drug deals and even couples having sex. Furthermore, the recordings are accompanied by user data that give away sensitive information like locations, contact details and app data.
Read the full article on The Straits Times: Hey, Siri reviewer, no more listening to sex talk
How do we balance consumers' right to privacy against the drive to improve the accuracy of a technology that has pervaded our daily lives? Additionally, by using digital voice assistants, do users implicitly consent to their recordings being used to improve their user experience?
An improvement in the accuracy of digital voice assistants would better users' experience of speech-recognition technology. In furtherance of this aim, Apple and Amazon have employees review voice recordings so that the assistants can better match spoken queries to results.
Yet consumers are mainly disturbed because they did not know that the privacy policies they agreed to meant that humans would review what they say to their digital voice assistants. On one level, how many of us read the privacy policies and agreements when we sign up for online services? Even if we do read them, it is not obvious that "data used to improve the recognition feature" means that human employees would listen to our voice searches.
This technological quagmire could have been averted if users had been given clear warning about how their voice recordings would be used. That said, one can imagine that most users would opt out if they could, leaving insufficient data to improve the voice-recognition technology. Moreover, users' privacy could be protected if the recordings were sufficiently anonymised and not linked to their user data.
Questions for further personal evaluation:
- In the pursuit of technological progress, how far can technology providers go in relying on users’ data?
- ‘curtailing’: to make less by or as if by cutting off or away some part
- ‘quagmire’: a difficult, precarious or entrapping position; predicament