Google and Apple halt employee review of audio recorded by voice assistants
07.08.2019

Google and Apple have suspended employee review of audio recorded by voice assistants, a practice used to improve speech recognition. The reason is a growing number of incidents in which employees listening to private conversations misused the personal information they obtained.

There is always a risk that recordings will leak, and such leaks have already happened more than once. For example, not long ago files from Google Assistant, which transcribes commands, were “misaddressed”. Amazon employees told Bloomberg that they share users’ funny audio among themselves. At any moment they could leak the recordings to earn money or to damage the company.

Voice assistants collect information about users, and users cannot always control this process. Developers admit that devices may “listen” all the time in order to recognise an activation command. Samsung has warned customers that personal information spoken near a smart device can be recorded and transferred to a third party, and that employees may filter commands “manually”. Google’s Terms of Use, like those of other major services, inform customers of the same.

Developers assure us that employees work only with fully depersonalised data: recordings are assigned random numbers that have no link to user accounts. Google and Apple claim that the voices in recordings are distorted before requests are processed. This hinders identification of the user, without which the data is useless. But the extent of depersonalisation is at the developer’s discretion. We also don’t know who besides employees has access to the data, or whether it can be misused. There have been cases when voice assistant recordings served as evidence: during an investigation, US police listened to recordings made by an Amazon Echo found at a crime scene.
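The depersonalisation scheme the developers describe can be sketched roughly as follows. This is a minimal illustration, not any vendor's actual pipeline: the function and data names are invented, and real systems would add voice distortion and stricter handling. The key property is that each recording is filed under a random identifier and no mapping back to the account is kept.

```python
import secrets

def pseudonymise(recordings):
    """Replace each (user_id, audio) pair with a record keyed by a random
    identifier; no reverse mapping to the user account is retained."""
    return [{"record_id": secrets.token_hex(8), "audio": audio}
            for _user_id, audio in recordings]

# Invented sample data for illustration only.
samples = [("alice@example.com", b"turn on the lights"),
           ("bob@example.com", b"what's the weather")]
anonymised = pseudonymise(samples)

# Reviewer-facing records carry audio and a random ID, but no account info.
assert all("user_id" not in record for record in anonymised)
```

As the article notes, how thoroughly such pseudonymisation is done, and whether anyone retains a way to re-link records to accounts, remains at the developer's discretion.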

That is why the decision facing Google, Apple and Amazon appears to be an important administrative measure, and it would be good to see the companies adopt it.

Besides insider leaks, there are also breaches caused by technical errors. Almost all voice assistants transmit user requests as audio files over HTTPS, and sometimes over plain HTTP. Technologies for intercepting and decrypting even secure web traffic already exist and are readily available, so companies cannot guarantee the safety of data in transit from a device to a server.

Privacy can be invaded whenever smart devices are nearby: a microphone or camera is on, and software recognises commands remotely, sometimes by mistake. When a leak or data misuse occurs, regulators force developers to take serious measures, up to suspending a product’s market activity.