
Voice assistants risk assessment

22.11.2019


Voice assistant hazard identification: how can you mitigate risk and ensure prompt risk evaluation?

Voice assistants have become the subject of investigative journalism: a few high-profile publications made readers think twice before using a “helper” or even talking near a microphone. Here is what we have learned this year:

  • Google, Apple and Amazon user audio requests are processed (that is, listened to) by real people.
  • Due to an assistant’s technical shortcomings, microphones occasionally capture and record private conversations with doctors, business meetings and even criminal deals. Data loss prevention mechanisms should be in place, because in case of an insider breach the company and its employees will have to conduct an investigation.

Extensive media coverage made the tech giants halt the audio processing that was conducted to perfect their speech recognition algorithms: Apple and Google did so a few months ago. Amazon didn’t stop listening to recordings but introduced an option to deactivate audio request processing.

User behavior analytics should be considered when your company’s team is responsible for working with private information, as it is crucial for a consistent risk management program.

The officially announced Google Pixel 4 features a new assistant technology: the function wakes up when the phone is lifted close to a person’s face and clearly discerns the “Hey Google” command. Although the feature resembles the one in the Apple Watch, which activates Siri when the device is brought up close, Google assures that the new assistant will put efficiency and privacy first. Session time will be limited to prevent accidental recording: Google Assistant will be deactivated after receiving a few requests.

According to new research from Maintel, almost half of the British respondents questioned the safety of using virtual assistants: 47% don’t want to activate the technology because they think voice processing by a third party isn’t protected, 46% are not enthusiastic about the functionality because they realise how much data can be collected this way, and 44% suspect their phones are always listening in order to digest their conversations.

About 30% of smart device consumers use virtual assistants, and more than 60% of companies are expected to provide services via virtual assistants within three years.


Brands willing to dive into automated and robotic services should be ready to face regulatory compliance and data-at-rest protection issues, as stored data and negligent usage might lead to a personal data leak. It is advisable to install risk management software to facilitate control over data processing.
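As a rough illustration of what data-at-rest protection can mean in practice, here is a minimal, hypothetical Python sketch that encrypts a captured recording before it is written to disk, using the cryptography library’s Fernet recipe. The file names, key handling and placeholder audio bytes are assumptions made for this example only, not a description of any vendor’s product.

```python
# Hypothetical sketch: encrypting a voice recording before storing it (data at rest).
# Requires the `cryptography` package; paths and key handling are illustrative only.
from cryptography.fernet import Fernet


def store_recording_encrypted(raw_audio: bytes, out_path: str, key: bytes) -> None:
    """Encrypt raw audio bytes and write only the ciphertext to disk."""
    cipher = Fernet(key)
    with open(out_path, "wb") as f:
        f.write(cipher.encrypt(raw_audio))


def load_recording(in_path: str, key: bytes) -> bytes:
    """Read the ciphertext back and decrypt it for authorised processing."""
    cipher = Fernet(key)
    with open(in_path, "rb") as f:
        return cipher.decrypt(f.read())


if __name__ == "__main__":
    key = Fernet.generate_key()   # in practice the key would live in a secrets manager
    audio = b"\x00\x01\x02\x03"   # placeholder for captured audio bytes
    store_recording_encrypted(audio, "request_001.enc", key)
    assert load_recording("request_001.enc", key) == audio
```

Even a simple scheme like this means that a leaked storage volume exposes ciphertext rather than raw conversations, which is the point of data-at-rest controls.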

The Guardian recently recalled the story of a former Amazon employee who came home and heard his Echo Dot uttering a horrifying monologue that reproduced a number of commands its owner used to give it. The article emphasises the ubiquity of Amazon’s Alexa, which has made its way into numerous home appliances, and reflects on the thought that the offline world will soon be a utopia.

Following the personal data scandals, which unfolded rapidly and made users cautious about speaking close to electronic eavesdroppers, Amazon introduced a function allowing everyone to manage their voice recordings via the Alexa app or the Amazon account settings. Besides the commands you might already have given the assistant, “Alexa, delete what I just said” and “Alexa, delete everything I said today” are now also recognised if enabled in the settings.


What’s more, gadgets will light up and play a sound so that you notice when they are activated and recording.

 

Apple’s Siri users are not given any recording management options and can’t delete their data, although they can delete their Apple account. Even then, older recordings which the company has already depersonalised will still be used to enhance the assistant.

 

Besides shady depersonalisation policies, there are still some issues which simply can’t be solved today and which keep the assistants from being harmless:

  • A phone doesn’t have sufficient computing power to process speech: all recognition and analysis is conducted in the cloud, and your phone can only record the audio and transmit it there, so the data inevitably leaves the device (a minimal sketch of this pattern follows at the end of this section).
  • If you pay close attention to Google’s Terms of Service, you’ll read the following: “When you upload, submit, store, send or receive content to or through our Services, you give Google (and those we work with) a worldwide license to use, host, store, reproduce, modify, create derivative works (such as those resulting from translations, adaptations or other changes we make so that your content works better with our Services), communicate, publish, publicly perform, publicly display and distribute such content… This license continues even if you stop using our Services.”

 

Google claims that some Services allow a user to slightly limit this exposure in the settings, although it is quite unclear where your rights end and Google’s ownership begins.
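To make the first point above concrete, here is a minimal, purely illustrative Python sketch of the record-and-upload pattern: the device only captures audio and forwards it to a cloud endpoint, where recognition actually happens. The endpoint URL, headers and response field are hypothetical and do not describe any specific vendor’s API.

```python
# Illustrative sketch of the record-and-upload pattern used by voice assistants:
# the device captures audio locally and sends it to a cloud service for recognition.
# The endpoint URL and response format are hypothetical, for demonstration only.
import requests

CLOUD_SPEECH_ENDPOINT = "https://speech.example.com/v1/recognize"  # hypothetical


def capture_audio() -> bytes:
    """Placeholder for on-device audio capture (e.g. reading the microphone buffer)."""
    return b"\x52\x49\x46\x46"  # raw WAV bytes would go here


def recognize_in_cloud(audio: bytes) -> str:
    """Upload the recording; all actual speech recognition happens server-side."""
    response = requests.post(
        CLOUD_SPEECH_ENDPOINT,
        data=audio,
        headers={"Content-Type": "audio/wav"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("transcript", "")


if __name__ == "__main__":
    # The moment that matters for privacy: the raw recording leaves the device here.
    print(recognize_in_cloud(capture_audio()))
```

The sketch shows why the privacy question cannot be settled on the device itself: once the audio is posted to the cloud, what happens to it is governed by the provider’s policies, not by the phone.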




