Smart speakers make life much easier when it comes to looking up information without reaching for a phone or computer, but they can also become a serious security liability if we neglect them, and a recent study makes this very clear.
It is likely that your smart speaker has, at some point, heard a word or phrase it should not have, and that can be a serious privacy problem, and a breach of your security, if that recording has also been heard by a human engineer on the server side.
As you may know, there is little mystery to how smart speakers work: the device first waits for an activation word or phrase, and only then does it capture our question, which travels to the company's servers, is processed by the device's speech-recognition algorithm, and comes back as an answer. The problem is that, to refine that algorithm, these questions often also reach the ears of human engineers.
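The pipeline described above can be sketched in a few lines. This is a minimal illustration, not any vendor's real implementation: the wake words and function names here are assumptions chosen for the example, and a real device matches acoustic features rather than finished transcripts.

```python
# Toy sketch of the wake-word pipeline: the device stays idle until it
# hears an activation phrase, and only then forwards audio to a server.
WAKE_WORDS = {"hey google", "hey siri", "alexa", "hello cortana"}

def handle_audio(transcript: str) -> str:
    """Decide what the device does with a chunk of transcribed audio."""
    text = transcript.lower().strip()
    if text in WAKE_WORDS:
        # The device "wakes up": what is said next is sent to the server.
        return "listening"
    # Anything else is discarded locally and never leaves the device.
    return "idle"

print(handle_audio("Hey Google"))   # the device activates
print(handle_audio("good morning")) # the device stays idle
```

The whole privacy question in this article hinges on that one branch: a false positive means the function returns "listening" when it should not have.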
Now a study by Ruhr University Bochum and the Max Planck Institute for Security and Privacy has found more than 1,000 words and phrases that Alexa, Siri, and Google Assistant have mistakenly identified as activation commands, something known as false positives.
An activation command is what makes our smart speaker or virtual assistant wake up and listen actively in order to answer our questions; you surely know phrases like “Hey Google”, “Hello Cortana” or “Hey Siri”, among others.

What this study points out is that there are words similar enough to these activation commands that the device confuses them, activating without us noticing and potentially recording whatever we say afterwards.
According to the study, these are some of the false positives it found:
- Alexa: “unacceptable,” “election” and “a letter”
- Google Home: “OK, cool,” and “Okay, who is reading”
- Siri: “a city” and “hey jerry”
- Microsoft Cortana: “Montana”
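A rough sense of why these pairs get confused can be had with plain string similarity. This is only a toy illustration using Python's standard `difflib`; real wake-word detectors compare acoustic patterns, not spelled-out text, but the overlap in letters hints at the overlap in sound.

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Rough string similarity between 0.0 (different) and 1.0 (identical)."""
    return difflib.SequenceMatcher(None, a.lower(), b.lower()).ratio()

# "Montana" and "Cortana" share most of their letters, which hints at
# why a recognizer tuned not to miss its wake word can confuse them.
print(similarity("Cortana", "Montana"))  # well above chance
print(similarity("Cortana", "weather"))  # much lower
```

A detector biased toward never missing a real "Cortana" will inevitably accept some near misses like "Montana"; that trade-off is exactly what produces the false positives in the list above.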
For example, imagine that instead of saying “Cortana” you say “Montana”: the device switches into listening mode, and your next sentence happens to be your bank card PIN as you finish booking a trip to Montana. That information reaches the device’s servers, with a small but real chance that an engineer will end up hearing it.
That is why the study offers a series of recommendations to help prevent these activation-command failures:
- Change your smart speaker’s activation word or phrase whenever possible, choosing one that is less prone to false triggers.
- In the case of Google Home, there is an option to reduce the activation sensitivity.
- Certain devices also let you mute the microphone.
- Don’t keep your smart speakers turned on all the time if you’re not really using them regularly.
- Delete all stored recordings and adjust the privacy settings in each account linked to these devices so that audio is never saved, reviewed, or shared with engineers.
In this way, by using the tools these speakers put at our disposal, together with some common sense, we can avoid one of the most common mistakes still made with smart speakers.