Smart assistants, such as Amazon Echo or Google Home, are changing habits and making the lives of more than 45 million users in the US more organized.
Virtual home assistants may look like stylish music speakers matching your home interior. But don’t let this outer simplicity trick you – these gadgets are smart and powerful enough to handle things you used to do by yourself. They are speakers supercharged with AI and voice recognition technology – wireless, connected to the Internet, and capable of understanding your voice commands.
Need to set the mood? Ask it to play music or a specific song. Wondering whether to take an umbrella when going out? Ask for today’s weather forecast. Feeling witty? Ask a funny question and let it entertain you.
People of all demographics embrace smart assistants to get directions, find answers to general questions, and set alarms and reminders. However, the convenience comes with a serious trade-off – your privacy. Smart, voice-enabled assistants are always listening. Let’s take a look at the security risks they may pose.
Smart Home Assistants: What Are the Security Risks?
History of Your Requests
Every request you make to your home assistant is sent to the company’s servers as an audio recording. In the case of Amazon Alexa and Google Home, these recordings are logged and stored in your account, and you can listen to all of your previous requests. But what happens if someone gets unauthorized access to that account? The history of your voice recordings contains a significant amount of personal information, and if someone with bad intentions gets their hands on it, you may end up in trouble.
Of course, just as with almost any type of digital history, you can delete your logged voice recordings for peace of mind. Even if snoopers break into your account, there will be no sensitive data left to steal.
Home assistants are not only always listening: they hear everything as well. This means that when you talk to your smart speaker, it records much more than what you are saying. Just think of all the background sounds out there. That ambient audio can reveal a great deal about you and your home environment.
A smart speaker can pick up the sounds of your dog barking, your TV playing, or your family members talking to each other. All of these background sounds become additional data: your virtual assistant may now know what pets you have, what shows you watch, what sports you like, what time you’re at home on a typical day, who you live with, and even what their interests are.
It is hard to believe that companies simply ignore these sounds and throw them away, since the information such ambient audio snippets carry is extremely valuable for building users’ advertising profiles.
The convenience of a home assistant comes from its hands-free interaction; that is, a voice command is all it takes to get your virtual assistant to work. An AI-powered device is activated by its wake-up word, for example, “OK Google” for Google Assistant. However, smart speakers may also react to words that merely sound similar to their actual wake-up words: chances are, saying “OK doodle” will activate a Google device. This means a home assistant can be triggered by accident, and from that moment it will be listening actively without you even knowing.
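As a rough illustration of why near-homophones can slip through, here is a toy Python sketch that scores how similar two phrases are. This is purely illustrative: real assistants match acoustic patterns in audio, not text, and the threshold below is an assumption of mine, not anything the vendors publish.

```python
from difflib import SequenceMatcher

def phrase_similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity score between two phrases (toy text-based proxy)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# "OK doodle" shares most of its characters with "OK Google",
# so a loose matcher scores it as a close neighbour...
print(phrase_similarity("OK Google", "OK doodle"))    # roughly 0.78

# ...while an unrelated phrase scores far lower.
print(phrase_similarity("OK Google", "hello world"))
```

Any matcher tolerant enough to catch a wake word across accents and noisy rooms will, by the same token, accept phrases that are merely close, which is exactly the loophole described above.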
It didn’t take long for advertisers to exploit this technological imperfection. Burger King released a TV ad designed to trigger Google Home with the relevant wake-up word. Google reacted within hours, filtering the audio pattern used in the ad so Google Home would not respond to it.
Home assistants cannot yet distinguish their owner’s voice from anyone else’s. This raises a serious security concern: anyone can activate your smart speaker and give it voice commands. Depending on the apps connected to your home assistant, other people can access all kinds of your personal data. A neighbor who stops by may hear your bank balance or your schedule, and your kids can play pranks by setting random alarms and reminders.
However, the greatest risk comes from the online shopping functionality, one of the key selling points of Amazon Echo. Goods can be ordered from Amazon using voice commands alone. The shopping feature comes enabled by default on Amazon Echo, but it can be disabled at any time. When voice shopping is turned on, Alexa will ask you to say a four-digit code to confirm a purchase.
Still, the four-digit code is not a foolproof safeguard against unintended purchases. There have been cases of kids ordering toys without their parents’ knowledge, and even of a parrot managing to order gift boxes.
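To see why a spoken code alone is weak protection, consider this toy model (the function and PIN are hypothetical, not Amazon’s actual logic): the check only compares digits, so anyone within earshot who overhears the code can pass it.

```python
def confirm_purchase(spoken_digits: str, account_pin: str) -> bool:
    # Toy model of voice-code confirmation: the only check is that the
    # digits match. Nothing verifies WHO is speaking, and a code said
    # out loud near the speaker can be overheard and simply repeated.
    return spoken_digits == account_pin

PIN = "4721"  # hypothetical code, spoken aloud to confirm orders

print(confirm_purchase("4721", PIN))  # the owner confirms: True
print(confirm_purchase("4721", PIN))  # an eavesdropper repeats it: also True
```

A shared secret that must be spoken aloud stops accidental orders at best; it does not authenticate the buyer.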
What Can You Do to Improve Security: Useful Tips
While virtual home assistants still have room for improvement when it comes to security, there are things you can do to enjoy the convenience of smart technology while minimizing the risks:
- Mute the home assistant when you’re not going to use it for a while – this stops it from constantly listening.
- Make a habit of deleting your voice recording history regularly. Just keep in mind that this may hinder the assistant’s ability to learn your preferences.
- Think twice before connecting sensitive accounts to your smart assistant. Connect only the apps you genuinely need to control by voice, and apply the same rule to any other IoT devices you’d like to pair with your smart speaker. You really don’t want saying “open the door” to be all it takes for a thief to get past a smart door lock.
- Secure your connected accounts with strong passwords and two-factor authentication where possible.
- Turn off voice purchasing if you don’t use it.
- Use a secure Wi-Fi network and create a separate one for guest access.
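On the two-factor authentication point: most authenticator apps implement TOTP (RFC 6238), which derives short-lived six-digit codes from a shared secret, so a stolen password alone is not enough to log in. Here is a minimal sketch using only the Python standard library (the function names are my own, not from any vendor’s SDK):

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password for a given counter value."""
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, interval: int = 30) -> str:
    """RFC 6238 time-based code: HOTP over the current 30-second window."""
    key = base64.b32decode(secret_b32, casefold=True)
    return hotp(key, int(time.time()) // interval)

# RFC 4226 test vector: ASCII secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # 755224
```

Because each code expires within seconds, an attacker who snoops a password (or even one code) from your connected accounts cannot reuse it later, which is exactly the property a smart-speaker household benefits from.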
From time to time, SensorsTechForum features guest articles by cybersecurity and infosec leaders and enthusiasts such as this post. The opinions expressed in these guest posts, however, are entirely those of the contributing author, and may not reflect those of SensorsTechForum.