If you haven’t been regularly clearing your voice history with Amazon’s voice assistant, Alexa, you now have a good reason to start: a recently fixed vulnerability that had exposed all your conversations with the smart speaker.
On Thursday, researchers from the cybersecurity company Check Point released a report on security flaws they discovered in Amazon’s Alexa that would have allowed a potential hacker to obtain a person’s voice history with the smart speaker, as well as install skills on the device without the person knowing.
“The security of our devices is a top priority, and we appreciate the work of independent researchers like Check Point who bring potential issues to us. We fixed this issue soon after it was brought to our attention, and we continue to further strengthen our systems,” an Amazon spokesperson said in a statement.
The company said it was contacted by the researchers in June and that it had not seen any cases of the vulnerability being exploited. But the security concerns serve as a strong reminder to minimize how much history you keep on your smart speakers.
Connected devices at home present a new opening for hackers, and smart voice assistants are no different. Security researchers have frequently identified flaws with Alexa, such as a stranger yelling at it to unlock your door, or a laser pointer that can activate your device from 300 feet away.
Many of these concerns are mitigated by the fact that an attacker would need to be near your home or within earshot of your speakers, but the security flaws found by Check Point would require only a single click, the researchers said.
Amazon had a vulnerability in its subdomains – URLs like track.amazon.com, for example. While you may be wary of clicking suspicious links, a URL with the Amazon domain in it may be enough to convince you that you’re safe.
The security researchers found that they could inject code into the subdomain, allowing them to extract a security token tied to your Alexa account. With that token, a potential attacker could install skills, get a list of the skills you already use, and view your voice chat history with Alexa.
Depending on how sensitive your conversations with Alexa are, this could mean access to your health information, your finances, or just the silly day-to-day things you might ask the voice assistant.
“Smart speakers and virtual assistants are so commonplace that it’s easy to overlook just how much personal data they hold, and their role in controlling other smart devices in our homes,” Oded Vanunu, Check Point’s head of product vulnerability research, said in a statement. “But hackers see them as entry points into people’s lives, giving them the opportunity to access data, eavesdrop on conversations or carry out other malicious actions without the owner being aware. We conducted this research to highlight how securing these devices is critical to maintaining users’ privacy.”
Check Point said attackers could start snooping on conversations by installing a skill, but Amazon screens skills for malicious activity and blocks offenders from its marketplace. The voice history log is the bigger concern, and the vulnerability is a reminder that you should delete your conversations with Alexa on a regular basis.
Like other voice assistant providers, Amazon keeps your voice history to improve its artificial intelligence, and unless you opt out, human reviewers may listen to those conversations as well.
You can set your voice history to automatically delete recordings older than three months or 18 months, but if you want to delete it daily or weekly, you have to do so manually.
With vulnerabilities like this one, that is good practice, given the potential for hackers to gain access to these sensitive records. Ask yourself: Do the benefits of keeping a history of your conversations with Amazon outweigh the drawbacks?
While deleting your voice history may keep you safe from potential hackers, you may still have privacy concerns about Amazon’s policies.
In a July 2019 letter to senators, Amazon said it keeps some transcripts of voice recordings indefinitely, even after the audio itself is deleted.