Smart assistant devices have had their share of privacy breaches, but they are generally considered safe enough for most people. New research into vulnerabilities in Amazon’s Alexa platform, however, highlights the importance of thinking about the personal data your smart assistant keeps on you – and minimizing it as much as possible.
Findings published on Thursday by security company Check Point show that Alexa’s web services had bugs that a hacker could have exploited to capture a target’s entire voice history, meaning their recorded audio interactions with Alexa. Amazon has patched the flaws, but the vulnerability could also have exposed profile information, including home address, as well as any “skills,” or apps, that the user had added for Alexa. An attacker could even delete an existing skill and install a malicious one to grab more data after the initial attack.
“Virtual assistants are something you just talk to, and you don’t normally have any kind of malicious scenario or worry in mind,” says Oded Vanunu, head of Check Point’s product vulnerability research. “But we’ve found a chain of vulnerabilities in Alexa’s infrastructure configuration that could eventually allow a malicious attacker to gather information about users and even install new skills.”
For an attacker to exploit the vulnerabilities, they would first have to trick targets into clicking a malicious link, a common attack scenario. Underlying flaws in certain Amazon and Alexa subdomains, however, meant that an attacker could have crafted a genuine, normal-looking Amazon link to lure victims into exposed parts of Amazon’s infrastructure. By strategically directing users to track.amazon.com – a vulnerable page not related to Alexa but used to track Amazon packages – the attacker could have injected code that let them pivot to Alexa infrastructure, sending a special request, along with the victim’s cookies from the package-tracking page, to skillsstore.amazon.com/app/secure/your-skills-page.
At that point, the platform would mistake the attacker for the legitimate user, and the hacker could gain access to the victim’s full audio history, list of installed skills, and other account details. The attacker could also remove a skill the user had installed and, if the hacker had planted a malicious skill in the Alexa Skills Store, could even install that interloper application on the victim’s Alexa account.
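Conceptually, the cookie-forwarding step works like a classic cross-site request forgery chain: script injected into the vulnerable tracking page rides the victim’s own Amazon session cookies into an Alexa endpoint, so the backend treats the request as legitimate. Here is a minimal, illustrative sketch in Python – the cookie names and the CSRF header are placeholders invented for the example, not details from Check Point’s report:

```python
def build_skills_request(session_cookies: dict) -> dict:
    """Assemble the forged cross-origin request an attacker's injected
    script would effectively send: the victim's own session cookies are
    attached, so the server sees an apparently legitimate user."""
    # Serialize cookies into a standard Cookie header value
    cookie_header = "; ".join(f"{k}={v}" for k, v in session_cookies.items())
    return {
        "url": "https://skillsstore.amazon.com/app/secure/your-skills-page",
        "headers": {
            "Cookie": cookie_header,
            # Hypothetical anti-CSRF token harvested from the lure page
            "X-CSRF-Token": "<token-from-tracking-page>",
        },
    }

# Example with placeholder cookie names (not real Amazon cookies)
req = build_skills_request({"session-id": "abc123", "session-token": "xyz789"})
print(req["headers"]["Cookie"])  # session-id=abc123; session-token=xyz789
```

The point of the sketch is simply that the request carries credentials the victim’s browser already holds, which is why a subdomain flaw on an unrelated page was enough to reach Alexa account data.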
Both Check Point and Amazon note that all skills in the Amazon store are screened and checked for potentially harmful behavior, so it is not a foregone conclusion that an attacker could have planted a malicious skill in the first place. Check Point also suggests that a hacker may be able to gain access to bank data through the attack, but Amazon denies this, saying that information is redacted in Alexa’s responses.
“The security of our devices is a top priority, and we appreciate the work of independent researchers like Check Point who bring potential issues to us,” an Amazon spokesperson told WIRED in a statement. “We fixed this issue soon after it was brought to our attention, and we continue to further strengthen our systems. We are not aware of any cases of this vulnerability being used against our customers or of any customer information being exposed.”
Check Point’s Vanunu says the attack he and his colleagues discovered was nuanced, and that it is not surprising Amazon didn’t catch it itself, given the scale of the company’s platforms. But the findings are a valuable reminder for users to think about the data stored in their various web accounts and to minimize it as much as possible.
“This was definitely not a case of an open door and, OK, come on in,” Vanunu says. “This was a tricky attack, but we’re glad Amazon took it seriously, because the implications could be bad with 200 million Alexa devices out there.”
While you can’t check whether Amazon has a bug in one of its remote web services, you can minimize the data in your Alexa account. After backlash over its practice of using human transcribers for audio snippets from Alexa users, Amazon made it easier to delete your audio history. It is important to do this regularly, as otherwise Amazon stores those recordings indefinitely.
To view and delete your Alexa history, open the Alexa app on your phone and go to Settings > History. In this view, you can only delete entries one by one. To delete en masse, go to Alexa Privacy Settings on Amazon’s website and select Review Voice History. You can also delete verbally by saying, “Alexa, delete what I just said” or “Alexa, delete everything I said today.”
This story first appeared on wired.com.