If there’s one thing consumers have learned in recent years, it’s that everything, even devices or institutions that we believe to be safe, can be hacked. With the explosion of Internet of Things (IoT) devices in millions of homes across the United States, there are many more opportunities for our personal information to be collected, shared, and stolen without our knowledge.
According to a report by Consumer Intelligence Research Partners, there are more than 50 million Amazon Echo devices installed in the US. Alexa is the most ubiquitous digital assistant, but many users don’t know what happens to their data when they ask Alexa to read them a prescription reminder or check their bank balance.
Amazon has repeatedly denied that Alexa-enabled devices are recording at all times, but the devices are always listening for the wake word (“Alexa” is one of several options) and will record and store what is said once Alexa is activated. Recordings capture a fraction of a second before the trigger word is spoken and end when the user’s request is processed.
Smart assistants’ hearing is still far from perfect, and the recent wave of spontaneous laughter from Alexa is a good example of voice commands gone wrong. Users reported that their Alexa-enabled devices laughed unprompted, which Amazon attributed to phrases that sounded similar to “Alexa, laugh” even when users didn’t say the trigger word.
Does Alexa record private conversations?
There are words that sound similar enough to “Alexa” that the device may pick up snippets of private conversations that were never intended as a command, said Pam Dixon, executive director of the World Privacy Forum, a nonprofit public interest research group.
“The problem arises when people don’t realize that the recordings are stored until you delete them,” she says. “People should think about what they are asking their voice assistants and know that they can erase that information.”
According to an Amazon spokeswoman, Alexa recordings are stored securely in the Amazon Web Services cloud, and the company does not disclose customer information without a “valid and binding legal demand properly served on us.”
In fact, Amazon reportedly refused to comply with a court order to hand over data from an Echo that police in Bentonville, Arkansas, believed might contain evidence in a murder case.
Dixon notes that Amazon’s privacy policy is “fairly transparent” and easily accessible on both desktop and mobile devices, but what will likely surprise users the most is how much of their own voice they will hear if they listen to what their Alexa-enabled devices have recorded and stored.
“I really don’t think these devices are listening in and sending that data to third parties all the time, but looking at my own recordings, there was a lot more there than I anticipated,” Dixon said.
If you are concerned about the privacy and security of your personal information, there are steps you can take to protect your Alexa-enabled device.
Protect your home network
The devices in your smart home are only as secure as the network to which they connect. Start by changing the default name and password of your wireless network (do not include identifying information in either) and enable the Wi-Fi Protected Access II (WPA2) protocol on your router.
If possible, create one Wi-Fi network for your smart home devices and another for the devices you use for banking, shopping, or browsing, and set up a firewall to restrict what (and who) can connect to each. Periodically check for and install firmware updates on all your devices, including Alexa devices.
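If your router runs Linux and lets you add firewall rules, the network split described above can be sketched with a couple of iptables rules. This is a minimal illustration only: the subnet addresses (192.168.10.0/24 for trusted devices, 192.168.20.0/24 for smart home devices) are assumptions, and many consumer routers expose this kind of isolation as a “guest network” toggle instead.

```shell
# Assumed subnets (adjust to your own layout):
#   192.168.10.0/24 - trusted devices (laptops, phones)
#   192.168.20.0/24 - smart home / IoT devices (Echo, plugs, cameras)

# Let IoT devices reach the internet (any non-private destination)...
iptables -A FORWARD -s 192.168.20.0/24 ! -d 192.168.0.0/16 -j ACCEPT

# ...but block them from opening new connections into the trusted LAN.
iptables -A FORWARD -s 192.168.20.0/24 -d 192.168.10.0/24 \
    -m conntrack --ctstate NEW -j DROP
```

With rules like these, a compromised smart device can still phone home, but it cannot probe your laptop or your phone.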
Change the Alexa wake word
The first step that Dixon recommends that users take on their Alexa-enabled devices is to change the word that triggers the recording.
For now, you can use “Amazon”, “Computer” or “Echo” instead. Choose the word that you are least likely to use in everyday conversation, so Alexa will record only if you speak directly to it.
It’s important to remember that Alexa-enabled devices can pick up the voices of strangers through closed doors and windows, Dixon adds. You can also turn off the device’s microphone to stop it from listening entirely.
Strengthen your Amazon password
Anyone with access to your Amazon account can listen to, share, or delete your Alexa voice recording history in the Manage Your Content and Devices panel. This includes family members who order items under the same account, but your information can also be vulnerable to hackers who obtain your Amazon password.
The commands you give Alexa – arming your security system, requesting directions and travel times, or calling friends – can provide malicious actors with valuable information about your daily routine, which can put the safety of you, your home, and your family at risk. As with any other login, follow good password hygiene practices.
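In practice, good password hygiene starts with a long, random password that you don’t reuse anywhere else. One quick, hedged way to generate one from the command line on Linux or macOS (assuming `/dev/urandom` is available, as it is on both) is:

```shell
# Generate a 20-character random password from letters and digits.
pw=$(LC_ALL=C tr -dc 'A-Za-z0-9' < /dev/urandom | head -c 20)
echo "$pw"
```

Store the result in a password manager rather than writing it down, and turn on two-step verification in your Amazon account settings so a leaked password alone isn’t enough to get in.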
Delete old Alexa recordings, especially those with sensitive information
While asking Alexa to set a timer or play cat noises is fairly innocuous, saved recordings that include sensitive health, legal or financial information are less so.
Dixon says that most users don’t think about the consequences of having their conversations or requests stored indefinitely where others can access them. Recordings can resurface in divorce or child custody cases, for example.
“If you have any kind of question or concern about having recordings, just delete them,” Dixon said. “It’s like clearing your history in a web browser.”
To listen to and delete stored recordings, open Settings > History in the Alexa app, or use the Manage Your Content and Devices dashboard at Amazon.com. Your recordings will stay in the Amazon cloud until you delete them.
Here’s how to delete recent Alexa recordings, automatically delete oldest recordings, or delete all Alexa recordings.
Deleting all old recordings can slightly degrade Alexa’s performance, because the device uses your history to improve its responses over time and will have to relearn your patterns if that information is lost.
If you don’t want to mass-delete everything, including innocuous local weather or music requests, you can selectively delete more sensitive material. You can also opt your Alexa device out of being used to test new features, whose recordings Amazon employees or contractors are more likely to review.
Read the third-party Alexa skills privacy policies
Third-party Alexa skills, of which there are tens of thousands, can also collect personal information from users. Amazon requires developers of these skills to provide links to their privacy policies on the skill detail pages, but consumers are responsible for examining each of them to understand how data is collected, used, and stored.
“Everyone is very concerned about having some kind of data breach or leak on these home devices,” says Dixon. “No company right now wants to have a privacy problem; caution is the watchword going forward.”