An Amazon Echo, with the upper control buttons for volume, microphone off, and Alexa action.
Smart-assistant devices have had their share of privacy flaws, but they are generally considered safe enough for most people. New research into vulnerabilities in Amazon's Alexa platform, however, underscores the importance of thinking about the personal data your smart assistant stores about you, and of keeping that data to a minimum.
Findings released Thursday by security firm Check Point show that Alexa's web services had bugs a hacker could have exploited to grab a target's entire voice history, meaning their recorded audio interactions with Alexa. Amazon has fixed the bugs, but the flaws could also have exposed profile information, including a home address, as well as any "skills," or apps, the user had added for Alexa. An attacker could even have deleted an existing skill and installed a malicious one in its place to harvest more data after the initial attack.
"Virtual assistants are something you just talk to and respond to, and typically you don't have malicious scenarios or concerns on your mind," said Oded Vanunu, head of product vulnerability research at Check Point. "However, we found a number of vulnerabilities in Alexa's infrastructure configuration that would eventually allow a malicious attacker to gather information about users and even install new capabilities."
To exploit the vulnerability, an attacker would first have needed to convince a target to click a malicious link, a common attack scenario. Underlying flaws in certain Amazon and Alexa subdomains, though, meant that an attacker could have crafted a genuine and normal-looking Amazon link to lure victims into exposed parts of Amazon's infrastructure. By strategically routing users to track.amazon.com – a vulnerable page not related to Alexa, but used for tracking Amazon packages – the attacker could have injected code that let them pivot to Alexa infrastructure, sending a specially crafted request, along with the target's cookies from the package-tracking page, to skillsstore.amazon.com/app/secure/your-skills-page.
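The reason a script injected into one Amazon subdomain could fire an authenticated request at another comes down to how browser cookies are scoped: a cookie set for the parent domain is sent to every matching subdomain. As an illustrative sketch (not Check Point's actual exploit code), Python's standard-library RFC 2965 domain-matching helper shows which hosts would receive a cookie scoped to the parent domain, using the domains named above:

```python
# Sketch: why a cookie scoped to a parent domain is shared across
# sibling subdomains. Uses Python's stdlib RFC 2965 domain-match
# helper; the hosts mirror those named in the article.
from http.cookiejar import domain_match

# A session cookie set with Domain=.amazon.com is sent to any host
# that domain-matches ".amazon.com".
cookie_domain = ".amazon.com"

hosts = (
    "track.amazon.com",        # vulnerable package-tracking page
    "skillsstore.amazon.com",  # Alexa skills endpoint
    "evil.example.com",        # unrelated attacker-controlled host
)

for host in hosts:
    shared = domain_match(host, cookie_domain)
    print(f"{host}: cookie sent -> {shared}")
```

Because both the tracking page and the skills endpoint domain-match the shared cookie scope, code running on the first can make requests to the second that carry the victim's session – which is why Check Point describes this as an infrastructure-configuration issue rather than a bug in any single page.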
At that point, the platform would have mistaken the attacker for the legitimate user, and the hacker could have accessed the victim's full audio history, list of installed skills, and other account details. The attacker could also have uninstalled a skill the user had set up and, if the hacker had planted a malicious skill in the Alexa Skills Store, even installed that interloping application on the victim's Alexa account.
Both Check Point and Amazon note that all skills in the Amazon store are screened and monitored for potentially harmful behavior, so it is not a given that an attacker could have planted a malicious skill there in the first place. Check Point also suggests that a hacker might have been able to access banking history through the attack; Amazon disputes this, saying that such information is redacted in Alexa's responses.
"The security of our devices is a top priority, and we appreciate the work of independent researchers like Check Point who bring potential issues to us," an Amazon spokesperson told WIRED in a statement. "We fixed this issue soon after it was brought to our attention, and we continue to further strengthen our systems. We are not aware of any cases of this vulnerability being used against our customers or of any customer information being exposed."
Check Point's Vanunu says the attack he and his colleagues discovered was nuanced, and that it is not surprising Amazon didn't catch it on its own, given the scale of the company's platforms. Still, the findings offer a valuable reminder for users to think about the data stored in their various web accounts and to minimize it wherever possible.
Not a case of "OK, come in!"
"This was definitely not an open door and an 'OK, come in!'" says Vanunu. "This was a tricky attack, but we're glad Amazon took it seriously, because the potential impact spanned 200 million Alexa devices."
While you can't control whether Amazon has a bug in one of its web services, you can minimize the data tied to your Alexa account. After backlash over its murky practices around using human transcribers to review some Alexa users' audio snippets, Amazon made it easier to delete your audio history. It is important to do this regularly; otherwise, Amazon keeps those recordings indefinitely.
To view and delete your Alexa history, open the Alexa app on your phone and go to Settings > History. In this view you can delete entries only one at a time. To delete in bulk, go to the Alexa Privacy Settings page on the Amazon website and select Review Voice History. You can also delete by voice, saying, "Alexa, delete what I just said" or "Alexa, delete everything I said today."
This story first appeared on wired.com.