Amazon Alexa Can Be Hacked By A Laser From 110 Meters — Is It Time To Hide Your Echo?

Thomas Brewster Nov 5, 2019, 08:34am
What do Amazon Alexa, Google Home and Apple Siri all have in common? They can all be hacked by a laser.
That’s according to researchers who have discovered that when a laser is aimed at the devices’ microphones, an electrical signal is created, just as it is when a voice command is spoken. Using an oscilloscope, the academics found they could make the microphone produce the same signal when receiving light as it did when receiving sound. In doing so they effectively mimicked a voice with a laser beam.
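For a rough sense of the idea, the recorded audio of a voice command can be used to amplitude-modulate a laser’s intensity, so the microphone’s output tracks the command as if it had been spoken aloud. The sketch below is an illustrative Python example only, not the researchers’ actual tooling; the function and parameter names (audio_to_laser_intensity, bias, depth) and the example file name are invented for this article.

```python
import numpy as np
from scipy.io import wavfile

def audio_to_laser_intensity(wav_path, bias=0.5, depth=0.4):
    """Map a recorded voice command onto a laser intensity envelope.

    The premise of the "light commands" attack is that the microphone
    responds to modulated light much as it does to sound pressure, so
    amplitude-modulating the beam with the audio waveform reproduces
    the command's electrical signal inside the device.

    bias  -- constant drive level (0..1) that keeps the laser on
    depth -- modulation depth; how strongly the audio swings the intensity
    """
    rate, samples = wavfile.read(wav_path)
    audio = samples.astype(np.float64)
    if audio.ndim > 1:              # mix stereo down to mono
        audio = audio.mean(axis=1)
    peak = np.max(np.abs(audio))
    if peak > 0:
        audio = audio / peak        # normalize to [-1, 1]

    # Intensity must stay non-negative: bias plus scaled audio, clipped to [0, 1].
    intensity = np.clip(bias + depth * audio, 0.0, 1.0)
    return rate, intensity

# Hypothetical usage: turn a recorded command into a drive signal that
# could be fed to a laser diode driver's modulation input.
# rate, drive = audio_to_laser_intensity("ok_google_what_time_is_it.wav")
```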
These “light commands” can be made with cheap, readily available tech, even a classic laser pointer. And the commands can be tweaked to make Amazon, Google and Apple voice-operated tech carry out actions such as opening doors, making online purchases or turning lights on and off. The attacks could even be used to unlock and start certain vehicles, the academics claimed.
As long as there aren’t any objects blocking the laser, the attacks can work from long distances, from one building to another, for instance. Glass windows won’t block the beam.
The researchers, from the University of Electro-Communications in Tokyo and the University of Michigan, were able to show off a light command asking Google Home what time it is from up to 110 meters away.
The basic vulnerability can’t be eradicated without a change in the design of the microphones, the researchers said. They said they were working with Amazon, Apple and Google on some defensive measures. A Google spokesperson said: “We are closely reviewing this research paper. Protecting our users is paramount, and we're always looking at ways to improve the security of our devices.” An Amazon spokesperson added: “Customer trust is our top priority and we take customer security and the security of our products seriously. We are reviewing this research and continue to engage with the authors to understand more about their work.”
So what can users do? The most obvious defense is to remove your Amazon Echo, Google Home or whatever comparable tech you have from line of sight, said professor Alan Woodward, a security expert from the University of Surrey. “Or you could draw the curtains permanently. The former is a bit more practical,” he quipped.
“It’s just the sort of vulnerability that designers, even those with great threat models, don’t think about. It just goes to show that the threat can evolve and so should your threat model.”
Turning on speaker recognition features could also help, professor Woodward said, echoing what the researchers found. This will limit access to only legitimate users, who’ve registered their voices with the device. There’s a limit to that protection too, though, as the researchers noted: “Even if enabled, speaker recognition only verifies that the wake-up words ... are said in the owner’s voice, and not the rest of the command. This means that one ‘OK Google’ or ‘Alexa’ spoken by the owner can be used to compromise all the commands.”
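To see why that limitation matters, consider a simplified, hypothetical model of an assistant that checks the speaker only at the wake word. The class and helper names below are made up for illustration and do not reflect Amazon’s or Google’s actual software.

```python
class ToyAssistant:
    """Simplified model of wake-word-only speaker verification."""

    def __init__(self, enrolled_voiceprint):
        self.enrolled_voiceprint = enrolled_voiceprint

    def verify_speaker(self, wake_word_audio):
        # Stand-in for real voiceprint matching.
        return fingerprint(wake_word_audio) == self.enrolled_voiceprint

    def handle(self, wake_word_audio, command_audio):
        # Speaker recognition is applied to the wake word only...
        if not self.verify_speaker(wake_word_audio):
            return "ignored: wake word not in owner's voice"
        # ...so the command itself runs without any further voice check.
        return f"executing: {transcribe(command_audio)}"


def fingerprint(audio):
    # Hypothetical placeholder for a voiceprint embedding.
    return hash(audio)

def transcribe(audio):
    # Hypothetical placeholder for speech-to-text.
    return audio

# One replayed or injected wake word in the owner's voice is enough to
# unlock arbitrary follow-on commands, e.g.:
# assistant = ToyAssistant(fingerprint("owner_wake_word"))
# assistant.handle("owner_wake_word", "open the garage door")
```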
There’s one more possible cause for concern: The research was funded by the Pentagon's research arm, the Defense Advanced Research Projects Agency (DARPA). It’s feasible then that such attacks could be a feature of powerful surveillance tools.
