Google Home and Amazon Echo can be hacked with lasers

Okay, it’s unlikely that anyone will have enough free time to try this on you, but here’s the fact: researchers from the United States and Japan have discovered that smart speakers like Google Home and Amazon Echo can be controlled with a laser. The trick works silently and remotely, and can be used to unlock doors, make purchases and open cars.

Amazon Echo Plus

Security researcher Takeshi Sugawara, from the University of Electro-Communications in Tokyo, and Professor Kevin Fu, from the University of Michigan, demonstrated how to send voice commands using a laser. By modulating the intensity of the laser light to simulate a sound wave, it is possible to vibrate the membrane of a smart speaker’s microphone, imitating human speech.
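To give an idea of what that modulation looks like, here is a minimal Python sketch. It is not from the researchers, and the parameters (sample rate, test-tone frequency, bias and modulation depth) are made up for illustration; in a real attack the modulating signal would be a recorded voice command rather than a tone.

```python
import numpy as np

# Hypothetical parameters: a 1 kHz test tone standing in for the audio signal.
sample_rate = 48_000          # samples per second
duration = 0.5                # seconds
tone_freq = 1_000             # Hz

t = np.arange(int(sample_rate * duration)) / sample_rate
audio = np.sin(2 * np.pi * tone_freq * t)   # normalized audio in [-1, 1]

# Amplitude modulation: the audio rides on top of a constant bias so the
# laser intensity never goes negative (the beam never fully turns off).
bias = 0.5                    # mean optical power (arbitrary units)
depth = 0.4                   # modulation depth
laser_intensity = bias + depth * audio

# A microphone membrane hit by this beam responds to the intensity envelope,
# so its output roughly tracks `audio` even though no sound was produced.
print(laser_intensity[:5])
```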

The attack works because, when hit by a laser modulated at specific frequencies, the microphone interprets the light as if it were sound, only without any actual noise being made. During the experiments, a 60-milliwatt laser managed to command 16 devices at different distances, including smartphones, smart speakers and other gadgets with voice support.

Cell phones were the hardest to control: two Android smartphones only interpreted the laser as a voice command from up to five meters away, while an iPhone could be attacked from up to ten meters. Smart speakers, on the other hand, received commands from 50 meters away. The researchers also used a weaker 5-milliwatt laser to control a Google Home inside a building 76 meters away (!).

Exploiting the flaw from a distance is not so simple, because the laser has to be aimed precisely. On the Light Commands website, named after the vulnerability in the microphones, the researchers explain: “To focus the laser over long distances, a commercially available telephoto lens can be used. Aiming can be done with a tripod, which significantly increases accuracy. An attacker could use a telescope or binoculars to view the device’s microphone ports over long distances.”

Laser attack on smart devices

The flaw could be mitigated if manufacturers redesigned the devices. The researchers suggest using multiple microphones to capture audio from the environment; if only one of the microphones receives the command, it is likely to be a laser attack. In addition, an opaque cover over the microphone could reduce the risk, but since an attacker can simply increase the laser power, this would only be a stopgap.
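As a rough illustration of that multi-microphone idea, here is a hypothetical Python sketch. The function name, threshold and signal values are invented for the example; the point is simply that a laser spot tends to hit a single microphone port, while real speech reaches every microphone at comparable levels.

```python
import numpy as np

def looks_like_laser_injection(mic_frames, ratio_threshold=10.0):
    """Flag a frame as suspicious when one microphone is far louder than the rest.

    mic_frames: list of 1-D numpy arrays, one per microphone, same length.
    ratio_threshold: hypothetical tuning parameter.
    """
    rms = np.array([np.sqrt(np.mean(f ** 2)) for f in mic_frames])
    loudest = rms.max()
    others = np.delete(rms, rms.argmax())
    # If the other microphones are essentially silent, the "sound" probably
    # never travelled through the air.
    return loudest > ratio_threshold * max(others.max(), 1e-9)

# Example: mic 0 receives a strong signal, mics 1-3 hear almost nothing.
rng = np.random.default_rng(0)
frames = [0.5 * np.sin(2 * np.pi * 440 * np.arange(480) / 48_000)] + \
         [0.001 * rng.standard_normal(480) for _ in range(3)]
print(looks_like_laser_injection(frames))   # True -> likely a laser attack
```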

Google and Amazon are already reviewing the study. Facebook and Apple did not comment.
