Phones are somewhat less susceptible to such attacks, as many require voice authentication or an unlocked screen before executing sensitive commands.
What you need to know
- Researchers have found that smart speakers such as Google Home and Amazon Echo can be hacked with the help of laser-powered "light commands."
- Apart from smart speakers, Facebook's Portal devices as well as smartphones can also be easily tricked by "light commands" from as far as a few hundred feet away.
- Researchers suggest smart speaker makers can fix this vulnerability by adding a light shield around the microphone or using two different microphones on opposite sides to listen to voice commands.
Researchers have discovered (via WIRED) that it is possible to "speak" to devices such as Google Home and Amazon Echo smart speakers with the help of "light commands." To do this, they pointed a laser at the target device's microphone, using a telephoto lens and a tripod for aiming, and modulated the laser's intensity at audio frequencies. This tricks the device's voice assistant into treating the light hitting the microphone's diaphragm as if it were sound. In some cases, simply varying the brightness of the light was enough to get the device to respond to commands.
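The encoding at the heart of the attack is ordinary amplitude modulation: the audio of a spoken command is mapped onto variations in laser brightness, which the microphone's diaphragm picks up as if it were sound pressure. A minimal sketch of that mapping, with a pure tone standing in for a voice command (the function name and parameter values are illustrative, not taken from the research paper):

```python
import numpy as np

def am_modulate(audio, dc_bias=0.5, depth=0.4):
    """Map an audio waveform in [-1, 1] onto laser intensity levels.

    The laser stays on at a constant baseline (dc_bias) and its
    brightness swings up and down with the audio signal, so the
    intensity the microphone sees traces the original waveform.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return dc_bias + depth * audio  # intensity stays within [0.1, 0.9]

# A 1 kHz tone standing in for a spoken command, sampled at 44.1 kHz.
sample_rate = 44_100
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 1_000 * t)

intensity = am_modulate(tone)
```

In the real attack this intensity signal would drive the laser's current, so the beam's brightness, and hence the voltage induced in the MEMS microphone, reproduces the command audio.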
These "light commands" are completely silent and can be transmitted from as far as 250 feet away. The researchers note that an attacker could even use an infrared laser, which is invisible to the naked eye, to control a smart speaker.
Fortunately, the attack has several limitations. It requires specialized equipment, although most of it is readily available on Amazon and isn't particularly expensive. The targeted device must also be in the attacker's direct line of sight so the laser can be aimed at the specific part of the device's microphone.
However, it's not just smart speakers that are vulnerable to light commands. Smartphones, tablets, Facebook Portal, and other devices that use MEMS microphones and have a voice assistant were also found to be susceptible to such laser-based attacks.
According to the researchers, smart speaker makers can prevent such attacks by placing a light shield in front of the microphone or by using two microphones on opposite sides of the device to listen for voice commands. In a statement to WIRED, both Google and Amazon said they are reviewing the research paper.
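The dual-microphone mitigation works because genuine sound reaches microphones on opposite sides of the device with similar energy, while a laser spot excites only the microphone it hits. A hedged sketch of that comparison (the function, threshold, and simulated signals are illustrative assumptions, not the researchers' implementation):

```python
import numpy as np

def looks_like_light_injection(mic_a, mic_b, threshold=0.5):
    """Heuristic check comparing two opposite-facing microphones.

    Computes the RMS energy of each channel and flags the command
    as suspicious when one microphone hears it far louder than the
    other, which is what a single laser spot would produce.
    """
    energy_a = np.sqrt(np.mean(np.square(mic_a)))
    energy_b = np.sqrt(np.mean(np.square(mic_b)))
    if max(energy_a, energy_b) == 0:
        return False  # silence on both channels: nothing to flag
    ratio = min(energy_a, energy_b) / max(energy_a, energy_b)
    return bool(ratio < threshold)

# Simulated signals: a laser hits only microphone A, while real
# sound reaches both microphones at comparable levels.
rng = np.random.default_rng(0)
voice = rng.normal(size=1000)
laser_only_a = (voice, np.zeros(1000))
real_sound = (voice, voice * 0.9)

print(looks_like_light_injection(*laser_only_a))  # True: only one mic sees it
print(looks_like_light_injection(*real_sound))    # False: both mics agree
```

A production implementation would compare the channels per frequency band and over short time windows rather than over a whole recording, but the principle is the same: reject commands that only one side of the device can hear.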