
Light Commands: Hackers can laser-assist voice assistants


Scientists at the University of Michigan and Tokyo's University of Electro-Communications have published research demonstrating how attackers can use a laser to inject voice commands into the microphones of digital voice assistants from several meters away.

Laser elicits signals from the microphones

Essentially all voice-controlled digital assistants, in smartphones as well as smart speakers, are affected, because the underlying problem is that a voice signal can be faked at the microphone with a laser. A user's voice commands are normally converted by the microphones into electrical signals. The same electrical signals can also be triggered by a laser of varying intensity aimed at a microphone. Through a defined sequence of these intensities, voice commands can be mimicked, which the digital assistants then execute because they cannot verify how the signal was generated.
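The principle is essentially amplitude modulation: the recorded voice command modulates the laser's output power around a constant bias, and the microphone turns that light back into an electrical signal resembling speech. The following minimal Python sketch illustrates the mapping from audio samples to laser power levels; the bias and swing values are illustrative assumptions, not figures from the paper.

```python
import numpy as np


def audio_to_laser_levels(samples: np.ndarray,
                          bias_mw: float = 30.0,
                          swing_mw: float = 25.0) -> np.ndarray:
    """Map an audio waveform to laser output power levels (amplitude modulation).

    The laser is driven around a constant bias power, and the recorded voice
    command modulates that power. The microphone then produces an electrical
    signal resembling the original speech. `bias_mw` and `swing_mw` are
    illustrative values only.
    """
    # Normalize the waveform to [-1, 1] so the modulation stays within the bias range.
    peak = float(np.max(np.abs(samples))) or 1.0
    normalized = samples / peak
    # Offset modulation: constant bias plus the scaled audio signal.
    return bias_mw + swing_mw * normalized


if __name__ == "__main__":
    # Synthetic 1 kHz test tone standing in for a recorded voice command.
    t = np.linspace(0, 0.01, 480, endpoint=False)
    tone = np.sin(2 * np.pi * 1000 * t)
    levels = audio_to_laser_levels(tone)
    print(levels[:5])  # laser power levels in milliwatts
```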

Attack from up to 110 meters and through windows

The laser attack, which could for example trigger commands to open doors, windows or garages (although that additionally requires brute-forcing the PIN protecting such actions), also works through window panes and, under laboratory conditions, over distances of up to 110 meters; longer distances should be possible but were not tested. Depending on the hardware, the laser needs a different output power for the microphones to respond as desired: the Samsung Galaxy S9 requires 60 milliwatts, while 0.5 milliwatts suffice for the Google Home.

Practical meaning severely limited

In practice, however, an attack on the microphones via laser has its limits. For one thing, the attacker must know the exact target device in order to use a suitable laser; for another, the microphones must be reachable with the laser from the outside.

Amazon, Apple and Google could react

At least for smart speakers, which typically have multiple microphones on top, vendors like Amazon, Apple and Google could respond to such an attack scenario relatively easily by checking that each request is picked up by more than one microphone. If the signal comes from only a single microphone, as in the scientists' experiments, the request could simply be rejected. According to Wired, Amazon and Google therefore want to examine the research; Apple has not commented on it so far. The researchers, however, say they are working with Amazon, Google and also Apple to protect the technology against such an attack scenario.
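A sketch of that mitigation idea follows, assuming per-microphone signal buffers are available: a real voice reaches all microphones of a smart speaker with comparable strength, while a focused laser beam tends to excite only the one it hits. The threshold and the RMS metric are illustrative choices, not the vendors' actual implementation.

```python
import numpy as np


def command_plausible(mic_signals: list[np.ndarray],
                      min_active_mics: int = 2,
                      activity_threshold: float = 0.01) -> bool:
    """Reject commands that only register on a single microphone.

    `mic_signals` holds the captured waveform of each microphone for the
    duration of the request. A request is accepted only if at least
    `min_active_mics` microphones show meaningful signal energy.
    """
    rms_levels = [float(np.sqrt(np.mean(s ** 2))) for s in mic_signals]
    active = sum(level >= activity_threshold for level in rms_levels)
    return active >= min_active_mics
```

In this sketch, a laser hitting only one microphone would leave the other channels near silence, so the request would fail the check and could be discarded before the command is interpreted.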

